Dec 02 06:43:59 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Dec 02 06:43:59 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 02 06:43:59 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 02 06:43:59 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 02 06:43:59 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 02 06:43:59 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 02 06:43:59 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 02 06:43:59 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Dec 02 06:43:59 localhost kernel: signal: max sigframe size: 1776
Dec 02 06:43:59 localhost kernel: BIOS-provided physical RAM map:
Dec 02 06:43:59 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 02 06:43:59 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 02 06:43:59 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 02 06:43:59 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 02 06:43:59 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 02 06:43:59 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 02 06:43:59 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 02 06:43:59 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Dec 02 06:43:59 localhost kernel: NX (Execute Disable) protection: active
Dec 02 06:43:59 localhost kernel: SMBIOS 2.8 present.
Dec 02 06:43:59 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 02 06:43:59 localhost kernel: Hypervisor detected: KVM
Dec 02 06:43:59 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 02 06:43:59 localhost kernel: kvm-clock: using sched offset of 1907332051 cycles
Dec 02 06:43:59 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 02 06:43:59 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 02 06:43:59 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 02 06:43:59 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 02 06:43:59 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Dec 02 06:43:59 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 02 06:43:59 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 02 06:43:59 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 02 06:43:59 localhost kernel: Using GB pages for direct mapping
Dec 02 06:43:59 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Dec 02 06:43:59 localhost kernel: ACPI: Early table checksum verification disabled
Dec 02 06:43:59 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 02 06:43:59 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 06:43:59 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 06:43:59 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 06:43:59 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 02 06:43:59 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 06:43:59 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 06:43:59 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 02 06:43:59 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 02 06:43:59 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 02 06:43:59 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 02 06:43:59 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 02 06:43:59 localhost kernel: No NUMA configuration found
Dec 02 06:43:59 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Dec 02 06:43:59 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Dec 02 06:43:59 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Dec 02 06:43:59 localhost kernel: Zone ranges:
Dec 02 06:43:59 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 02 06:43:59 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 02 06:43:59 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Dec 02 06:43:59 localhost kernel:   Device   empty
Dec 02 06:43:59 localhost kernel: Movable zone start for each node
Dec 02 06:43:59 localhost kernel: Early memory node ranges
Dec 02 06:43:59 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 02 06:43:59 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 02 06:43:59 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Dec 02 06:43:59 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Dec 02 06:43:59 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 02 06:43:59 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 02 06:43:59 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 02 06:43:59 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 02 06:43:59 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 02 06:43:59 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 02 06:43:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 02 06:43:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 02 06:43:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 02 06:43:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 02 06:43:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 02 06:43:59 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 02 06:43:59 localhost kernel: TSC deadline timer available
Dec 02 06:43:59 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 02 06:43:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 02 06:43:59 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 02 06:43:59 localhost kernel: Booting paravirtualized kernel on KVM
Dec 02 06:43:59 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 02 06:43:59 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 02 06:43:59 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Dec 02 06:43:59 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Dec 02 06:43:59 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 02 06:43:59 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 02 06:43:59 localhost kernel: Fallback order for Node 0: 0 
Dec 02 06:43:59 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Dec 02 06:43:59 localhost kernel: Policy zone: Normal
Dec 02 06:43:59 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 02 06:43:59 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Dec 02 06:43:59 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 02 06:43:59 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 02 06:43:59 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 02 06:43:59 localhost kernel: software IO TLB: area num 8.
Dec 02 06:43:59 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Dec 02 06:43:59 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Dec 02 06:43:59 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 02 06:43:59 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Dec 02 06:43:59 localhost kernel: ftrace: allocated 176 pages with 3 groups
Dec 02 06:43:59 localhost kernel: Dynamic Preempt: voluntary
Dec 02 06:43:59 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 02 06:43:59 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 02 06:43:59 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 02 06:43:59 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 02 06:43:59 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 02 06:43:59 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 02 06:43:59 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 02 06:43:59 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 02 06:43:59 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 02 06:43:59 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 02 06:43:59 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Dec 02 06:43:59 localhost kernel: Console: colour VGA+ 80x25
Dec 02 06:43:59 localhost kernel: printk: console [tty0] enabled
Dec 02 06:43:59 localhost kernel: printk: console [ttyS0] enabled
Dec 02 06:43:59 localhost kernel: ACPI: Core revision 20211217
Dec 02 06:43:59 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 02 06:43:59 localhost kernel: x2apic enabled
Dec 02 06:43:59 localhost kernel: Switched APIC routing to physical x2apic.
Dec 02 06:43:59 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 02 06:43:59 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 02 06:43:59 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 02 06:43:59 localhost kernel: LSM: Security Framework initializing
Dec 02 06:43:59 localhost kernel: Yama: becoming mindful.
Dec 02 06:43:59 localhost kernel: SELinux:  Initializing.
Dec 02 06:43:59 localhost kernel: LSM support for eBPF active
Dec 02 06:43:59 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 02 06:43:59 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 02 06:43:59 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 02 06:43:59 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 02 06:43:59 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 02 06:43:59 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 02 06:43:59 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 02 06:43:59 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Dec 02 06:43:59 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Dec 02 06:43:59 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 02 06:43:59 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 02 06:43:59 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 02 06:43:59 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 02 06:43:59 localhost kernel: Freeing SMP alternatives memory: 36K
Dec 02 06:43:59 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 02 06:43:59 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Dec 02 06:43:59 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 02 06:43:59 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 02 06:43:59 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 02 06:43:59 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 02 06:43:59 localhost kernel: ... version:                0
Dec 02 06:43:59 localhost kernel: ... bit width:              48
Dec 02 06:43:59 localhost kernel: ... generic registers:      6
Dec 02 06:43:59 localhost kernel: ... value mask:             0000ffffffffffff
Dec 02 06:43:59 localhost kernel: ... max period:             00007fffffffffff
Dec 02 06:43:59 localhost kernel: ... fixed-purpose events:   0
Dec 02 06:43:59 localhost kernel: ... event mask:             000000000000003f
Dec 02 06:43:59 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 02 06:43:59 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 02 06:43:59 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 02 06:43:59 localhost kernel: x86: Booting SMP configuration:
Dec 02 06:43:59 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 02 06:43:59 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 02 06:43:59 localhost kernel: smpboot: Max logical packages: 8
Dec 02 06:43:59 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 02 06:43:59 localhost kernel: node 0 deferred pages initialised in 26ms
Dec 02 06:43:59 localhost kernel: devtmpfs: initialized
Dec 02 06:43:59 localhost kernel: x86/mm: Memory block size: 128MB
Dec 02 06:43:59 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 02 06:43:59 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 02 06:43:59 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 02 06:43:59 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 02 06:43:59 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 02 06:43:59 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 02 06:43:59 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 02 06:43:59 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 02 06:43:59 localhost kernel: audit: type=2000 audit(1764657838.654:1): state=initialized audit_enabled=0 res=1
Dec 02 06:43:59 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 02 06:43:59 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 02 06:43:59 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 02 06:43:59 localhost kernel: cpuidle: using governor menu
Dec 02 06:43:59 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Dec 02 06:43:59 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 02 06:43:59 localhost kernel: PCI: Using configuration type 1 for base access
Dec 02 06:43:59 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 02 06:43:59 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 02 06:43:59 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Dec 02 06:43:59 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Dec 02 06:43:59 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Dec 02 06:43:59 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 02 06:43:59 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 02 06:43:59 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 02 06:43:59 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 02 06:43:59 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 02 06:43:59 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Dec 02 06:43:59 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Dec 02 06:43:59 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Dec 02 06:43:59 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 02 06:43:59 localhost kernel: ACPI: Interpreter enabled
Dec 02 06:43:59 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 02 06:43:59 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 02 06:43:59 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 02 06:43:59 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 02 06:43:59 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 02 06:43:59 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 02 06:43:59 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [3] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [4] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [5] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [6] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [7] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [8] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [9] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [10] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [11] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [12] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [13] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [14] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [15] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [16] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [17] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [18] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [19] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [20] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [21] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [22] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [23] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [24] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [25] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [26] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [27] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [28] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [29] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [30] registered
Dec 02 06:43:59 localhost kernel: acpiphp: Slot [31] registered
Dec 02 06:43:59 localhost kernel: PCI host bridge to bus 0000:00
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 02 06:43:59 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 02 06:43:59 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Dec 02 06:43:59 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Dec 02 06:43:59 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Dec 02 06:43:59 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 02 06:43:59 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Dec 02 06:43:59 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Dec 02 06:43:59 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Dec 02 06:43:59 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Dec 02 06:43:59 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 02 06:43:59 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Dec 02 06:43:59 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Dec 02 06:43:59 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 02 06:43:59 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Dec 02 06:43:59 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Dec 02 06:43:59 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 02 06:43:59 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 02 06:43:59 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 02 06:43:59 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 02 06:43:59 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 02 06:43:59 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 02 06:43:59 localhost kernel: iommu: Default domain type: Translated 
Dec 02 06:43:59 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Dec 02 06:43:59 localhost kernel: SCSI subsystem initialized
Dec 02 06:43:59 localhost kernel: ACPI: bus type USB registered
Dec 02 06:43:59 localhost kernel: usbcore: registered new interface driver usbfs
Dec 02 06:43:59 localhost kernel: usbcore: registered new interface driver hub
Dec 02 06:43:59 localhost kernel: usbcore: registered new device driver usb
Dec 02 06:43:59 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 02 06:43:59 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 02 06:43:59 localhost kernel: PTP clock support registered
Dec 02 06:43:59 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 02 06:43:59 localhost kernel: NetLabel: Initializing
Dec 02 06:43:59 localhost kernel: NetLabel:  domain hash size = 128
Dec 02 06:43:59 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 02 06:43:59 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 02 06:43:59 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 02 06:43:59 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 02 06:43:59 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 02 06:43:59 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 02 06:43:59 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 02 06:43:59 localhost kernel: vgaarb: loaded
Dec 02 06:43:59 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 02 06:43:59 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 02 06:43:59 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 02 06:43:59 localhost kernel: pnp: PnP ACPI init
Dec 02 06:43:59 localhost kernel: pnp 00:03: [dma 2]
Dec 02 06:43:59 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 02 06:43:59 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 02 06:43:59 localhost kernel: NET: Registered PF_INET protocol family
Dec 02 06:43:59 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 02 06:43:59 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Dec 02 06:43:59 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 02 06:43:59 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 02 06:43:59 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 02 06:43:59 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Dec 02 06:43:59 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Dec 02 06:43:59 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 02 06:43:59 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 02 06:43:59 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 02 06:43:59 localhost kernel: NET: Registered PF_XDP protocol family
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 02 06:43:59 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 02 06:43:59 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 02 06:43:59 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 02 06:43:59 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 29760 usecs
Dec 02 06:43:59 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 02 06:43:59 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 02 06:43:59 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 02 06:43:59 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 02 06:43:59 localhost kernel: ACPI: bus type thunderbolt registered
Dec 02 06:43:59 localhost kernel: Initialise system trusted keyrings
Dec 02 06:43:59 localhost kernel: Key type blacklist registered
Dec 02 06:43:59 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Dec 02 06:43:59 localhost kernel: zbud: loaded
Dec 02 06:43:59 localhost kernel: integrity: Platform Keyring initialized
Dec 02 06:43:59 localhost kernel: NET: Registered PF_ALG protocol family
Dec 02 06:43:59 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 02 06:43:59 localhost kernel: Key type asymmetric registered
Dec 02 06:43:59 localhost kernel: Asymmetric key parser 'x509' registered
Dec 02 06:43:59 localhost kernel: Running certificate verification selftests
Dec 02 06:43:59 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 02 06:43:59 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 02 06:43:59 localhost kernel: io scheduler mq-deadline registered
Dec 02 06:43:59 localhost kernel: io scheduler kyber registered
Dec 02 06:43:59 localhost kernel: io scheduler bfq registered
Dec 02 06:43:59 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 02 06:43:59 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 02 06:43:59 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 02 06:43:59 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 02 06:43:59 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 02 06:43:59 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 02 06:43:59 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 02 06:43:59 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 02 06:43:59 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 02 06:43:59 localhost kernel: Non-volatile memory driver v1.3
Dec 02 06:43:59 localhost kernel: rdac: device handler registered
Dec 02 06:43:59 localhost kernel: hp_sw: device handler registered
Dec 02 06:43:59 localhost kernel: emc: device handler registered
Dec 02 06:43:59 localhost kernel: alua: device handler registered
Dec 02 06:43:59 localhost kernel: libphy: Fixed MDIO Bus: probed
Dec 02 06:43:59 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Dec 02 06:43:59 localhost kernel: ehci-pci: EHCI PCI platform driver
Dec 02 06:43:59 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Dec 02 06:43:59 localhost kernel: ohci-pci: OHCI PCI platform driver
Dec 02 06:43:59 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Dec 02 06:43:59 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 02 06:43:59 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 02 06:43:59 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 02 06:43:59 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 02 06:43:59 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 02 06:43:59 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 02 06:43:59 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 02 06:43:59 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Dec 02 06:43:59 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 02 06:43:59 localhost kernel: hub 1-0:1.0: USB hub found
Dec 02 06:43:59 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 02 06:43:59 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 02 06:43:59 localhost kernel: usbserial: USB Serial support registered for generic
Dec 02 06:43:59 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 02 06:43:59 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 02 06:43:59 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 02 06:43:59 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 02 06:43:59 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 02 06:43:59 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 02 06:43:59 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 02 06:43:59 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-02T06:43:58 UTC (1764657838)
Dec 02 06:43:59 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 02 06:43:59 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 02 06:43:59 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 02 06:43:59 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 02 06:43:59 localhost kernel: usbcore: registered new interface driver usbhid
Dec 02 06:43:59 localhost kernel: usbhid: USB HID core driver
Dec 02 06:43:59 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 02 06:43:59 localhost kernel: Initializing XFRM netlink socket
Dec 02 06:43:59 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 02 06:43:59 localhost kernel: Segment Routing with IPv6
Dec 02 06:43:59 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 02 06:43:59 localhost kernel: mpls_gso: MPLS GSO support
Dec 02 06:43:59 localhost kernel: IPI shorthand broadcast: enabled
Dec 02 06:43:59 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 02 06:43:59 localhost kernel: AES CTR mode by8 optimization enabled
Dec 02 06:43:59 localhost kernel: sched_clock: Marking stable (791422453, 183764266)->(1104752934, -129566215)
Dec 02 06:43:59 localhost kernel: registered taskstats version 1
Dec 02 06:43:59 localhost kernel: Loading compiled-in X.509 certificates
Dec 02 06:43:59 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 02 06:43:59 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 02 06:43:59 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 02 06:43:59 localhost kernel: zswap: loaded using pool lzo/zbud
Dec 02 06:43:59 localhost kernel: page_owner is disabled
Dec 02 06:43:59 localhost kernel: Key type big_key registered
Dec 02 06:43:59 localhost kernel: Freeing initrd memory: 74232K
Dec 02 06:43:59 localhost kernel: Key type encrypted registered
Dec 02 06:43:59 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 02 06:43:59 localhost kernel: Loading compiled-in module X.509 certificates
Dec 02 06:43:59 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 02 06:43:59 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 02 06:43:59 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 02 06:43:59 localhost kernel: ima: No architecture policies found
Dec 02 06:43:59 localhost kernel: evm: Initialising EVM extended attributes:
Dec 02 06:43:59 localhost kernel: evm: security.selinux
Dec 02 06:43:59 localhost kernel: evm: security.SMACK64 (disabled)
Dec 02 06:43:59 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 02 06:43:59 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 02 06:43:59 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 02 06:43:59 localhost kernel: evm: security.apparmor (disabled)
Dec 02 06:43:59 localhost kernel: evm: security.ima
Dec 02 06:43:59 localhost kernel: evm: security.capability
Dec 02 06:43:59 localhost kernel: evm: HMAC attrs: 0x1
Dec 02 06:43:59 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 02 06:43:59 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 02 06:43:59 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 02 06:43:59 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 02 06:43:59 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 02 06:43:59 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 02 06:43:59 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 02 06:43:59 localhost kernel: Freeing unused decrypted memory: 2036K
Dec 02 06:43:59 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Dec 02 06:43:59 localhost kernel: Write protecting the kernel read-only data: 26624k
Dec 02 06:43:59 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Dec 02 06:43:59 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Dec 02 06:43:59 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 02 06:43:59 localhost kernel: Run /init as init process
Dec 02 06:43:59 localhost kernel:   with arguments:
Dec 02 06:43:59 localhost kernel:     /init
Dec 02 06:43:59 localhost kernel:   with environment:
Dec 02 06:43:59 localhost kernel:     HOME=/
Dec 02 06:43:59 localhost kernel:     TERM=linux
Dec 02 06:43:59 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Dec 02 06:43:59 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 02 06:43:59 localhost systemd[1]: Detected virtualization kvm.
Dec 02 06:43:59 localhost systemd[1]: Detected architecture x86-64.
Dec 02 06:43:59 localhost systemd[1]: Running in initrd.
Dec 02 06:43:59 localhost systemd[1]: No hostname configured, using default hostname.
Dec 02 06:43:59 localhost systemd[1]: Hostname set to <localhost>.
Dec 02 06:43:59 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 02 06:43:59 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 02 06:43:59 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 02 06:43:59 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 02 06:43:59 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 02 06:43:59 localhost systemd[1]: Reached target Local File Systems.
Dec 02 06:43:59 localhost systemd[1]: Reached target Path Units.
Dec 02 06:43:59 localhost systemd[1]: Reached target Slice Units.
Dec 02 06:43:59 localhost systemd[1]: Reached target Swaps.
Dec 02 06:43:59 localhost systemd[1]: Reached target Timer Units.
Dec 02 06:43:59 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 02 06:43:59 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 02 06:43:59 localhost systemd[1]: Listening on Journal Socket.
Dec 02 06:43:59 localhost systemd[1]: Listening on udev Control Socket.
Dec 02 06:43:59 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 02 06:43:59 localhost systemd[1]: Reached target Socket Units.
Dec 02 06:43:59 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 02 06:43:59 localhost systemd[1]: Starting Journal Service...
Dec 02 06:43:59 localhost systemd[1]: Starting Load Kernel Modules...
Dec 02 06:43:59 localhost systemd[1]: Starting Create System Users...
Dec 02 06:43:59 localhost systemd[1]: Starting Setup Virtual Console...
Dec 02 06:43:59 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 02 06:43:59 localhost systemd-journald[284]: Journal started
Dec 02 06:43:59 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/f041467c26d044b9832e8db5f9b7a49d) is 8.0M, max 314.7M, 306.7M free.
Dec 02 06:43:59 localhost systemd-modules-load[285]: Module 'msr' is built in
Dec 02 06:43:59 localhost systemd[1]: Started Journal Service.
Dec 02 06:43:59 localhost systemd[1]: Finished Load Kernel Modules.
Dec 02 06:43:59 localhost systemd[1]: Finished Setup Virtual Console.
Dec 02 06:43:59 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 02 06:43:59 localhost systemd[1]: Starting dracut cmdline hook...
Dec 02 06:43:59 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 02 06:43:59 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Dec 02 06:43:59 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Dec 02 06:43:59 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Dec 02 06:43:59 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 02 06:43:59 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 02 06:43:59 localhost systemd[1]: Finished Create System Users.
Dec 02 06:43:59 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 02 06:43:59 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 02 06:43:59 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Dec 02 06:43:59 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 02 06:43:59 localhost dracut-cmdline[289]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 02 06:43:59 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 02 06:43:59 localhost systemd[1]: Finished dracut cmdline hook.
Dec 02 06:43:59 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 02 06:43:59 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 02 06:43:59 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 02 06:43:59 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Dec 02 06:43:59 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 02 06:43:59 localhost kernel: RPC: Registered udp transport module.
Dec 02 06:43:59 localhost kernel: RPC: Registered tcp transport module.
Dec 02 06:43:59 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 02 06:43:59 localhost rpc.statd[409]: Version 2.5.4 starting
Dec 02 06:43:59 localhost rpc.statd[409]: Initializing NSM state
Dec 02 06:43:59 localhost rpc.idmapd[414]: Setting log level to 0
Dec 02 06:43:59 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 02 06:43:59 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 02 06:43:59 localhost systemd-udevd[427]: Using default interface naming scheme 'rhel-9.0'.
Dec 02 06:43:59 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 02 06:43:59 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 02 06:43:59 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 02 06:43:59 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 02 06:43:59 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 02 06:43:59 localhost systemd[1]: Reached target System Initialization.
Dec 02 06:43:59 localhost systemd[1]: Reached target Basic System.
Dec 02 06:43:59 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 02 06:43:59 localhost systemd[1]: Reached target Network.
Dec 02 06:43:59 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 02 06:43:59 localhost systemd[1]: Starting dracut initqueue hook...
Dec 02 06:44:00 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Dec 02 06:44:00 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 02 06:44:00 localhost kernel: GPT:20971519 != 838860799
Dec 02 06:44:00 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 02 06:44:00 localhost kernel: GPT:20971519 != 838860799
Dec 02 06:44:00 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 02 06:44:00 localhost kernel:  vda: vda1 vda2 vda3 vda4
Dec 02 06:44:00 localhost kernel: libata version 3.00 loaded.
Dec 02 06:44:00 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 02 06:44:00 localhost kernel: scsi host0: ata_piix
Dec 02 06:44:00 localhost kernel: scsi host1: ata_piix
Dec 02 06:44:00 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Dec 02 06:44:00 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Dec 02 06:44:00 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 02 06:44:00 localhost systemd-udevd[464]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 06:44:00 localhost systemd[1]: Reached target Initrd Root Device.
Dec 02 06:44:00 localhost kernel: ata1: found unknown device (class 0)
Dec 02 06:44:00 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 02 06:44:00 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 02 06:44:00 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 02 06:44:00 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 02 06:44:00 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 02 06:44:00 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 02 06:44:00 localhost systemd[1]: Finished dracut initqueue hook.
Dec 02 06:44:00 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 02 06:44:00 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 02 06:44:00 localhost systemd[1]: Reached target Remote File Systems.
Dec 02 06:44:00 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 02 06:44:00 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 02 06:44:00 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Dec 02 06:44:00 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Dec 02 06:44:00 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 02 06:44:00 localhost systemd[1]: Mounting /sysroot...
Dec 02 06:44:00 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 02 06:44:00 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Dec 02 06:44:00 localhost kernel: XFS (vda4): Ending clean mount
Dec 02 06:44:00 localhost systemd[1]: Mounted /sysroot.
Dec 02 06:44:00 localhost systemd[1]: Reached target Initrd Root File System.
Dec 02 06:44:00 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 02 06:44:00 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 02 06:44:00 localhost systemd[1]: Reached target Initrd File Systems.
Dec 02 06:44:00 localhost systemd[1]: Reached target Initrd Default Target.
Dec 02 06:44:00 localhost systemd[1]: Starting dracut mount hook...
Dec 02 06:44:00 localhost systemd[1]: Finished dracut mount hook.
Dec 02 06:44:00 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 02 06:44:00 localhost rpc.idmapd[414]: exiting on signal 15
Dec 02 06:44:00 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 02 06:44:00 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 02 06:44:00 localhost systemd[1]: Stopped target Network.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Timer Units.
Dec 02 06:44:00 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 02 06:44:00 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Basic System.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Path Units.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Remote File Systems.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Slice Units.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Socket Units.
Dec 02 06:44:00 localhost systemd[1]: Stopped target System Initialization.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Local File Systems.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Swaps.
Dec 02 06:44:00 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped dracut mount hook.
Dec 02 06:44:00 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 02 06:44:00 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 02 06:44:00 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 02 06:44:00 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 02 06:44:00 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 02 06:44:00 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 02 06:44:00 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 02 06:44:00 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 02 06:44:00 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 02 06:44:00 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 02 06:44:00 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 02 06:44:00 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 02 06:44:00 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Closed udev Control Socket.
Dec 02 06:44:00 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Closed udev Kernel Socket.
Dec 02 06:44:00 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 02 06:44:00 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 02 06:44:00 localhost systemd[1]: Starting Cleanup udev Database...
Dec 02 06:44:00 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 02 06:44:00 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 02 06:44:00 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Stopped Create System Users.
Dec 02 06:44:00 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 02 06:44:00 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 02 06:44:00 localhost systemd[1]: Finished Cleanup udev Database.
Dec 02 06:44:00 localhost systemd[1]: Reached target Switch Root.
Dec 02 06:44:00 localhost systemd[1]: Starting Switch Root...
Dec 02 06:44:00 localhost systemd[1]: Switching root.
Dec 02 06:44:01 localhost systemd-journald[284]: Journal stopped
Dec 02 06:44:01 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Dec 02 06:44:01 localhost kernel: audit: type=1404 audit(1764657841.061:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 02 06:44:01 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 06:44:01 localhost kernel: SELinux:  policy capability open_perms=1
Dec 02 06:44:01 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 06:44:01 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 02 06:44:01 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 06:44:01 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 06:44:01 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 06:44:01 localhost kernel: audit: type=1403 audit(1764657841.183:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 02 06:44:01 localhost systemd[1]: Successfully loaded SELinux policy in 125.176ms.
Dec 02 06:44:01 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 24.640ms.
Dec 02 06:44:01 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 02 06:44:01 localhost systemd[1]: Detected virtualization kvm.
Dec 02 06:44:01 localhost systemd[1]: Detected architecture x86-64.
Dec 02 06:44:01 localhost systemd-rc-local-generator[583]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 06:44:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 06:44:01 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 02 06:44:01 localhost systemd[1]: Stopped Switch Root.
Dec 02 06:44:01 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 02 06:44:01 localhost systemd[1]: Created slice Slice /system/getty.
Dec 02 06:44:01 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 02 06:44:01 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 02 06:44:01 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 02 06:44:01 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Dec 02 06:44:01 localhost systemd[1]: Created slice User and Session Slice.
Dec 02 06:44:01 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 02 06:44:01 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 02 06:44:01 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 02 06:44:01 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 02 06:44:01 localhost systemd[1]: Stopped target Switch Root.
Dec 02 06:44:01 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 02 06:44:01 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 02 06:44:01 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 02 06:44:01 localhost systemd[1]: Reached target Path Units.
Dec 02 06:44:01 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 02 06:44:01 localhost systemd[1]: Reached target Slice Units.
Dec 02 06:44:01 localhost systemd[1]: Reached target Swaps.
Dec 02 06:44:01 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 02 06:44:01 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 02 06:44:01 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 02 06:44:01 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 02 06:44:01 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 02 06:44:01 localhost systemd[1]: Listening on udev Control Socket.
Dec 02 06:44:01 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 02 06:44:01 localhost systemd[1]: Mounting Huge Pages File System...
Dec 02 06:44:01 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 02 06:44:01 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 02 06:44:01 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 02 06:44:01 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 02 06:44:01 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 02 06:44:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 02 06:44:01 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 02 06:44:01 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 02 06:44:01 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 02 06:44:01 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 02 06:44:01 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 02 06:44:01 localhost systemd[1]: Stopped Journal Service.
Dec 02 06:44:01 localhost kernel: fuse: init (API version 7.36)
Dec 02 06:44:01 localhost systemd[1]: Starting Journal Service...
Dec 02 06:44:01 localhost systemd[1]: Starting Load Kernel Modules...
Dec 02 06:44:01 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 02 06:44:01 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 02 06:44:01 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 02 06:44:01 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 02 06:44:01 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 02 06:44:01 localhost systemd[1]: Mounted Huge Pages File System.
Dec 02 06:44:01 localhost systemd-journald[619]: Journal started
Dec 02 06:44:01 localhost systemd-journald[619]: Runtime Journal (/run/log/journal/510530184876bdc0ebb29e7199f63471) is 8.0M, max 314.7M, 306.7M free.
Dec 02 06:44:01 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 02 06:44:01 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 02 06:44:01 localhost systemd-modules-load[620]: Module 'msr' is built in
Dec 02 06:44:01 localhost systemd[1]: Started Journal Service.
Dec 02 06:44:01 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 02 06:44:01 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 02 06:44:01 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 02 06:44:01 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 02 06:44:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 02 06:44:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 02 06:44:01 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 02 06:44:01 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 02 06:44:01 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 02 06:44:01 localhost kernel: ACPI: bus type drm_connector registered
Dec 02 06:44:01 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 02 06:44:01 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 02 06:44:01 localhost systemd[1]: Finished Load Kernel Modules.
Dec 02 06:44:01 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 02 06:44:01 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 02 06:44:01 localhost systemd[1]: Mounting FUSE Control File System...
Dec 02 06:44:01 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 02 06:44:01 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 02 06:44:01 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 02 06:44:01 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 02 06:44:01 localhost systemd[1]: Starting Load/Save Random Seed...
Dec 02 06:44:01 localhost systemd-journald[619]: Runtime Journal (/run/log/journal/510530184876bdc0ebb29e7199f63471) is 8.0M, max 314.7M, 306.7M free.
Dec 02 06:44:01 localhost systemd-journald[619]: Received client request to flush runtime journal.
Dec 02 06:44:01 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 02 06:44:01 localhost systemd[1]: Starting Create System Users...
Dec 02 06:44:01 localhost systemd[1]: Mounted FUSE Control File System.
Dec 02 06:44:01 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 02 06:44:01 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 02 06:44:01 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 02 06:44:01 localhost systemd[1]: Finished Load/Save Random Seed.
Dec 02 06:44:01 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 02 06:44:01 localhost systemd-sysusers[633]: Creating group 'sgx' with GID 989.
Dec 02 06:44:01 localhost systemd-sysusers[633]: Creating group 'systemd-oom' with GID 988.
Dec 02 06:44:01 localhost systemd-sysusers[633]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Dec 02 06:44:01 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 02 06:44:01 localhost systemd[1]: Finished Create System Users.
Dec 02 06:44:01 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 02 06:44:01 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 02 06:44:01 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 02 06:44:01 localhost systemd[1]: Set up automount EFI System Partition Automount.
Dec 02 06:44:02 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 02 06:44:02 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 02 06:44:02 localhost systemd-udevd[637]: Using default interface naming scheme 'rhel-9.0'.
Dec 02 06:44:02 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 02 06:44:02 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 02 06:44:02 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 02 06:44:02 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 02 06:44:02 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 02 06:44:02 localhost systemd-udevd[645]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 06:44:02 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Dec 02 06:44:02 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Dec 02 06:44:02 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Dec 02 06:44:02 localhost systemd[1]: Mounting /boot...
Dec 02 06:44:02 localhost systemd-fsck[678]: fsck.fat 4.2 (2021-01-31)
Dec 02 06:44:02 localhost systemd-fsck[678]: /dev/vda2: 12 files, 1782/51145 clusters
Dec 02 06:44:02 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Dec 02 06:44:02 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Dec 02 06:44:02 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 02 06:44:02 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 02 06:44:02 localhost kernel: XFS (vda3): Ending clean mount
Dec 02 06:44:02 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Dec 02 06:44:02 localhost systemd[1]: Mounted /boot.
Dec 02 06:44:02 localhost systemd[1]: Mounting /boot/efi...
Dec 02 06:44:02 localhost systemd[1]: Mounted /boot/efi.
Dec 02 06:44:02 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 02 06:44:02 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 02 06:44:02 localhost systemd[1]: Reached target Local File Systems.
Dec 02 06:44:02 localhost kernel: Console: switching to colour dummy device 80x25
Dec 02 06:44:02 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 02 06:44:02 localhost kernel: [drm] features: -context_init
Dec 02 06:44:02 localhost kernel: [drm] number of scanouts: 1
Dec 02 06:44:02 localhost kernel: [drm] number of cap sets: 0
Dec 02 06:44:02 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Dec 02 06:44:02 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Dec 02 06:44:02 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 02 06:44:02 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 02 06:44:02 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 02 06:44:02 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 02 06:44:02 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 02 06:44:02 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 06:44:02 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 02 06:44:02 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 02 06:44:02 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 02 06:44:02 localhost kernel: SVM: TSC scaling supported
Dec 02 06:44:02 localhost kernel: kvm: Nested Virtualization enabled
Dec 02 06:44:02 localhost kernel: SVM: kvm: Nested Paging enabled
Dec 02 06:44:02 localhost kernel: SVM: LBR virtualization supported
Dec 02 06:44:02 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 702 (bootctl)
Dec 02 06:44:02 localhost systemd[1]: Starting File System Check on /dev/vda2...
Dec 02 06:44:02 localhost systemd[1]: Finished File System Check on /dev/vda2.
Dec 02 06:44:02 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 02 06:44:02 localhost systemd[1]: Starting Security Auditing Service...
Dec 02 06:44:02 localhost systemd[1]: Starting RPC Bind...
Dec 02 06:44:02 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 02 06:44:02 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 02 06:44:02 localhost auditd[710]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Dec 02 06:44:02 localhost auditd[710]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Dec 02 06:44:02 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 02 06:44:02 localhost systemd[1]: Started RPC Bind.
Dec 02 06:44:02 localhost systemd[1]: Mounting EFI System Partition Automount...
Dec 02 06:44:02 localhost augenrules[715]: /sbin/augenrules: No change
Dec 02 06:44:02 localhost augenrules[730]: No rules
Dec 02 06:44:02 localhost augenrules[730]: enabled 1
Dec 02 06:44:02 localhost augenrules[730]: failure 1
Dec 02 06:44:02 localhost augenrules[730]: pid 710
Dec 02 06:44:02 localhost augenrules[730]: rate_limit 0
Dec 02 06:44:02 localhost augenrules[730]: backlog_limit 8192
Dec 02 06:44:02 localhost augenrules[730]: lost 0
Dec 02 06:44:02 localhost augenrules[730]: backlog 1
Dec 02 06:44:02 localhost augenrules[730]: backlog_wait_time 60000
Dec 02 06:44:02 localhost augenrules[730]: backlog_wait_time_actual 0
Dec 02 06:44:02 localhost augenrules[730]: enabled 1
Dec 02 06:44:02 localhost augenrules[730]: failure 1
Dec 02 06:44:02 localhost augenrules[730]: pid 710
Dec 02 06:44:02 localhost augenrules[730]: rate_limit 0
Dec 02 06:44:02 localhost augenrules[730]: backlog_limit 8192
Dec 02 06:44:02 localhost augenrules[730]: lost 0
Dec 02 06:44:02 localhost augenrules[730]: backlog 4
Dec 02 06:44:02 localhost augenrules[730]: backlog_wait_time 60000
Dec 02 06:44:02 localhost augenrules[730]: backlog_wait_time_actual 0
Dec 02 06:44:02 localhost augenrules[730]: enabled 1
Dec 02 06:44:02 localhost augenrules[730]: failure 1
Dec 02 06:44:02 localhost augenrules[730]: pid 710
Dec 02 06:44:02 localhost augenrules[730]: rate_limit 0
Dec 02 06:44:02 localhost augenrules[730]: backlog_limit 8192
Dec 02 06:44:02 localhost augenrules[730]: lost 0
Dec 02 06:44:02 localhost augenrules[730]: backlog 1
Dec 02 06:44:02 localhost augenrules[730]: backlog_wait_time 60000
Dec 02 06:44:02 localhost augenrules[730]: backlog_wait_time_actual 0
Dec 02 06:44:02 localhost systemd[1]: Mounted EFI System Partition Automount.
Dec 02 06:44:02 localhost systemd[1]: Started Security Auditing Service.
Dec 02 06:44:02 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 02 06:44:02 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 02 06:44:02 localhost systemd[1]: Starting Update is Completed...
Dec 02 06:44:02 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 02 06:44:02 localhost systemd[1]: Finished Update is Completed.
Dec 02 06:44:02 localhost systemd[1]: Reached target System Initialization.
Dec 02 06:44:02 localhost systemd[1]: Started dnf makecache --timer.
Dec 02 06:44:02 localhost systemd[1]: Started Daily rotation of log files.
Dec 02 06:44:02 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 02 06:44:02 localhost systemd[1]: Reached target Timer Units.
Dec 02 06:44:02 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 02 06:44:02 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 02 06:44:02 localhost systemd[1]: Reached target Socket Units.
Dec 02 06:44:02 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Dec 02 06:44:02 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 02 06:44:02 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 06:44:02 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 02 06:44:02 localhost systemd[1]: Reached target Basic System.
Dec 02 06:44:02 localhost dbus-broker-lau[742]: Ready
Dec 02 06:44:02 localhost systemd[1]: Starting NTP client/server...
Dec 02 06:44:02 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 02 06:44:02 localhost systemd[1]: Started irqbalance daemon.
Dec 02 06:44:02 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 02 06:44:02 localhost systemd[1]: Starting System Logging Service...
Dec 02 06:44:02 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 06:44:02 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 06:44:02 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 06:44:02 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 02 06:44:02 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 02 06:44:02 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 02 06:44:02 localhost systemd[1]: Starting User Login Management...
Dec 02 06:44:02 localhost systemd[1]: Started System Logging Service.
Dec 02 06:44:02 localhost rsyslogd[754]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="754" x-info="https://www.rsyslog.com"] start
Dec 02 06:44:02 localhost rsyslogd[754]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Dec 02 06:44:02 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 02 06:44:02 localhost chronyd[763]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 02 06:44:02 localhost chronyd[763]: Using right/UTC timezone to obtain leap second data
Dec 02 06:44:02 localhost chronyd[763]: Loaded seccomp filter (level 2)
Dec 02 06:44:02 localhost systemd[1]: Started NTP client/server.
Dec 02 06:44:02 localhost systemd-logind[757]: New seat seat0.
Dec 02 06:44:02 localhost systemd-logind[757]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 02 06:44:02 localhost systemd-logind[757]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 02 06:44:02 localhost systemd[1]: Started User Login Management.
Dec 02 06:44:02 localhost rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 06:44:03 localhost cloud-init[771]: Cloud-init v. 22.1-9.el9 running 'init-local' at Tue, 02 Dec 2025 06:44:03 +0000. Up 5.35 seconds.
Dec 02 06:44:03 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 02 06:44:03 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 02 06:44:03 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpibe937xh.mount: Deactivated successfully.
Dec 02 06:44:03 localhost systemd[1]: Starting Hostname Service...
Dec 02 06:44:03 localhost systemd[1]: Started Hostname Service.
Dec 02 06:44:03 np0005541913.novalocal systemd-hostnamed[785]: Hostname set to <np0005541913.novalocal> (static)
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Reached target Preparation for Network.
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Starting Network Manager...
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5670] NetworkManager (version 1.42.2-1.el9) is starting... (boot:15f9a460-af10-408f-9e3d-85f564c0683d)
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5676] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Started Network Manager.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5700] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Reached target Network.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5744] manager[0x5605b46de020]: monitoring kernel firmware directory '/lib/firmware'.
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5790] hostname: hostname: using hostnamed
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5791] hostname: static hostname changed from (none) to "np0005541913.novalocal"
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5798] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5922] manager[0x5605b46de020]: rfkill: Wi-Fi hardware radio set enabled
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5928] manager[0x5605b46de020]: rfkill: WWAN hardware radio set enabled
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5993] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5994] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5997] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.5998] manager: Networking is enabled by state file
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6014] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6015] settings: Loaded settings plugin: keyfile (internal)
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6044] dhcp: init: Using DHCP client 'internal'
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6052] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6065] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6071] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6079] device (lo): Activation: starting connection 'lo' (013d3e5c-fa64-43d6-9ac0-206896105ec9)
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6090] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6095] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6126] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6140] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6142] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6145] device (eth0): carrier: link connected
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6150] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6156] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6192] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6197] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6197] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6199] manager: NetworkManager state is now CONNECTING
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6206] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6212] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6215] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Reached target NFS client services.
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6283] dhcp4 (eth0): state changed new lease, address=38.102.83.144
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Reached target Remote File Systems.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6292] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6333] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6338] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6339] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6346] device (lo): Activation: successful, device activated.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6353] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6355] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6357] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6358] device (eth0): Activation: successful, device activated.
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6363] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 02 06:44:03 np0005541913.novalocal NetworkManager[790]: <info>  [1764657843.6365] manager: startup complete
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 02 06:44:03 np0005541913.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: Cloud-init v. 22.1-9.el9 running 'init' at Tue, 02 Dec 2025 06:44:03 +0000. Up 6.06 seconds.
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |  eth0  | True |        38.102.83.144         | 255.255.255.0 | global | fa:16:3e:3f:40:cc |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |  eth0  | True | fe80::f816:3eff:fe3f:40cc/64 |       .       |  link  | fa:16:3e:3f:40:cc |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 02 06:44:03 np0005541913.novalocal cloud-init[957]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 02 06:44:04 np0005541913.novalocal cloud-init[957]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 06:44:04 np0005541913.novalocal systemd[1]: Starting Authorization Manager...
Dec 02 06:44:04 np0005541913.novalocal polkitd[1037]: Started polkitd version 0.117
Dec 02 06:44:04 np0005541913.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Dec 02 06:44:04 np0005541913.novalocal polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 06:44:04 np0005541913.novalocal polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 06:44:04 np0005541913.novalocal polkitd[1037]: Finished loading, compiling and executing 4 rules
Dec 02 06:44:04 np0005541913.novalocal systemd[1]: Started Authorization Manager.
Dec 02 06:44:04 np0005541913.novalocal polkitd[1037]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 02 06:44:05 np0005541913.novalocal useradd[1116]: new group: name=cloud-user, GID=1001
Dec 02 06:44:05 np0005541913.novalocal useradd[1116]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 02 06:44:05 np0005541913.novalocal useradd[1116]: add 'cloud-user' to group 'adm'
Dec 02 06:44:05 np0005541913.novalocal useradd[1116]: add 'cloud-user' to group 'systemd-journal'
Dec 02 06:44:05 np0005541913.novalocal useradd[1116]: add 'cloud-user' to shadow group 'adm'
Dec 02 06:44:05 np0005541913.novalocal useradd[1116]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Generating public/private rsa key pair.
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: The key fingerprint is:
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: SHA256:PnFcFMPZH676c2vQCBk+PI38fOzVY44G0MSIBRfxQ4s root@np0005541913.novalocal
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: The key's randomart image is:
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: +---[RSA 3072]----+
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |       .+==o++   |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |       ...+=+... |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |         E*+* ...|
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |         o %.. ..|
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |        S + * = .|
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |       . o . * *o|
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |        o   o B o|
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |         . . + = |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |            o.+..|
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: +----[SHA256]-----+
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Generating public/private ecdsa key pair.
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: The key fingerprint is:
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: SHA256:6iMR4dY6LoVZ8ymMJCbx8nHStl9AiNZS/eSMUs4v7Xg root@np0005541913.novalocal
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: The key's randomart image is:
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: +---[ECDSA 256]---+
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |   +.o           |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |. + + + .        |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: | + + B *         |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |+.= @ * +        |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |o= % B =S        |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |  = O +.+        |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |   o =.=         |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |  . o.+ E        |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |   . ..o         |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: +----[SHA256]-----+
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Generating public/private ed25519 key pair.
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: The key fingerprint is:
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: SHA256:aGQRCWtPRCvEgwMYFOhpjkXRq2xRH4nHoTwtqSvGbEw root@np0005541913.novalocal
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: The key's randomart image is:
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: +--[ED25519 256]--+
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |*=o=o=*=         |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |o +o***o         |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |...oX=*.         |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: | +oo.O..         |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |+E.o  + S        |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |*.+. .           |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |.B.              |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |o.               |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: |                 |
Dec 02 06:44:07 np0005541913.novalocal cloud-init[957]: +----[SHA256]-----+
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Reached target Network is Online.
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Starting Permit User Sessions...
Dec 02 06:44:07 np0005541913.novalocal sm-notify[1129]: Version 2.5.4 starting
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 02 06:44:07 np0005541913.novalocal sshd[1130]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Finished Permit User Sessions.
Dec 02 06:44:07 np0005541913.novalocal sshd[1130]: Server listening on 0.0.0.0 port 22.
Dec 02 06:44:07 np0005541913.novalocal sshd[1130]: Server listening on :: port 22.
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Started Command Scheduler.
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Started Getty on tty1.
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Reached target Login Prompts.
Dec 02 06:44:07 np0005541913.novalocal crond[1134]: (CRON) STARTUP (1.5.7)
Dec 02 06:44:07 np0005541913.novalocal crond[1134]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 02 06:44:07 np0005541913.novalocal crond[1134]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 88% if used.)
Dec 02 06:44:07 np0005541913.novalocal crond[1134]: (CRON) INFO (running with inotify support)
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Reached target Multi-User System.
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 02 06:44:07 np0005541913.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 02 06:44:08 np0005541913.novalocal kdumpctl[1133]: kdump: No kdump initial ramdisk found.
Dec 02 06:44:08 np0005541913.novalocal kdumpctl[1133]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1255]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Tue, 02 Dec 2025 06:44:08 +0000. Up 10.33 seconds.
Dec 02 06:44:08 np0005541913.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Dec 02 06:44:08 np0005541913.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Dec 02 06:44:08 np0005541913.novalocal sshd[1413]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal dracut[1417]: dracut-057-21.git20230214.el9
Dec 02 06:44:08 np0005541913.novalocal sshd[1413]: Connection reset by 38.102.83.114 port 50016 [preauth]
Dec 02 06:44:08 np0005541913.novalocal sshd[1419]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal sshd[1419]: Unable to negotiate with 38.102.83.114 port 50026: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 02 06:44:08 np0005541913.novalocal sshd[1435]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal chronyd[763]: Selected source 162.159.200.123 (2.rhel.pool.ntp.org)
Dec 02 06:44:08 np0005541913.novalocal sshd[1437]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal chronyd[763]: System clock TAI offset set to 37 seconds
Dec 02 06:44:08 np0005541913.novalocal sshd[1437]: Unable to negotiate with 38.102.83.114 port 50040: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 02 06:44:08 np0005541913.novalocal sshd[1439]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal sshd[1439]: Unable to negotiate with 38.102.83.114 port 50052: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 02 06:44:08 np0005541913.novalocal sshd[1442]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal sshd[1442]: Connection closed by 38.102.83.114 port 50062 [preauth]
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Dec 02 06:44:08 np0005541913.novalocal sshd[1457]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1459]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Tue, 02 Dec 2025 06:44:08 +0000. Up 10.71 seconds.
Dec 02 06:44:08 np0005541913.novalocal sshd[1435]: Connection closed by 38.102.83.114 port 50034 [preauth]
Dec 02 06:44:08 np0005541913.novalocal sshd[1471]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal sshd[1471]: fatal: mm_answer_sign: sign: error in libcrypto
Dec 02 06:44:08 np0005541913.novalocal sshd[1521]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:08 np0005541913.novalocal sshd[1521]: Unable to negotiate with 38.102.83.114 port 50088: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1544]: #############################################################
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1546]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 02 06:44:08 np0005541913.novalocal sshd[1457]: Connection closed by 38.102.83.114 port 50068 [preauth]
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1551]: 256 SHA256:6iMR4dY6LoVZ8ymMJCbx8nHStl9AiNZS/eSMUs4v7Xg root@np0005541913.novalocal (ECDSA)
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1558]: 256 SHA256:aGQRCWtPRCvEgwMYFOhpjkXRq2xRH4nHoTwtqSvGbEw root@np0005541913.novalocal (ED25519)
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1564]: 3072 SHA256:PnFcFMPZH676c2vQCBk+PI38fOzVY44G0MSIBRfxQ4s root@np0005541913.novalocal (RSA)
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1566]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1568]: #############################################################
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 02 06:44:08 np0005541913.novalocal cloud-init[1459]: Cloud-init v. 22.1-9.el9 finished at Tue, 02 Dec 2025 06:44:08 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.98 seconds
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 02 06:44:08 np0005541913.novalocal systemd[1]: Reloading Network Manager...
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 02 06:44:08 np0005541913.novalocal NetworkManager[790]: <info>  [1764657848.8882] audit: op="reload" arg="0" pid=1666 uid=0 result="success"
Dec 02 06:44:08 np0005541913.novalocal NetworkManager[790]: <info>  [1764657848.8892] config: signal: SIGHUP (no changes from disk)
Dec 02 06:44:08 np0005541913.novalocal systemd[1]: Reloaded Network Manager.
Dec 02 06:44:08 np0005541913.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Dec 02 06:44:08 np0005541913.novalocal systemd[1]: Reached target Cloud-init target.
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 02 06:44:08 np0005541913.novalocal dracut[1420]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: memstrack is not available
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: memstrack is not available
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: *** Including module: systemd ***
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: *** Including module: systemd-initrd ***
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: *** Including module: i18n ***
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: No KEYMAP configured.
Dec 02 06:44:09 np0005541913.novalocal dracut[1420]: *** Including module: drm ***
Dec 02 06:44:10 np0005541913.novalocal dracut[1420]: *** Including module: prefixdevname ***
Dec 02 06:44:10 np0005541913.novalocal dracut[1420]: *** Including module: kernel-modules ***
Dec 02 06:44:10 np0005541913.novalocal dracut[1420]: *** Including module: kernel-modules-extra ***
Dec 02 06:44:10 np0005541913.novalocal dracut[1420]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 02 06:44:10 np0005541913.novalocal dracut[1420]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 02 06:44:10 np0005541913.novalocal dracut[1420]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 02 06:44:10 np0005541913.novalocal dracut[1420]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 02 06:44:10 np0005541913.novalocal dracut[1420]: *** Including module: qemu ***
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: *** Including module: fstab-sys ***
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: *** Including module: rootfs-block ***
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: *** Including module: terminfo ***
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: *** Including module: udev-rules ***
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: Skipping udev rule: 91-permissions.rules
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: *** Including module: virtiofs ***
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: *** Including module: dracut-systemd ***
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: *** Including module: usrmount ***
Dec 02 06:44:11 np0005541913.novalocal dracut[1420]: *** Including module: base ***
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]: *** Including module: fs-lib ***
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]: *** Including module: kdumpbase ***
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:   microcode_ctl module: mangling fw_dir
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]: *** Including module: shutdown ***
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]: *** Including module: squash ***
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]: *** Including modules done ***
Dec 02 06:44:12 np0005541913.novalocal dracut[1420]: *** Installing kernel module dependencies ***
Dec 02 06:44:13 np0005541913.novalocal dracut[1420]: *** Installing kernel module dependencies done ***
Dec 02 06:44:13 np0005541913.novalocal dracut[1420]: *** Resolving executable dependencies ***
Dec 02 06:44:13 np0005541913.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: *** Resolving executable dependencies done ***
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: *** Hardlinking files ***
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Mode:           real
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Files:          1099
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Linked:         3 files
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Compared:       0 xattrs
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Compared:       373 files
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Saved:          61.04 KiB
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Duration:       0.024060 seconds
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: *** Hardlinking files done ***
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Could not find 'strip'. Not stripping the initramfs.
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: *** Generating early-microcode cpio image ***
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: *** Constructing AuthenticAMD.bin ***
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: *** Store current command line parameters ***
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: Stored kernel commandline:
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: No dracut internal kernel commandline stored in the initramfs
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: *** Install squash loader ***
Dec 02 06:44:15 np0005541913.novalocal dracut[1420]: *** Squashing the files inside the initramfs ***
Dec 02 06:44:16 np0005541913.novalocal dracut[1420]: *** Squashing the files inside the initramfs done ***
Dec 02 06:44:16 np0005541913.novalocal dracut[1420]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Dec 02 06:44:17 np0005541913.novalocal dracut[1420]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Dec 02 06:44:17 np0005541913.novalocal kdumpctl[1133]: kdump: kexec: loaded kdump kernel
Dec 02 06:44:17 np0005541913.novalocal kdumpctl[1133]: kdump: Starting kdump: [OK]
Dec 02 06:44:17 np0005541913.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 02 06:44:17 np0005541913.novalocal systemd[1]: Startup finished in 1.303s (kernel) + 1.974s (initrd) + 16.650s (userspace) = 19.928s.
Dec 02 06:44:33 np0005541913.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 06:44:34 np0005541913.novalocal sshd[4173]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:44:34 np0005541913.novalocal sshd[4173]: Accepted publickey for zuul from 38.102.83.114 port 55246 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 02 06:44:34 np0005541913.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 02 06:44:34 np0005541913.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 02 06:44:34 np0005541913.novalocal systemd-logind[757]: New session 1 of user zuul.
Dec 02 06:44:34 np0005541913.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 02 06:44:34 np0005541913.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 02 06:44:34 np0005541913.novalocal systemd[4177]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Queued start job for default target Main User Target.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Created slice User Application Slice.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Reached target Paths.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Reached target Timers.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Starting D-Bus User Message Bus Socket...
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Starting Create User's Volatile Files and Directories...
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Finished Create User's Volatile Files and Directories.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Listening on D-Bus User Message Bus Socket.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Reached target Sockets.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Reached target Basic System.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Reached target Main User Target.
Dec 02 06:44:35 np0005541913.novalocal systemd[4177]: Startup finished in 158ms.
Dec 02 06:44:35 np0005541913.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 02 06:44:35 np0005541913.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 02 06:44:35 np0005541913.novalocal sshd[4173]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 06:44:35 np0005541913.novalocal python3[4229]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 06:44:44 np0005541913.novalocal python3[4248]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 06:44:51 np0005541913.novalocal python3[4301]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 06:44:52 np0005541913.novalocal python3[4331]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 02 06:44:55 np0005541913.novalocal python3[4347]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCfcGXFPS+XIPHLw+7WTk1crQnJj1F7l/bATNqEM8HqdPREfaSIeF883HXh8Bv+rj9cjcgSPu+200+1SEsq35V+19mPwwkoxgdhfQu8jGk7vv17tL7k61zl9rWne61hn/7PnFptl+SBaMvOq/9ZdnPuMzb1YBTWbKm6kC3RPkgDUOa/BER5PJh1E6x6wYj1wRGMwVREczSSv+66aA5tTRelsFh16OXZXpq4ddoi7OeuimE3lWuMAHorxzJwF5AN+gPTgKYRkMwbMMHU4nPx7TXt5G3zjqWhmos08Xgdl+lPNHY5i463T96l4hGiycZKO4FOCq0ZMzldYkovXnyZi1CjSYUDcEn+EHIRJyZaK9ZJlJ1no5HVdwv1rwVMw4KkpZvH7HBh/iX47Wsi4qxK+L3X5hwZ7s6iSpNWeEMT5CLZsiDCkrdideFnZ8kW2jgnNIV0h+pUPISFfl1j03bjS9fHJjgl4BndVBxRJZJQf8Szyjx5WcIyBUidtYPnHzSLbmk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:44:55 np0005541913.novalocal python3[4361]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:44:57 np0005541913.novalocal python3[4420]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:44:57 np0005541913.novalocal python3[4461]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764657896.8804188-390-106100223893400/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=fa40fdabeeae48b78b01a4cbccbd42f6_id_rsa follow=False checksum=c9b7a1839a060a12dd883255955d0b791bf96d1d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:44:58 np0005541913.novalocal python3[4534]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:44:59 np0005541913.novalocal python3[4575]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764657898.641792-492-76108552797260/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=fa40fdabeeae48b78b01a4cbccbd42f6_id_rsa.pub follow=False checksum=076b8979e1bf6ba70130c32daa0e2e874f6f0bae backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:01 np0005541913.novalocal python3[4603]: ansible-ping Invoked with data=pong
Dec 02 06:45:03 np0005541913.novalocal python3[4618]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 06:45:06 np0005541913.novalocal python3[4671]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 02 06:45:09 np0005541913.novalocal python3[4693]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:09 np0005541913.novalocal python3[4707]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:09 np0005541913.novalocal python3[4721]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:10 np0005541913.novalocal python3[4735]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:10 np0005541913.novalocal python3[4749]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:11 np0005541913.novalocal python3[4763]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:13 np0005541913.novalocal sudo[4777]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvbcuychpopjzaldmkxaoubinewckbso ; /usr/bin/python3
Dec 02 06:45:13 np0005541913.novalocal sudo[4777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:13 np0005541913.novalocal python3[4779]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:13 np0005541913.novalocal sudo[4777]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:15 np0005541913.novalocal sudo[4825]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbithdbdvkkzabulwbsxlluceitzkzre ; /usr/bin/python3
Dec 02 06:45:15 np0005541913.novalocal sudo[4825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:15 np0005541913.novalocal python3[4827]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:45:15 np0005541913.novalocal sudo[4825]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:15 np0005541913.novalocal sudo[4868]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iujjiadecaavgqnzihcikediqniglcxo ; /usr/bin/python3
Dec 02 06:45:15 np0005541913.novalocal sudo[4868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:15 np0005541913.novalocal python3[4870]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764657914.9314513-101-181543977722468/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:15 np0005541913.novalocal sudo[4868]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:22 np0005541913.novalocal python3[4898]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:23 np0005541913.novalocal python3[4912]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:23 np0005541913.novalocal python3[4927]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:23 np0005541913.novalocal python3[4941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:24 np0005541913.novalocal python3[4955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:24 np0005541913.novalocal python3[4969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:24 np0005541913.novalocal python3[4983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:24 np0005541913.novalocal python3[4997]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:25 np0005541913.novalocal python3[5011]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:25 np0005541913.novalocal python3[5025]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:25 np0005541913.novalocal python3[5039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:25 np0005541913.novalocal python3[5053]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:26 np0005541913.novalocal python3[5067]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:26 np0005541913.novalocal python3[5081]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:26 np0005541913.novalocal python3[5095]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:26 np0005541913.novalocal python3[5109]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:27 np0005541913.novalocal python3[5123]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:27 np0005541913.novalocal python3[5137]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:27 np0005541913.novalocal python3[5151]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:27 np0005541913.novalocal python3[5165]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:28 np0005541913.novalocal python3[5179]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:28 np0005541913.novalocal python3[5193]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:28 np0005541913.novalocal python3[5207]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:28 np0005541913.novalocal python3[5221]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:29 np0005541913.novalocal python3[5235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:29 np0005541913.novalocal python3[5249]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 06:45:31 np0005541913.novalocal sudo[5263]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvroyvlxbqizblazjbratkbheocmwksu ; /usr/bin/python3
Dec 02 06:45:31 np0005541913.novalocal sudo[5263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:31 np0005541913.novalocal python3[5265]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 02 06:45:31 np0005541913.novalocal systemd[1]: Starting Time & Date Service...
Dec 02 06:45:31 np0005541913.novalocal systemd[1]: Started Time & Date Service.
Dec 02 06:45:31 np0005541913.novalocal systemd-timedated[5267]: Changed time zone to 'UTC' (UTC).
Dec 02 06:45:31 np0005541913.novalocal sudo[5263]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:32 np0005541913.novalocal sudo[5284]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iujcrtloxzihhibxeyyqvvwppqtihkbt ; /usr/bin/python3
Dec 02 06:45:32 np0005541913.novalocal sudo[5284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:32 np0005541913.novalocal python3[5286]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:32 np0005541913.novalocal sudo[5284]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:33 np0005541913.novalocal python3[5332]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:45:34 np0005541913.novalocal python3[5373]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764657933.5532408-496-126059622645813/source _original_basename=tmpzjyyi0by follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:35 np0005541913.novalocal python3[5433]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:45:35 np0005541913.novalocal python3[5474]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764657935.0452213-585-65096931480121/source _original_basename=tmpikcfcn7u follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:37 np0005541913.novalocal sudo[5534]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxouophhbpvqyxmyriltdbwsyzihahox ; /usr/bin/python3
Dec 02 06:45:37 np0005541913.novalocal sudo[5534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:37 np0005541913.novalocal python3[5536]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:45:37 np0005541913.novalocal sudo[5534]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:37 np0005541913.novalocal sudo[5577]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcvznuzyjaerjefknpakbepogckguqcn ; /usr/bin/python3
Dec 02 06:45:37 np0005541913.novalocal sudo[5577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:37 np0005541913.novalocal python3[5579]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764657937.3060117-728-129315833383179/source _original_basename=tmpwxtt91e9 follow=False checksum=d3787dbc1d919dd7098cc7939d07e9b9a9d1522d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:38 np0005541913.novalocal sudo[5577]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:39 np0005541913.novalocal python3[5607]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:45:39 np0005541913.novalocal python3[5623]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:45:40 np0005541913.novalocal sudo[5671]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kthgdrnsdayasdjzsitpshmeoympjdxc ; /usr/bin/python3
Dec 02 06:45:40 np0005541913.novalocal sudo[5671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:40 np0005541913.novalocal python3[5673]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:45:40 np0005541913.novalocal sudo[5671]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:40 np0005541913.novalocal sudo[5714]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogvnwrnbrjccsiiityimdzqjcydfqctn ; /usr/bin/python3
Dec 02 06:45:40 np0005541913.novalocal sudo[5714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:40 np0005541913.novalocal python3[5716]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764657940.255052-854-280053912187743/source _original_basename=tmpj8ipvftm follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:45:40 np0005541913.novalocal sudo[5714]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:42 np0005541913.novalocal sudo[5745]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvphljrxhrwxvcboqfivtdahshkgivjw ; /usr/bin/python3
Dec 02 06:45:42 np0005541913.novalocal sudo[5745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:45:42 np0005541913.novalocal python3[5747]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-2304-36f4-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:45:42 np0005541913.novalocal sudo[5745]: pam_unix(sudo:session): session closed for user root
Dec 02 06:45:43 np0005541913.novalocal python3[5765]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-2304-36f4-000000000024-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 02 06:45:45 np0005541913.novalocal python3[5783]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:46:01 np0005541913.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 02 06:46:04 np0005541913.novalocal sudo[5800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvfdsgoqirjxjheehtykualwhpwybkdh ; /usr/bin/python3
Dec 02 06:46:04 np0005541913.novalocal sudo[5800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:46:04 np0005541913.novalocal python3[5802]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:46:04 np0005541913.novalocal sudo[5800]: pam_unix(sudo:session): session closed for user root
Dec 02 06:47:04 np0005541913.novalocal sshd[4186]: Received disconnect from 38.102.83.114 port 55246:11: disconnected by user
Dec 02 06:47:04 np0005541913.novalocal sshd[4186]: Disconnected from user zuul 38.102.83.114 port 55246
Dec 02 06:47:04 np0005541913.novalocal sshd[4173]: pam_unix(sshd:session): session closed for user zuul
Dec 02 06:47:04 np0005541913.novalocal systemd-logind[757]: Session 1 logged out. Waiting for processes to exit.
Dec 02 06:47:16 np0005541913.novalocal systemd[4177]: Starting Mark boot as successful...
Dec 02 06:47:16 np0005541913.novalocal systemd[4177]: Finished Mark boot as successful.
Dec 02 06:47:24 np0005541913.novalocal chronyd[763]: Selected source 174.138.193.90 (2.rhel.pool.ntp.org)
Dec 02 06:48:04 np0005541913.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Dec 02 06:48:04 np0005541913.novalocal systemd[1]: efi.mount: Deactivated successfully.
Dec 02 06:48:04 np0005541913.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Dec 02 06:49:42 np0005541913.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Dec 02 06:49:42 np0005541913.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9238] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 02 06:49:42 np0005541913.novalocal systemd-udevd[5810]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9365] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 02 06:49:42 np0005541913.novalocal systemd[4177]: Created slice User Background Tasks Slice.
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9383] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 02 06:49:42 np0005541913.novalocal systemd[4177]: Starting Cleanup of User's Temporary Files and Directories...
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9388] device (eth1): carrier: link connected
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9390] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9394] policy: auto-activating connection 'Wired connection 1' (35782912-d644-3ee2-930b-ef582ceabd4b)
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9398] device (eth1): Activation: starting connection 'Wired connection 1' (35782912-d644-3ee2-930b-ef582ceabd4b)
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9399] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9402] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9405] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 02 06:49:42 np0005541913.novalocal NetworkManager[790]: <info>  [1764658182.9408] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 06:49:42 np0005541913.novalocal systemd[4177]: Finished Cleanup of User's Temporary Files and Directories.
Dec 02 06:49:43 np0005541913.novalocal sshd[5814]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:49:43 np0005541913.novalocal sshd[5814]: Accepted publickey for zuul from 38.102.83.114 port 35772 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 06:49:43 np0005541913.novalocal systemd-logind[757]: New session 3 of user zuul.
Dec 02 06:49:43 np0005541913.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 02 06:49:43 np0005541913.novalocal sshd[5814]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 06:49:43 np0005541913.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Dec 02 06:49:44 np0005541913.novalocal python3[5831]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-8e68-9bb8-000000000408-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:49:57 np0005541913.novalocal sudo[5879]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmguudbstbfnjwzoatobvcbypgpgbrdj ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 02 06:49:57 np0005541913.novalocal sudo[5879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:49:57 np0005541913.novalocal python3[5881]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:49:57 np0005541913.novalocal sudo[5879]: pam_unix(sudo:session): session closed for user root
Dec 02 06:49:57 np0005541913.novalocal sudo[5922]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbvfldefnbxnkrfimtqgwuqfbvfaxozn ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 02 06:49:57 np0005541913.novalocal sudo[5922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:49:57 np0005541913.novalocal python3[5924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764658197.0543106-486-144637397494422/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=d17281d2876b8cf83357ec6c9a421c589994e444 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:49:57 np0005541913.novalocal sudo[5922]: pam_unix(sudo:session): session closed for user root
Dec 02 06:49:58 np0005541913.novalocal sudo[5952]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnbdjcorjkchctlndccxlvxzfzcagxnd ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 02 06:49:58 np0005541913.novalocal sudo[5952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:49:58 np0005541913.novalocal python3[5954]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Stopping Network Manager...
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[790]: <info>  [1764658198.3387] caught SIGTERM, shutting down normally.
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[790]: <info>  [1764658198.3487] dhcp4 (eth0): canceled DHCP transaction
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[790]: <info>  [1764658198.3488] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[790]: <info>  [1764658198.3488] dhcp4 (eth0): state changed no lease
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[790]: <info>  [1764658198.3490] manager: NetworkManager state is now CONNECTING
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[790]: <info>  [1764658198.3567] dhcp4 (eth1): canceled DHCP transaction
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[790]: <info>  [1764658198.3567] dhcp4 (eth1): state changed no lease
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[790]: <info>  [1764658198.3618] exiting (success)
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Stopped Network Manager.
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: NetworkManager.service: Consumed 2.285s CPU time.
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Starting Network Manager...
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.4111] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:15f9a460-af10-408f-9e3d-85f564c0683d)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.4114] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.4141] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Started Network Manager.
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.4196] manager[0x55da7c1f2090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Starting Hostname Service...
Dec 02 06:49:58 np0005541913.novalocal sudo[5952]: pam_unix(sudo:session): session closed for user root
Dec 02 06:49:58 np0005541913.novalocal systemd[1]: Started Hostname Service.
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5004] hostname: hostname: using hostnamed
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5005] hostname: static hostname changed from (none) to "np0005541913.novalocal"
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5013] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5020] manager[0x55da7c1f2090]: rfkill: Wi-Fi hardware radio set enabled
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5020] manager[0x55da7c1f2090]: rfkill: WWAN hardware radio set enabled
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5063] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5064] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5065] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5066] manager: Networking is enabled by state file
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5074] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5075] settings: Loaded settings plugin: keyfile (internal)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5126] dhcp: init: Using DHCP client 'internal'
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5130] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5140] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5151] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5167] device (lo): Activation: starting connection 'lo' (013d3e5c-fa64-43d6-9ac0-206896105ec9)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5179] device (eth0): carrier: link connected
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5185] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5193] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5194] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5209] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5222] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5233] device (eth1): carrier: link connected
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5240] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5250] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (35782912-d644-3ee2-930b-ef582ceabd4b) (indicated)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5251] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5262] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5276] device (eth1): Activation: starting connection 'Wired connection 1' (35782912-d644-3ee2-930b-ef582ceabd4b)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5305] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5311] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5313] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5317] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5322] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5327] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5331] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5335] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5345] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5352] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5368] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5372] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5428] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5435] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5445] device (lo): Activation: successful, device activated.
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5459] dhcp4 (eth0): state changed new lease, address=38.102.83.144
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5461] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5549] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5582] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5583] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5586] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5588] device (eth0): Activation: successful, device activated.
Dec 02 06:49:58 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658198.5591] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 02 06:49:58 np0005541913.novalocal python3[6013]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-8e68-9bb8-00000000012b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:50:08 np0005541913.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 06:50:28 np0005541913.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 06:50:43 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658243.7817] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 02 06:50:43 np0005541913.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 06:50:43 np0005541913.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 06:50:43 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658243.8007] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 02 06:50:43 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658243.8012] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 02 06:50:43 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658243.8024] device (eth1): Activation: successful, device activated.
Dec 02 06:50:43 np0005541913.novalocal NetworkManager[5965]: <info>  [1764658243.8033] manager: startup complete
Dec 02 06:50:43 np0005541913.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 02 06:50:53 np0005541913.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 06:50:58 np0005541913.novalocal sshd[5817]: Received disconnect from 38.102.83.114 port 35772:11: disconnected by user
Dec 02 06:50:58 np0005541913.novalocal sshd[5817]: Disconnected from user zuul 38.102.83.114 port 35772
Dec 02 06:50:58 np0005541913.novalocal sshd[5814]: pam_unix(sshd:session): session closed for user zuul
Dec 02 06:50:58 np0005541913.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 02 06:50:58 np0005541913.novalocal systemd[1]: session-3.scope: Consumed 1.356s CPU time.
Dec 02 06:50:58 np0005541913.novalocal systemd-logind[757]: Session 3 logged out. Waiting for processes to exit.
Dec 02 06:50:58 np0005541913.novalocal systemd-logind[757]: Removed session 3.
Dec 02 06:51:41 np0005541913.novalocal sshd[6055]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:51:41 np0005541913.novalocal sshd[6055]: Accepted publickey for zuul from 38.102.83.114 port 44934 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 06:51:41 np0005541913.novalocal systemd-logind[757]: New session 4 of user zuul.
Dec 02 06:51:41 np0005541913.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 02 06:51:41 np0005541913.novalocal sshd[6055]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 06:51:41 np0005541913.novalocal sudo[6104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvojwvvzgqdjgdvdzhvyaqxidbnhnyoj ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 02 06:51:41 np0005541913.novalocal sudo[6104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:51:41 np0005541913.novalocal python3[6106]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:51:41 np0005541913.novalocal sudo[6104]: pam_unix(sudo:session): session closed for user root
Dec 02 06:51:42 np0005541913.novalocal sudo[6147]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrvwrubpnbkvhxaoxacxltbovhitmnww ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 02 06:51:42 np0005541913.novalocal sudo[6147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:51:42 np0005541913.novalocal python3[6149]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764658301.6792257-628-168368169528833/source _original_basename=tmpwb1pv6io follow=False checksum=c2b23ffe44719bb1642f7b68b2bf34d320a2a721 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:51:42 np0005541913.novalocal sudo[6147]: pam_unix(sudo:session): session closed for user root
Dec 02 06:51:45 np0005541913.novalocal sshd[6055]: pam_unix(sshd:session): session closed for user zuul
Dec 02 06:51:45 np0005541913.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 02 06:51:45 np0005541913.novalocal systemd-logind[757]: Session 4 logged out. Waiting for processes to exit.
Dec 02 06:51:45 np0005541913.novalocal systemd-logind[757]: Removed session 4.
Dec 02 06:57:38 np0005541913.novalocal sshd[6168]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:57:39 np0005541913.novalocal sshd[6168]: Accepted publickey for zuul from 38.102.83.114 port 54886 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 06:57:39 np0005541913.novalocal systemd-logind[757]: New session 5 of user zuul.
Dec 02 06:57:39 np0005541913.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 02 06:57:39 np0005541913.novalocal sshd[6168]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 06:57:39 np0005541913.novalocal sudo[6185]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vizfomethyocefnoqxsjbrgcqjpjgjdi ; /usr/bin/python3
Dec 02 06:57:39 np0005541913.novalocal sudo[6185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:39 np0005541913.novalocal python3[6187]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-e6e8-5ca8-000000001d02-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:57:39 np0005541913.novalocal sudo[6185]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:40 np0005541913.novalocal sudo[6203]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymttmygotvyeqxxxeyufknlrgmhljjql ; /usr/bin/python3
Dec 02 06:57:40 np0005541913.novalocal sudo[6203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:40 np0005541913.novalocal python3[6205]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:57:40 np0005541913.novalocal sudo[6203]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:40 np0005541913.novalocal sudo[6219]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwcevsuybgzpbgkyxsklwhighkcsheps ; /usr/bin/python3
Dec 02 06:57:40 np0005541913.novalocal sudo[6219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:40 np0005541913.novalocal python3[6221]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:57:40 np0005541913.novalocal sudo[6219]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:41 np0005541913.novalocal sudo[6235]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdpqxrrhvkmjcpykzzcsraotjufxucgk ; /usr/bin/python3
Dec 02 06:57:41 np0005541913.novalocal sudo[6235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:41 np0005541913.novalocal python3[6237]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:57:41 np0005541913.novalocal sudo[6235]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:41 np0005541913.novalocal sudo[6251]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfzcuhajevpeyyjicprapqmgcruxaixb ; /usr/bin/python3
Dec 02 06:57:41 np0005541913.novalocal sudo[6251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:41 np0005541913.novalocal python3[6253]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:57:41 np0005541913.novalocal sudo[6251]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:42 np0005541913.novalocal sudo[6268]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyuqzdaoasxesbbxrnsypzdunzsdzgfb ; /usr/bin/python3
Dec 02 06:57:42 np0005541913.novalocal sudo[6268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:42 np0005541913.novalocal python3[6270]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:57:42 np0005541913.novalocal sudo[6268]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:43 np0005541913.novalocal sudo[6316]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdbvdzwipkjtuzprudgatwluaqecdrcg ; /usr/bin/python3
Dec 02 06:57:43 np0005541913.novalocal sudo[6316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:43 np0005541913.novalocal python3[6318]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 06:57:43 np0005541913.novalocal sudo[6316]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:43 np0005541913.novalocal sudo[6359]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kswzqbmipwzsjwdakrmxsgqfkzazkgnj ; /usr/bin/python3
Dec 02 06:57:43 np0005541913.novalocal sudo[6359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:44 np0005541913.novalocal python3[6361]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764658663.2627435-645-88884789018823/source _original_basename=tmpenr4wqc1 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 06:57:44 np0005541913.novalocal sudo[6359]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:45 np0005541913.novalocal sudo[6389]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqlgpjctbctfquyjpozfbsezipdpbmzy ; /usr/bin/python3
Dec 02 06:57:45 np0005541913.novalocal sudo[6389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:45 np0005541913.novalocal python3[6391]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 06:57:45 np0005541913.novalocal systemd[1]: Reloading.
Dec 02 06:57:45 np0005541913.novalocal systemd-rc-local-generator[6407]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 06:57:45 np0005541913.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 06:57:46 np0005541913.novalocal sudo[6389]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:46 np0005541913.novalocal sudo[6435]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzqqjookijhtaufonnqozuqncelcevlj ; /usr/bin/python3
Dec 02 06:57:46 np0005541913.novalocal sudo[6435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:47 np0005541913.novalocal python3[6437]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 02 06:57:47 np0005541913.novalocal sudo[6435]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:48 np0005541913.novalocal sudo[6451]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xryjdbpaaukgcvlpezyirorvkvqlffti ; /usr/bin/python3
Dec 02 06:57:48 np0005541913.novalocal sudo[6451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:48 np0005541913.novalocal python3[6453]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:57:48 np0005541913.novalocal sudo[6451]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:48 np0005541913.novalocal sudo[6469]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfsccvevoutqnwmlswkqksucttjovnhf ; /usr/bin/python3
Dec 02 06:57:48 np0005541913.novalocal sudo[6469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:48 np0005541913.novalocal python3[6471]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:57:49 np0005541913.novalocal sudo[6469]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:49 np0005541913.novalocal sudo[6487]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwrgsrrzxrqgchwepmerxboygajqhoet ; /usr/bin/python3
Dec 02 06:57:49 np0005541913.novalocal sudo[6487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:49 np0005541913.novalocal python3[6489]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:57:49 np0005541913.novalocal sudo[6487]: pam_unix(sudo:session): session closed for user root
Dec 02 06:57:49 np0005541913.novalocal sudo[6505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohslbjvxdoevbrqaawbwfscxffncdsof ; /usr/bin/python3
Dec 02 06:57:49 np0005541913.novalocal sudo[6505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:57:49 np0005541913.novalocal python3[6507]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:57:49 np0005541913.novalocal sudo[6505]: pam_unix(sudo:session): session closed for user root
Dec 02 06:58:00 np0005541913.novalocal python3[6525]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163e3b-3c83-e6e8-5ca8-000000001d09-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:58:01 np0005541913.novalocal python3[6544]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 06:58:04 np0005541913.novalocal sshd[6168]: pam_unix(sshd:session): session closed for user zuul
Dec 02 06:58:04 np0005541913.novalocal systemd-logind[757]: Session 5 logged out. Waiting for processes to exit.
Dec 02 06:58:04 np0005541913.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 02 06:58:04 np0005541913.novalocal systemd[1]: session-5.scope: Consumed 4.079s CPU time.
Dec 02 06:58:04 np0005541913.novalocal systemd-logind[757]: Removed session 5.
Dec 02 06:59:16 np0005541913.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Dec 02 06:59:16 np0005541913.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 02 06:59:16 np0005541913.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Dec 02 06:59:16 np0005541913.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 02 06:59:18 np0005541913.novalocal sshd[6555]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 06:59:18 np0005541913.novalocal sshd[6555]: Accepted publickey for zuul from 38.102.83.114 port 36518 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 06:59:18 np0005541913.novalocal systemd-logind[757]: New session 6 of user zuul.
Dec 02 06:59:18 np0005541913.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 02 06:59:18 np0005541913.novalocal sshd[6555]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 06:59:18 np0005541913.novalocal sudo[6572]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syxjhzizuhdfhuovsekuqiymcttxtiot ; /usr/bin/python3
Dec 02 06:59:18 np0005541913.novalocal sudo[6572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:59:19 np0005541913.novalocal systemd[1]: Starting RHSM dbus service...
Dec 02 06:59:19 np0005541913.novalocal systemd[1]: Started RHSM dbus service.
Dec 02 06:59:19 np0005541913.novalocal rhsm-service[6579]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 02 06:59:19 np0005541913.novalocal rhsm-service[6579]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 02 06:59:19 np0005541913.novalocal rhsm-service[6579]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 02 06:59:19 np0005541913.novalocal rhsm-service[6579]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 02 06:59:20 np0005541913.novalocal rhsm-service[6579]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005541913.novalocal (d1b4d74d-2a0e-41d6-a299-a10b4d7396a9)
Dec 02 06:59:20 np0005541913.novalocal subscription-manager[6579]: Registered system with identity: d1b4d74d-2a0e-41d6-a299-a10b4d7396a9
Dec 02 06:59:21 np0005541913.novalocal rhsm-service[6579]:  INFO [subscription_manager.entcertlib:131] certs updated:
Dec 02 06:59:21 np0005541913.novalocal rhsm-service[6579]: Total updates: 1
Dec 02 06:59:21 np0005541913.novalocal rhsm-service[6579]: Found (local) serial# []
Dec 02 06:59:21 np0005541913.novalocal rhsm-service[6579]: Expected (UEP) serial# [5614244909064200304]
Dec 02 06:59:21 np0005541913.novalocal rhsm-service[6579]: Added (new)
Dec 02 06:59:21 np0005541913.novalocal rhsm-service[6579]:   [sn:5614244909064200304 ( Content Access,) @ /etc/pki/entitlement/5614244909064200304.pem]
Dec 02 06:59:21 np0005541913.novalocal rhsm-service[6579]: Deleted (rogue):
Dec 02 06:59:21 np0005541913.novalocal rhsm-service[6579]:   <NONE>
Dec 02 06:59:21 np0005541913.novalocal subscription-manager[6579]: Added subscription for 'Content Access' contract 'None'
Dec 02 06:59:21 np0005541913.novalocal subscription-manager[6579]: Added subscription for product ' Content Access'
Dec 02 06:59:22 np0005541913.novalocal rhsm-service[6579]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 02 06:59:22 np0005541913.novalocal rhsm-service[6579]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 02 06:59:22 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 06:59:22 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 06:59:22 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 06:59:22 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 06:59:22 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 06:59:23 np0005541913.novalocal sudo[6572]: pam_unix(sudo:session): session closed for user root
Dec 02 06:59:30 np0005541913.novalocal python3[6670]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163e3b-3c83-0809-2eed-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 06:59:31 np0005541913.novalocal sudo[6687]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-repspvqoijxaacclbwaaapwgskapccdn ; /usr/bin/python3
Dec 02 06:59:31 np0005541913.novalocal sudo[6687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 06:59:32 np0005541913.novalocal python3[6689]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:00:03 np0005541913.novalocal setsebool[6764]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 02 07:00:03 np0005541913.novalocal setsebool[6764]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 02 07:00:12 np0005541913.novalocal kernel: SELinux:  Converting 407 SID table entries...
Dec 02 07:00:12 np0005541913.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 07:00:12 np0005541913.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 07:00:12 np0005541913.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 07:00:12 np0005541913.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 07:00:12 np0005541913.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 07:00:12 np0005541913.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 07:00:12 np0005541913.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 07:00:24 np0005541913.novalocal dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Dec 02 07:00:24 np0005541913.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:00:24 np0005541913.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 02 07:00:24 np0005541913.novalocal systemd[1]: Reloading.
Dec 02 07:00:24 np0005541913.novalocal systemd-rc-local-generator[7539]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:00:24 np0005541913.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:00:24 np0005541913.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 07:00:26 np0005541913.novalocal sudo[6687]: pam_unix(sudo:session): session closed for user root
Dec 02 07:00:28 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:00:33 np0005541913.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 07:00:33 np0005541913.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 02 07:00:33 np0005541913.novalocal systemd[1]: man-db-cache-update.service: Consumed 10.463s CPU time.
Dec 02 07:00:33 np0005541913.novalocal systemd[1]: run-r0543e0d27c2d4ab895c16ea57db181eb.service: Deactivated successfully.
Dec 02 07:01:01 np0005541913.novalocal CROND[18339]: (root) CMD (run-parts /etc/cron.hourly)
Dec 02 07:01:02 np0005541913.novalocal run-parts[18342]: (/etc/cron.hourly) starting 0anacron
Dec 02 07:01:02 np0005541913.novalocal anacron[18350]: Anacron started on 2025-12-02
Dec 02 07:01:02 np0005541913.novalocal anacron[18350]: Will run job `cron.daily' in 48 min.
Dec 02 07:01:02 np0005541913.novalocal anacron[18350]: Will run job `cron.weekly' in 68 min.
Dec 02 07:01:02 np0005541913.novalocal anacron[18350]: Will run job `cron.monthly' in 88 min.
Dec 02 07:01:02 np0005541913.novalocal anacron[18350]: Jobs will be executed sequentially
Dec 02 07:01:02 np0005541913.novalocal run-parts[18352]: (/etc/cron.hourly) finished 0anacron
Dec 02 07:01:02 np0005541913.novalocal CROND[18338]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 02 07:01:26 np0005541913.novalocal sshd[6558]: Received disconnect from 38.102.83.114 port 36518:11: disconnected by user
Dec 02 07:01:26 np0005541913.novalocal sshd[6558]: Disconnected from user zuul 38.102.83.114 port 36518
Dec 02 07:01:26 np0005541913.novalocal sshd[6555]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:01:26 np0005541913.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Dec 02 07:01:26 np0005541913.novalocal systemd[1]: session-6.scope: Consumed 50.351s CPU time.
Dec 02 07:01:26 np0005541913.novalocal systemd-logind[757]: Session 6 logged out. Waiting for processes to exit.
Dec 02 07:01:26 np0005541913.novalocal systemd-logind[757]: Removed session 6.
Dec 02 07:01:31 np0005541913.novalocal sshd[18354]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:01:31 np0005541913.novalocal sshd[18354]: Accepted publickey for zuul from 38.102.83.114 port 54770 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:01:31 np0005541913.novalocal systemd-logind[757]: New session 7 of user zuul.
Dec 02 07:01:31 np0005541913.novalocal systemd[1]: Started Session 7 of User zuul.
Dec 02 07:01:31 np0005541913.novalocal sshd[18354]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:01:31 np0005541913.novalocal sudo[18371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzeaquiznexxickreawupbnsqfowdqmx ; /usr/bin/python3
Dec 02 07:01:31 np0005541913.novalocal sudo[18371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:01:31 np0005541913.novalocal podman[18374]: 2025-12-02 07:01:31.918607628 +0000 UTC m=+0.105880147 system refresh
Dec 02 07:01:32 np0005541913.novalocal sudo[18371]: pam_unix(sudo:session): session closed for user root
Dec 02 07:01:32 np0005541913.novalocal systemd[4177]: Starting D-Bus User Message Bus...
Dec 02 07:01:32 np0005541913.novalocal systemd[4177]: Started D-Bus User Message Bus.
Dec 02 07:01:32 np0005541913.novalocal dbus-broker-launch[18431]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 02 07:01:32 np0005541913.novalocal dbus-broker-launch[18431]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 02 07:01:32 np0005541913.novalocal dbus-broker-lau[18431]: Ready
Dec 02 07:01:32 np0005541913.novalocal systemd[4177]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Dec 02 07:01:32 np0005541913.novalocal systemd[4177]: Created slice Slice /user.
Dec 02 07:01:32 np0005541913.novalocal systemd[4177]: podman-18414.scope: unit configures an IP firewall, but not running as root.
Dec 02 07:01:32 np0005541913.novalocal systemd[4177]: (This warning is only shown for the first unit using IP firewalling.)
Dec 02 07:01:32 np0005541913.novalocal systemd[4177]: Started podman-18414.scope.
Dec 02 07:01:32 np0005541913.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:01:33 np0005541913.novalocal systemd[4177]: Started podman-pause-5c4ae909.scope.
Dec 02 07:01:35 np0005541913.novalocal sshd[18354]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:01:35 np0005541913.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Dec 02 07:01:35 np0005541913.novalocal systemd[1]: session-7.scope: Consumed 1.095s CPU time.
Dec 02 07:01:35 np0005541913.novalocal systemd-logind[757]: Session 7 logged out. Waiting for processes to exit.
Dec 02 07:01:35 np0005541913.novalocal systemd-logind[757]: Removed session 7.
Dec 02 07:01:51 np0005541913.novalocal sshd[18438]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:01:51 np0005541913.novalocal sshd[18434]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:01:51 np0005541913.novalocal sshd[18437]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:01:51 np0005541913.novalocal sshd[18438]: Connection closed by 38.102.83.45 port 48624 [preauth]
Dec 02 07:01:51 np0005541913.novalocal sshd[18436]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:01:51 np0005541913.novalocal sshd[18435]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:01:51 np0005541913.novalocal sshd[18434]: Unable to negotiate with 38.102.83.45 port 48634: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 02 07:01:51 np0005541913.novalocal sshd[18437]: Unable to negotiate with 38.102.83.45 port 48640: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 02 07:01:51 np0005541913.novalocal sshd[18435]: Connection closed by 38.102.83.45 port 48614 [preauth]
Dec 02 07:01:51 np0005541913.novalocal sshd[18436]: Unable to negotiate with 38.102.83.45 port 48632: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 02 07:01:56 np0005541913.novalocal sshd[18444]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:01:56 np0005541913.novalocal sshd[18444]: Accepted publickey for zuul from 38.102.83.114 port 51394 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:01:56 np0005541913.novalocal systemd-logind[757]: New session 8 of user zuul.
Dec 02 07:01:56 np0005541913.novalocal systemd[1]: Started Session 8 of User zuul.
Dec 02 07:01:56 np0005541913.novalocal sshd[18444]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:01:57 np0005541913.novalocal python3[18461]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI3vTocdvpL7KoTE0s+B2HOorkXEJmfFflLp6CHTopK26IhGD4IX+p0PXIjQjXzwbw8u6vDuDtUAlLIH4wGuE2A= zuul@np0005541906.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 07:01:57 np0005541913.novalocal sudo[18475]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtroorwobpttayizwusbszskbzycnuqm ; /usr/bin/python3
Dec 02 07:01:57 np0005541913.novalocal sudo[18475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:01:57 np0005541913.novalocal python3[18477]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI3vTocdvpL7KoTE0s+B2HOorkXEJmfFflLp6CHTopK26IhGD4IX+p0PXIjQjXzwbw8u6vDuDtUAlLIH4wGuE2A= zuul@np0005541906.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 07:01:57 np0005541913.novalocal sudo[18475]: pam_unix(sudo:session): session closed for user root
Dec 02 07:01:59 np0005541913.novalocal sshd[18444]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:01:59 np0005541913.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Dec 02 07:01:59 np0005541913.novalocal systemd-logind[757]: Session 8 logged out. Waiting for processes to exit.
Dec 02 07:01:59 np0005541913.novalocal systemd-logind[757]: Removed session 8.
Dec 02 07:03:27 np0005541913.novalocal sshd[18479]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:03:27 np0005541913.novalocal sshd[18479]: Accepted publickey for zuul from 38.102.83.114 port 57632 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:03:27 np0005541913.novalocal systemd-logind[757]: New session 9 of user zuul.
Dec 02 07:03:27 np0005541913.novalocal systemd[1]: Started Session 9 of User zuul.
Dec 02 07:03:27 np0005541913.novalocal sshd[18479]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:03:28 np0005541913.novalocal sudo[18496]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hslmzftiadbqhjwkezhjnzpohwfxbkdl ; /usr/bin/python3
Dec 02 07:03:28 np0005541913.novalocal sudo[18496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:03:28 np0005541913.novalocal python3[18498]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCfcGXFPS+XIPHLw+7WTk1crQnJj1F7l/bATNqEM8HqdPREfaSIeF883HXh8Bv+rj9cjcgSPu+200+1SEsq35V+19mPwwkoxgdhfQu8jGk7vv17tL7k61zl9rWne61hn/7PnFptl+SBaMvOq/9ZdnPuMzb1YBTWbKm6kC3RPkgDUOa/BER5PJh1E6x6wYj1wRGMwVREczSSv+66aA5tTRelsFh16OXZXpq4ddoi7OeuimE3lWuMAHorxzJwF5AN+gPTgKYRkMwbMMHU4nPx7TXt5G3zjqWhmos08Xgdl+lPNHY5i463T96l4hGiycZKO4FOCq0ZMzldYkovXnyZi1CjSYUDcEn+EHIRJyZaK9ZJlJ1no5HVdwv1rwVMw4KkpZvH7HBh/iX47Wsi4qxK+L3X5hwZ7s6iSpNWeEMT5CLZsiDCkrdideFnZ8kW2jgnNIV0h+pUPISFfl1j03bjS9fHJjgl4BndVBxRJZJQf8Szyjx5WcIyBUidtYPnHzSLbmk= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 07:03:28 np0005541913.novalocal sudo[18496]: pam_unix(sudo:session): session closed for user root
Dec 02 07:03:29 np0005541913.novalocal sudo[18512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxdizuejzczwlqistodunlxrqzfrxwqz ; /usr/bin/python3
Dec 02 07:03:29 np0005541913.novalocal sudo[18512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:03:29 np0005541913.novalocal python3[18514]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005541913.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 02 07:03:29 np0005541913.novalocal sudo[18512]: pam_unix(sudo:session): session closed for user root
Dec 02 07:03:30 np0005541913.novalocal sudo[18562]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmcppfbwsjknzwvytuozhxegbezvcodn ; /usr/bin/python3
Dec 02 07:03:30 np0005541913.novalocal sudo[18562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:03:30 np0005541913.novalocal python3[18564]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:03:30 np0005541913.novalocal sudo[18562]: pam_unix(sudo:session): session closed for user root
Dec 02 07:03:31 np0005541913.novalocal sudo[18605]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygvnyxpejnwhxqeunwrkjtjiuektbptq ; /usr/bin/python3
Dec 02 07:03:31 np0005541913.novalocal sudo[18605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:03:31 np0005541913.novalocal python3[18607]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764659010.6460295-137-171557122913719/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=fa40fdabeeae48b78b01a4cbccbd42f6_id_rsa follow=False checksum=c9b7a1839a060a12dd883255955d0b791bf96d1d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:03:31 np0005541913.novalocal sudo[18605]: pam_unix(sudo:session): session closed for user root
Dec 02 07:03:32 np0005541913.novalocal sudo[18667]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bizdwxvvoirowyyqiqytwcfsohobelzi ; /usr/bin/python3
Dec 02 07:03:32 np0005541913.novalocal sudo[18667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:03:32 np0005541913.novalocal python3[18669]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:03:32 np0005541913.novalocal sudo[18667]: pam_unix(sudo:session): session closed for user root
Dec 02 07:03:32 np0005541913.novalocal sudo[18710]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urybldcywoicixasgjzdpqakskropzyu ; /usr/bin/python3
Dec 02 07:03:32 np0005541913.novalocal sudo[18710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:03:32 np0005541913.novalocal python3[18712]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764659012.3983626-225-243319304045149/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=fa40fdabeeae48b78b01a4cbccbd42f6_id_rsa.pub follow=False checksum=076b8979e1bf6ba70130c32daa0e2e874f6f0bae backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:03:33 np0005541913.novalocal sudo[18710]: pam_unix(sudo:session): session closed for user root
Dec 02 07:03:35 np0005541913.novalocal sudo[18740]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbekqwoxmpfywpfaztgeesqhubrtldua ; /usr/bin/python3
Dec 02 07:03:35 np0005541913.novalocal sudo[18740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:03:35 np0005541913.novalocal python3[18742]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:03:35 np0005541913.novalocal sudo[18740]: pam_unix(sudo:session): session closed for user root
Dec 02 07:03:36 np0005541913.novalocal python3[18788]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:03:36 np0005541913.novalocal python3[18804]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpbwl00bfs recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:03:37 np0005541913.novalocal python3[18864]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:03:37 np0005541913.novalocal python3[18880]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmp7eh3joiw recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:03:39 np0005541913.novalocal python3[18940]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:03:39 np0005541913.novalocal python3[18956]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmplienmmx0 recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:03:40 np0005541913.novalocal sshd[18479]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:03:40 np0005541913.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Dec 02 07:03:40 np0005541913.novalocal systemd[1]: session-9.scope: Consumed 3.365s CPU time.
Dec 02 07:03:40 np0005541913.novalocal systemd-logind[757]: Session 9 logged out. Waiting for processes to exit.
Dec 02 07:03:40 np0005541913.novalocal systemd-logind[757]: Removed session 9.
Dec 02 07:05:53 np0005541913.novalocal sshd[18971]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:05:53 np0005541913.novalocal sshd[18971]: Accepted publickey for zuul from 38.102.83.45 port 55756 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:05:53 np0005541913.novalocal systemd-logind[757]: New session 10 of user zuul.
Dec 02 07:05:53 np0005541913.novalocal systemd[1]: Started Session 10 of User zuul.
Dec 02 07:05:53 np0005541913.novalocal sshd[18971]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:05:53 np0005541913.novalocal python3[19017]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:09:16 np0005541913.novalocal sshd[19022]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:09:16 np0005541913.novalocal sshd[19023]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:09:16 np0005541913.novalocal sshd[19023]: error: kex_exchange_identification: read: Connection reset by peer
Dec 02 07:09:16 np0005541913.novalocal sshd[19023]: Connection reset by 45.140.17.97 port 32290
Dec 02 07:10:52 np0005541913.novalocal sshd[18974]: Received disconnect from 38.102.83.45 port 55756:11: disconnected by user
Dec 02 07:10:52 np0005541913.novalocal sshd[18974]: Disconnected from user zuul 38.102.83.45 port 55756
Dec 02 07:10:52 np0005541913.novalocal sshd[18971]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:10:52 np0005541913.novalocal systemd[1]: session-10.scope: Deactivated successfully.
Dec 02 07:10:52 np0005541913.novalocal systemd-logind[757]: Session 10 logged out. Waiting for processes to exit.
Dec 02 07:10:52 np0005541913.novalocal systemd-logind[757]: Removed session 10.
Dec 02 07:13:06 np0005541913.novalocal systemd[1]: Starting dnf makecache...
Dec 02 07:13:06 np0005541913.novalocal dnf[19027]: Updating Subscription Management repositories.
Dec 02 07:13:08 np0005541913.novalocal dnf[19027]: Failed determining last makecache time.
Dec 02 07:13:08 np0005541913.novalocal dnf[19027]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  36 kB/s | 4.5 kB     00:00
Dec 02 07:13:08 np0005541913.novalocal dnf[19027]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  34 kB/s | 4.5 kB     00:00
Dec 02 07:13:08 np0005541913.novalocal dnf[19027]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   30 kB/s | 4.1 kB     00:00
Dec 02 07:13:09 np0005541913.novalocal dnf[19027]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   31 kB/s | 4.1 kB     00:00
Dec 02 07:13:09 np0005541913.novalocal dnf[19027]: Metadata cache created.
Dec 02 07:13:09 np0005541913.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 02 07:13:09 np0005541913.novalocal systemd[1]: Finished dnf makecache.
Dec 02 07:13:09 np0005541913.novalocal systemd[1]: dnf-makecache.service: Consumed 2.628s CPU time.
Dec 02 07:15:59 np0005541913.novalocal sshd[19033]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:16:03 np0005541913.novalocal sshd[19033]: Invalid user admin from 103.210.21.20 port 35020
Dec 02 07:16:03 np0005541913.novalocal sshd[19033]: Connection closed by invalid user admin 103.210.21.20 port 35020 [preauth]
Dec 02 07:18:15 np0005541913.novalocal sshd[19038]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:18:15 np0005541913.novalocal sshd[19038]: Accepted publickey for zuul from 38.102.83.114 port 58660 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:18:15 np0005541913.novalocal systemd-logind[757]: New session 11 of user zuul.
Dec 02 07:18:15 np0005541913.novalocal systemd[1]: Started Session 11 of User zuul.
Dec 02 07:18:15 np0005541913.novalocal sshd[19038]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:18:16 np0005541913.novalocal python3[19055]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163e3b-3c83-8c0a-0232-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:18:17 np0005541913.novalocal sudo[19073]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dznnaumoqtwibolbamixysypuekqccfn ; /usr/bin/python3
Dec 02 07:18:17 np0005541913.novalocal sudo[19073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:18:17 np0005541913.novalocal python3[19075]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163e3b-3c83-8c0a-0232-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:18:19 np0005541913.novalocal sudo[19073]: pam_unix(sudo:session): session closed for user root
Dec 02 07:18:22 np0005541913.novalocal sudo[19093]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kumltrijmqvolkdsgimnwkuyqwqqrhkw ; /usr/bin/python3
Dec 02 07:18:22 np0005541913.novalocal sudo[19093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:18:22 np0005541913.novalocal python3[19095]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Dec 02 07:18:25 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:18:25 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:18:51 np0005541913.novalocal sudo[19093]: pam_unix(sudo:session): session closed for user root
Dec 02 07:19:21 np0005541913.novalocal sudo[19250]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymmfambeigjgfoilbukweugdrdwjwdbi ; /usr/bin/python3
Dec 02 07:19:21 np0005541913.novalocal sudo[19250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:19:21 np0005541913.novalocal python3[19252]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Dec 02 07:19:23 np0005541913.novalocal sshd[19255]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:19:24 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:19:26 np0005541913.novalocal sudo[19250]: pam_unix(sudo:session): session closed for user root
Dec 02 07:19:33 np0005541913.novalocal sudo[19392]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooyhcjllocqxlmvzatuwmsulnugjrhza ; /usr/bin/python3
Dec 02 07:19:33 np0005541913.novalocal sudo[19392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:19:33 np0005541913.novalocal python3[19394]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Dec 02 07:19:37 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:19:37 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:19:43 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:19:51 np0005541913.novalocal sudo[19392]: pam_unix(sudo:session): session closed for user root
Dec 02 07:20:05 np0005541913.novalocal sudo[19667]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dinljfpdvcdmqzyulkloyiktxeujgkio ; /usr/bin/python3
Dec 02 07:20:05 np0005541913.novalocal sudo[19667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:20:05 np0005541913.novalocal python3[19669]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 02 07:20:08 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:20:08 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:20:14 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:20:14 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:20:21 np0005541913.novalocal sudo[19667]: pam_unix(sudo:session): session closed for user root
Dec 02 07:20:25 np0005541913.novalocal sshd[19255]: Connection closed by 103.210.21.20 port 45102 [preauth]
Dec 02 07:20:40 np0005541913.novalocal sudo[20003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svmclndwdavqywktyxznsyhtnyulpccr ; /usr/bin/python3
Dec 02 07:20:40 np0005541913.novalocal sudo[20003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:20:40 np0005541913.novalocal python3[20005]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 02 07:20:44 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:20:44 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:20:49 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:20:50 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:20:57 np0005541913.novalocal sudo[20003]: pam_unix(sudo:session): session closed for user root
Dec 02 07:21:14 np0005541913.novalocal sudo[20400]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxnkrhnyginujwlcndpegiyoxrfwbfqc ; /usr/bin/python3
Dec 02 07:21:14 np0005541913.novalocal sudo[20400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:21:15 np0005541913.novalocal python3[20402]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-8c0a-0232-000000000013-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:21:17 np0005541913.novalocal sudo[20400]: pam_unix(sudo:session): session closed for user root
Dec 02 07:21:20 np0005541913.novalocal sudo[20419]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xiqczibnncgwwhehyiouoglaoxtnflys ; /usr/bin/python3
Dec 02 07:21:20 np0005541913.novalocal sudo[20419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:21:20 np0005541913.novalocal python3[20421]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:21:41 np0005541913.novalocal kernel: SELinux:  Converting 490 SID table entries...
Dec 02 07:21:41 np0005541913.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 07:21:41 np0005541913.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 07:21:41 np0005541913.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 07:21:41 np0005541913.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 07:21:41 np0005541913.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 07:21:41 np0005541913.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 07:21:41 np0005541913.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 07:21:42 np0005541913.novalocal groupadd[20528]: group added to /etc/group: name=unbound, GID=987
Dec 02 07:21:42 np0005541913.novalocal groupadd[20528]: group added to /etc/gshadow: name=unbound
Dec 02 07:21:42 np0005541913.novalocal groupadd[20528]: new group: name=unbound, GID=987
Dec 02 07:21:42 np0005541913.novalocal useradd[20535]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Dec 02 07:21:42 np0005541913.novalocal dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Dec 02 07:21:42 np0005541913.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 02 07:21:42 np0005541913.novalocal groupadd[20548]: group added to /etc/group: name=openvswitch, GID=986
Dec 02 07:21:42 np0005541913.novalocal groupadd[20548]: group added to /etc/gshadow: name=openvswitch
Dec 02 07:21:42 np0005541913.novalocal groupadd[20548]: new group: name=openvswitch, GID=986
Dec 02 07:21:42 np0005541913.novalocal useradd[20555]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Dec 02 07:21:42 np0005541913.novalocal groupadd[20563]: group added to /etc/group: name=hugetlbfs, GID=985
Dec 02 07:21:42 np0005541913.novalocal groupadd[20563]: group added to /etc/gshadow: name=hugetlbfs
Dec 02 07:21:42 np0005541913.novalocal groupadd[20563]: new group: name=hugetlbfs, GID=985
Dec 02 07:21:42 np0005541913.novalocal usermod[20571]: add 'openvswitch' to group 'hugetlbfs'
Dec 02 07:21:42 np0005541913.novalocal usermod[20571]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 02 07:21:45 np0005541913.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:21:45 np0005541913.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 02 07:21:45 np0005541913.novalocal systemd[1]: Reloading.
Dec 02 07:21:45 np0005541913.novalocal systemd-sysv-generator[21076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:21:45 np0005541913.novalocal systemd-rc-local-generator[21072]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:21:45 np0005541913.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:21:45 np0005541913.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 07:21:46 np0005541913.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 07:21:46 np0005541913.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 02 07:21:46 np0005541913.novalocal systemd[1]: run-ra43097f77f2f4ce88eea19a47a4833bf.service: Deactivated successfully.
Dec 02 07:21:46 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:21:46 np0005541913.novalocal sudo[20419]: pam_unix(sudo:session): session closed for user root
Dec 02 07:21:46 np0005541913.novalocal rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 07:22:13 np0005541913.novalocal sudo[21629]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beuqgcyxdykoyxxghlvcgonrpyhmydjb ; /usr/bin/python3
Dec 02 07:22:13 np0005541913.novalocal sudo[21629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:13 np0005541913.novalocal python3[21631]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-8c0a-0232-000000000015-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:22:27 np0005541913.novalocal sudo[21629]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:44 np0005541913.novalocal sudo[21649]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epmxznwrhorsszeoufxdcapnqdhzbuxk ; /usr/bin/python3
Dec 02 07:22:44 np0005541913.novalocal sudo[21649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:45 np0005541913.novalocal python3[21651]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:22:45 np0005541913.novalocal sudo[21649]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:45 np0005541913.novalocal sudo[21697]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afuosvjnrwvgxmpyvdgkqsgurjhcufhu ; /usr/bin/python3
Dec 02 07:22:45 np0005541913.novalocal sudo[21697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:46 np0005541913.novalocal python3[21699]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:22:46 np0005541913.novalocal sudo[21697]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:46 np0005541913.novalocal sudo[21740]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-woohzlcongrushqerrxyoiqglrqeiylv ; /usr/bin/python3
Dec 02 07:22:46 np0005541913.novalocal sudo[21740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:46 np0005541913.novalocal python3[21742]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764660165.6364863-293-247612159958287/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=9333f42ac4b9baf349a5c32f7bcba3335b5912e0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:22:46 np0005541913.novalocal sudo[21740]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:47 np0005541913.novalocal sudo[21770]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugkoluheqvnicojldiuegkvmachqdjde ; /usr/bin/python3
Dec 02 07:22:47 np0005541913.novalocal sudo[21770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:48 np0005541913.novalocal python3[21772]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 02 07:22:48 np0005541913.novalocal sudo[21770]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:48 np0005541913.novalocal systemd-journald[619]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Dec 02 07:22:48 np0005541913.novalocal systemd-journald[619]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 07:22:48 np0005541913.novalocal rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 07:22:48 np0005541913.novalocal rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 07:22:48 np0005541913.novalocal sudo[21791]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zffvwjtjztjaiabjrnlnldondadbtidu ; /usr/bin/python3
Dec 02 07:22:48 np0005541913.novalocal sudo[21791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:48 np0005541913.novalocal python3[21793]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 02 07:22:48 np0005541913.novalocal sudo[21791]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:48 np0005541913.novalocal sudo[21811]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgoubuheymmgfqpckniyjlbzaadqtihg ; /usr/bin/python3
Dec 02 07:22:48 np0005541913.novalocal sudo[21811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:48 np0005541913.novalocal python3[21813]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 02 07:22:48 np0005541913.novalocal sudo[21811]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:48 np0005541913.novalocal sudo[21831]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lapjwupdfuvehjvglqshhdjwtgrgohcs ; /usr/bin/python3
Dec 02 07:22:48 np0005541913.novalocal sudo[21831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:48 np0005541913.novalocal python3[21833]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 02 07:22:48 np0005541913.novalocal sudo[21831]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:49 np0005541913.novalocal sudo[21851]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlvkcobhmxwylwamtobvtekxqcjxxgrw ; /usr/bin/python3
Dec 02 07:22:49 np0005541913.novalocal sudo[21851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:49 np0005541913.novalocal python3[21853]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 02 07:22:49 np0005541913.novalocal sudo[21851]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:51 np0005541913.novalocal sudo[21871]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfqahtvxsgbmmcaufftxunzapcckbaxx ; /usr/bin/python3
Dec 02 07:22:51 np0005541913.novalocal sudo[21871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:51 np0005541913.novalocal python3[21873]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 07:22:51 np0005541913.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Dec 02 07:22:51 np0005541913.novalocal network[21876]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 07:22:51 np0005541913.novalocal network[21887]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 07:22:51 np0005541913.novalocal network[21876]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Dec 02 07:22:51 np0005541913.novalocal network[21888]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:22:51 np0005541913.novalocal network[21876]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 07:22:51 np0005541913.novalocal network[21889]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 07:22:51 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660171.5884] audit: op="connections-reload" pid=21917 uid=0 result="success"
Dec 02 07:22:51 np0005541913.novalocal network[21876]: Bringing up loopback interface:  [  OK  ]
Dec 02 07:22:51 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660171.7875] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22005 uid=0 result="success"
Dec 02 07:22:51 np0005541913.novalocal network[21876]: Bringing up interface eth0:  [  OK  ]
Dec 02 07:22:51 np0005541913.novalocal systemd[1]: Started LSB: Bring up/down networking.
Dec 02 07:22:51 np0005541913.novalocal sudo[21871]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:51 np0005541913.novalocal sudo[22044]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eejcndugbjchfsfqvwoebnirqhcsrvsy ; /usr/bin/python3
Dec 02 07:22:51 np0005541913.novalocal sudo[22044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:52 np0005541913.novalocal python3[22046]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 07:22:52 np0005541913.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Dec 02 07:22:52 np0005541913.novalocal chown[22050]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 02 07:22:52 np0005541913.novalocal ovs-ctl[22055]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 02 07:22:52 np0005541913.novalocal ovs-ctl[22055]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 02 07:22:52 np0005541913.novalocal ovs-ctl[22055]: Starting ovsdb-server [  OK  ]
Dec 02 07:22:52 np0005541913.novalocal ovs-vsctl[22105]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 02 07:22:52 np0005541913.novalocal ovs-vsctl[22125]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Dec 02 07:22:52 np0005541913.novalocal ovs-ctl[22055]: Configuring Open vSwitch system IDs [  OK  ]
Dec 02 07:22:52 np0005541913.novalocal ovs-ctl[22055]: Enabling remote OVSDB managers [  OK  ]
Dec 02 07:22:52 np0005541913.novalocal systemd[1]: Started Open vSwitch Database Unit.
Dec 02 07:22:52 np0005541913.novalocal ovs-vsctl[22131]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005541913.novalocal
Dec 02 07:22:52 np0005541913.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 02 07:22:52 np0005541913.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 02 07:22:52 np0005541913.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 02 07:22:52 np0005541913.novalocal kernel: openvswitch: Open vSwitch switching datapath
Dec 02 07:22:52 np0005541913.novalocal ovs-ctl[22175]: Inserting openvswitch module [  OK  ]
Dec 02 07:22:52 np0005541913.novalocal ovs-ctl[22144]: Starting ovs-vswitchd [  OK  ]
Dec 02 07:22:52 np0005541913.novalocal ovs-vsctl[22194]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005541913.novalocal
Dec 02 07:22:52 np0005541913.novalocal ovs-ctl[22144]: Enabling remote OVSDB managers [  OK  ]
Dec 02 07:22:52 np0005541913.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 02 07:22:52 np0005541913.novalocal systemd[1]: Starting Open vSwitch...
Dec 02 07:22:52 np0005541913.novalocal systemd[1]: Finished Open vSwitch.
Dec 02 07:22:52 np0005541913.novalocal sudo[22044]: pam_unix(sudo:session): session closed for user root
Dec 02 07:22:55 np0005541913.novalocal sudo[22210]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwqofnoedbbtsltreadeihvqmlmqtigc ; /usr/bin/python3
Dec 02 07:22:55 np0005541913.novalocal sudo[22210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:22:55 np0005541913.novalocal python3[22212]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-8c0a-0232-00000000001a-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:22:56 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660176.4345] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22370 uid=0 result="success"
Dec 02 07:22:56 np0005541913.novalocal ifup[22371]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:22:56 np0005541913.novalocal ifup[22372]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:22:56 np0005541913.novalocal ifup[22373]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:22:56 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660176.4692] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22379 uid=0 result="success"
Dec 02 07:22:56 np0005541913.novalocal ovs-vsctl[22381]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:48:4f:22 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Dec 02 07:22:56 np0005541913.novalocal kernel: device ovs-system entered promiscuous mode
Dec 02 07:22:56 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660176.4975] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Dec 02 07:22:56 np0005541913.novalocal kernel: Timeout policy base is empty
Dec 02 07:22:56 np0005541913.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Dec 02 07:22:56 np0005541913.novalocal systemd-udevd[22383]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 07:22:56 np0005541913.novalocal kernel: device br-ex entered promiscuous mode
Dec 02 07:22:56 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660176.5298] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Dec 02 07:22:56 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660176.5511] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22409 uid=0 result="success"
Dec 02 07:22:56 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660176.5684] device (br-ex): carrier: link connected
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.6243] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22438 uid=0 result="success"
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.6756] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22453 uid=0 result="success"
Dec 02 07:22:59 np0005541913.novalocal NET[22478]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.7586] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.7727] dhcp4 (eth1): canceled DHCP transaction
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.7728] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.7728] dhcp4 (eth1): state changed no lease
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.7789] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22487 uid=0 result="success"
Dec 02 07:22:59 np0005541913.novalocal ifup[22488]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:22:59 np0005541913.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 07:22:59 np0005541913.novalocal ifup[22490]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:22:59 np0005541913.novalocal ifup[22491]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:22:59 np0005541913.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.8150] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22504 uid=0 result="success"
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.8571] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22515 uid=0 result="success"
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.8639] device (eth1): carrier: link connected
Dec 02 07:22:59 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660179.8859] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22524 uid=0 result="success"
Dec 02 07:22:59 np0005541913.novalocal ipv6_wait_tentative[22536]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Dec 02 07:23:00 np0005541913.novalocal ipv6_wait_tentative[22541]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Dec 02 07:23:01 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660181.9505] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22550 uid=0 result="success"
Dec 02 07:23:01 np0005541913.novalocal ovs-vsctl[22565]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Dec 02 07:23:01 np0005541913.novalocal kernel: device eth1 entered promiscuous mode
Dec 02 07:23:02 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660182.0127] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22573 uid=0 result="success"
Dec 02 07:23:02 np0005541913.novalocal ifup[22574]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:02 np0005541913.novalocal ifup[22575]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:02 np0005541913.novalocal ifup[22576]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:02 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660182.0403] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22582 uid=0 result="success"
Dec 02 07:23:02 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660182.0732] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22592 uid=0 result="success"
Dec 02 07:23:02 np0005541913.novalocal ifup[22593]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:02 np0005541913.novalocal ifup[22594]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:02 np0005541913.novalocal ifup[22595]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:02 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660182.0975] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22601 uid=0 result="success"
Dec 02 07:23:02 np0005541913.novalocal ovs-vsctl[22604]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Dec 02 07:23:02 np0005541913.novalocal kernel: device vlan23 entered promiscuous mode
Dec 02 07:23:02 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660182.1294] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Dec 02 07:23:02 np0005541913.novalocal systemd-udevd[22606]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 07:23:02 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660182.1548] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22615 uid=0 result="success"
Dec 02 07:23:02 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660182.1724] device (vlan23): carrier: link connected
Dec 02 07:23:05 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660185.2272] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22644 uid=0 result="success"
Dec 02 07:23:05 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660185.2765] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22659 uid=0 result="success"
Dec 02 07:23:05 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660185.3325] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22680 uid=0 result="success"
Dec 02 07:23:05 np0005541913.novalocal ifup[22681]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:05 np0005541913.novalocal ifup[22682]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:05 np0005541913.novalocal ifup[22683]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:05 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660185.3645] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22689 uid=0 result="success"
Dec 02 07:23:05 np0005541913.novalocal ovs-vsctl[22692]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Dec 02 07:23:05 np0005541913.novalocal kernel: device vlan20 entered promiscuous mode
Dec 02 07:23:05 np0005541913.novalocal systemd-udevd[22694]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 07:23:05 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660185.3984] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Dec 02 07:23:05 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660185.4223] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22704 uid=0 result="success"
Dec 02 07:23:05 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660185.4405] device (vlan20): carrier: link connected
Dec 02 07:23:08 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660188.4943] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22734 uid=0 result="success"
Dec 02 07:23:08 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660188.5432] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22749 uid=0 result="success"
Dec 02 07:23:08 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660188.6012] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22770 uid=0 result="success"
Dec 02 07:23:08 np0005541913.novalocal ifup[22771]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:08 np0005541913.novalocal ifup[22772]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:08 np0005541913.novalocal ifup[22773]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:08 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660188.6294] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22779 uid=0 result="success"
Dec 02 07:23:08 np0005541913.novalocal ovs-vsctl[22782]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Dec 02 07:23:08 np0005541913.novalocal kernel: device vlan21 entered promiscuous mode
Dec 02 07:23:08 np0005541913.novalocal systemd-udevd[22784]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 07:23:08 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660188.6777] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Dec 02 07:23:08 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660188.7034] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22794 uid=0 result="success"
Dec 02 07:23:08 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660188.7216] device (vlan21): carrier: link connected
Dec 02 07:23:09 np0005541913.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 07:23:11 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660191.7755] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22824 uid=0 result="success"
Dec 02 07:23:11 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660191.8217] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22839 uid=0 result="success"
Dec 02 07:23:11 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660191.8835] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22860 uid=0 result="success"
Dec 02 07:23:11 np0005541913.novalocal ifup[22861]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:11 np0005541913.novalocal ifup[22862]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:11 np0005541913.novalocal ifup[22863]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:11 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660191.9141] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22869 uid=0 result="success"
Dec 02 07:23:11 np0005541913.novalocal ovs-vsctl[22872]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 02 07:23:11 np0005541913.novalocal systemd-udevd[22874]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 07:23:11 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660191.9554] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Dec 02 07:23:11 np0005541913.novalocal kernel: device vlan22 entered promiscuous mode
Dec 02 07:23:11 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660191.9841] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22884 uid=0 result="success"
Dec 02 07:23:12 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660192.0051] device (vlan22): carrier: link connected
Dec 02 07:23:15 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660195.0566] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22914 uid=0 result="success"
Dec 02 07:23:15 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660195.1104] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22929 uid=0 result="success"
Dec 02 07:23:15 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660195.1673] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22950 uid=0 result="success"
Dec 02 07:23:15 np0005541913.novalocal ifup[22951]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:15 np0005541913.novalocal ifup[22952]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:15 np0005541913.novalocal ifup[22953]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:15 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660195.1989] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22959 uid=0 result="success"
Dec 02 07:23:15 np0005541913.novalocal ovs-vsctl[22962]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Dec 02 07:23:15 np0005541913.novalocal systemd-udevd[22964]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 07:23:15 np0005541913.novalocal kernel: device vlan44 entered promiscuous mode
Dec 02 07:23:15 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660195.2417] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Dec 02 07:23:15 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660195.2678] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22974 uid=0 result="success"
Dec 02 07:23:15 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660195.2880] device (vlan44): carrier: link connected
Dec 02 07:23:18 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660198.3466] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23004 uid=0 result="success"
Dec 02 07:23:18 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660198.3977] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23019 uid=0 result="success"
Dec 02 07:23:18 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660198.4525] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23040 uid=0 result="success"
Dec 02 07:23:18 np0005541913.novalocal ifup[23041]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:18 np0005541913.novalocal ifup[23042]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:18 np0005541913.novalocal ifup[23043]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:18 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660198.4810] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23049 uid=0 result="success"
Dec 02 07:23:18 np0005541913.novalocal ovs-vsctl[23052]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Dec 02 07:23:18 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660198.5763] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23059 uid=0 result="success"
Dec 02 07:23:19 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660199.6430] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23086 uid=0 result="success"
Dec 02 07:23:19 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660199.7000] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23101 uid=0 result="success"
Dec 02 07:23:19 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660199.7622] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23122 uid=0 result="success"
Dec 02 07:23:19 np0005541913.novalocal ifup[23123]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:19 np0005541913.novalocal ifup[23124]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:19 np0005541913.novalocal ifup[23125]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:19 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660199.7994] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23131 uid=0 result="success"
Dec 02 07:23:19 np0005541913.novalocal ovs-vsctl[23134]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Dec 02 07:23:19 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660199.9017] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23141 uid=0 result="success"
Dec 02 07:23:20 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660200.9618] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23169 uid=0 result="success"
Dec 02 07:23:21 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660201.0075] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23184 uid=0 result="success"
Dec 02 07:23:21 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660201.0667] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23205 uid=0 result="success"
Dec 02 07:23:21 np0005541913.novalocal ifup[23206]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:21 np0005541913.novalocal ifup[23207]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:21 np0005541913.novalocal ifup[23208]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:21 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660201.0992] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23214 uid=0 result="success"
Dec 02 07:23:21 np0005541913.novalocal ovs-vsctl[23217]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Dec 02 07:23:21 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660201.2057] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23224 uid=0 result="success"
Dec 02 07:23:22 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660202.2669] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23252 uid=0 result="success"
Dec 02 07:23:22 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660202.3073] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23267 uid=0 result="success"
Dec 02 07:23:22 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660202.3634] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23288 uid=0 result="success"
Dec 02 07:23:22 np0005541913.novalocal ifup[23289]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:22 np0005541913.novalocal ifup[23290]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:22 np0005541913.novalocal ifup[23291]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:22 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660202.3928] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23297 uid=0 result="success"
Dec 02 07:23:22 np0005541913.novalocal ovs-vsctl[23300]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Dec 02 07:23:22 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660202.4928] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23307 uid=0 result="success"
Dec 02 07:23:23 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660203.5532] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23335 uid=0 result="success"
Dec 02 07:23:23 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660203.5969] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23350 uid=0 result="success"
Dec 02 07:23:23 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660203.6414] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23371 uid=0 result="success"
Dec 02 07:23:23 np0005541913.novalocal ifup[23372]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 02 07:23:23 np0005541913.novalocal ifup[23373]: 'network-scripts' will be removed from distribution in near future.
Dec 02 07:23:23 np0005541913.novalocal ifup[23374]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 02 07:23:23 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660203.6746] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23380 uid=0 result="success"
Dec 02 07:23:23 np0005541913.novalocal ovs-vsctl[23383]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 02 07:23:23 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660203.7247] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23390 uid=0 result="success"
Dec 02 07:23:24 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660204.7820] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23418 uid=0 result="success"
Dec 02 07:23:24 np0005541913.novalocal NetworkManager[5965]: <info>  [1764660204.8318] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23433 uid=0 result="success"
Dec 02 07:23:24 np0005541913.novalocal sudo[22210]: pam_unix(sudo:session): session closed for user root
Dec 02 07:24:17 np0005541913.novalocal python3[23465]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-8c0a-0232-00000000001b-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:24:23 np0005541913.novalocal python3[23484]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCfcGXFPS+XIPHLw+7WTk1crQnJj1F7l/bATNqEM8HqdPREfaSIeF883HXh8Bv+rj9cjcgSPu+200+1SEsq35V+19mPwwkoxgdhfQu8jGk7vv17tL7k61zl9rWne61hn/7PnFptl+SBaMvOq/9ZdnPuMzb1YBTWbKm6kC3RPkgDUOa/BER5PJh1E6x6wYj1wRGMwVREczSSv+66aA5tTRelsFh16OXZXpq4ddoi7OeuimE3lWuMAHorxzJwF5AN+gPTgKYRkMwbMMHU4nPx7TXt5G3zjqWhmos08Xgdl+lPNHY5i463T96l4hGiycZKO4FOCq0ZMzldYkovXnyZi1CjSYUDcEn+EHIRJyZaK9ZJlJ1no5HVdwv1rwVMw4KkpZvH7HBh/iX47Wsi4qxK+L3X5hwZ7s6iSpNWeEMT5CLZsiDCkrdideFnZ8kW2jgnNIV0h+pUPISFfl1j03bjS9fHJjgl4BndVBxRJZJQf8Szyjx5WcIyBUidtYPnHzSLbmk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 07:24:23 np0005541913.novalocal sudo[23498]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkiiklmhygqumocfgyhhnrtxkamwlosz ; /usr/bin/python3
Dec 02 07:24:23 np0005541913.novalocal sudo[23498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:24:23 np0005541913.novalocal python3[23500]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCfcGXFPS+XIPHLw+7WTk1crQnJj1F7l/bATNqEM8HqdPREfaSIeF883HXh8Bv+rj9cjcgSPu+200+1SEsq35V+19mPwwkoxgdhfQu8jGk7vv17tL7k61zl9rWne61hn/7PnFptl+SBaMvOq/9ZdnPuMzb1YBTWbKm6kC3RPkgDUOa/BER5PJh1E6x6wYj1wRGMwVREczSSv+66aA5tTRelsFh16OXZXpq4ddoi7OeuimE3lWuMAHorxzJwF5AN+gPTgKYRkMwbMMHU4nPx7TXt5G3zjqWhmos08Xgdl+lPNHY5i463T96l4hGiycZKO4FOCq0ZMzldYkovXnyZi1CjSYUDcEn+EHIRJyZaK9ZJlJ1no5HVdwv1rwVMw4KkpZvH7HBh/iX47Wsi4qxK+L3X5hwZ7s6iSpNWeEMT5CLZsiDCkrdideFnZ8kW2jgnNIV0h+pUPISFfl1j03bjS9fHJjgl4BndVBxRJZJQf8Szyjx5WcIyBUidtYPnHzSLbmk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 07:24:23 np0005541913.novalocal sudo[23498]: pam_unix(sudo:session): session closed for user root
Dec 02 07:24:25 np0005541913.novalocal python3[23514]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCfcGXFPS+XIPHLw+7WTk1crQnJj1F7l/bATNqEM8HqdPREfaSIeF883HXh8Bv+rj9cjcgSPu+200+1SEsq35V+19mPwwkoxgdhfQu8jGk7vv17tL7k61zl9rWne61hn/7PnFptl+SBaMvOq/9ZdnPuMzb1YBTWbKm6kC3RPkgDUOa/BER5PJh1E6x6wYj1wRGMwVREczSSv+66aA5tTRelsFh16OXZXpq4ddoi7OeuimE3lWuMAHorxzJwF5AN+gPTgKYRkMwbMMHU4nPx7TXt5G3zjqWhmos08Xgdl+lPNHY5i463T96l4hGiycZKO4FOCq0ZMzldYkovXnyZi1CjSYUDcEn+EHIRJyZaK9ZJlJ1no5HVdwv1rwVMw4KkpZvH7HBh/iX47Wsi4qxK+L3X5hwZ7s6iSpNWeEMT5CLZsiDCkrdideFnZ8kW2jgnNIV0h+pUPISFfl1j03bjS9fHJjgl4BndVBxRJZJQf8Szyjx5WcIyBUidtYPnHzSLbmk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 07:24:25 np0005541913.novalocal sudo[23528]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxdocseuvwfgbedhvjirtipbhnqwnkkp ; /usr/bin/python3
Dec 02 07:24:25 np0005541913.novalocal sudo[23528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:24:25 np0005541913.novalocal python3[23530]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCfcGXFPS+XIPHLw+7WTk1crQnJj1F7l/bATNqEM8HqdPREfaSIeF883HXh8Bv+rj9cjcgSPu+200+1SEsq35V+19mPwwkoxgdhfQu8jGk7vv17tL7k61zl9rWne61hn/7PnFptl+SBaMvOq/9ZdnPuMzb1YBTWbKm6kC3RPkgDUOa/BER5PJh1E6x6wYj1wRGMwVREczSSv+66aA5tTRelsFh16OXZXpq4ddoi7OeuimE3lWuMAHorxzJwF5AN+gPTgKYRkMwbMMHU4nPx7TXt5G3zjqWhmos08Xgdl+lPNHY5i463T96l4hGiycZKO4FOCq0ZMzldYkovXnyZi1CjSYUDcEn+EHIRJyZaK9ZJlJ1no5HVdwv1rwVMw4KkpZvH7HBh/iX47Wsi4qxK+L3X5hwZ7s6iSpNWeEMT5CLZsiDCkrdideFnZ8kW2jgnNIV0h+pUPISFfl1j03bjS9fHJjgl4BndVBxRJZJQf8Szyjx5WcIyBUidtYPnHzSLbmk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 07:24:25 np0005541913.novalocal sudo[23528]: pam_unix(sudo:session): session closed for user root
Dec 02 07:24:26 np0005541913.novalocal python3[23544]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Dec 02 07:24:27 np0005541913.novalocal python3[23559]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005541913.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-8c0a-0232-000000000022-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:24:28 np0005541913.novalocal sudo[23577]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bthrmcpawvnszxpnkeghtphkvkjcfaeb ; /usr/bin/python3
Dec 02 07:24:28 np0005541913.novalocal sudo[23577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:24:28 np0005541913.novalocal python3[23579]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-8c0a-0232-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:24:28 np0005541913.novalocal systemd[1]: Starting Hostname Service...
Dec 02 07:24:28 np0005541913.novalocal systemd[1]: Started Hostname Service.
Dec 02 07:24:28 np0005541913.localdomain systemd-hostnamed[23583]: Hostname set to <np0005541913.localdomain> (static)
Dec 02 07:24:28 np0005541913.localdomain NetworkManager[5965]: <info>  [1764660268.5174] hostname: static hostname changed from "np0005541913.novalocal" to "np0005541913.localdomain"
Dec 02 07:24:28 np0005541913.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 07:24:28 np0005541913.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 07:24:28 np0005541913.localdomain sudo[23577]: pam_unix(sudo:session): session closed for user root
Dec 02 07:24:29 np0005541913.localdomain sshd[19038]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:24:29 np0005541913.localdomain systemd-logind[757]: Session 11 logged out. Waiting for processes to exit.
Dec 02 07:24:29 np0005541913.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Dec 02 07:24:29 np0005541913.localdomain systemd[1]: session-11.scope: Consumed 1min 45.112s CPU time.
Dec 02 07:24:29 np0005541913.localdomain systemd-logind[757]: Removed session 11.
Dec 02 07:24:32 np0005541913.localdomain sshd[23594]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:24:32 np0005541913.localdomain sshd[23594]: Accepted publickey for zuul from 38.102.83.114 port 53716 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:24:32 np0005541913.localdomain systemd-logind[757]: New session 12 of user zuul.
Dec 02 07:24:32 np0005541913.localdomain systemd[1]: Started Session 12 of User zuul.
Dec 02 07:24:32 np0005541913.localdomain sshd[23594]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:24:33 np0005541913.localdomain python3[23611]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Dec 02 07:24:34 np0005541913.localdomain sshd[23594]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:24:34 np0005541913.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Dec 02 07:24:34 np0005541913.localdomain systemd-logind[757]: Session 12 logged out. Waiting for processes to exit.
Dec 02 07:24:34 np0005541913.localdomain systemd-logind[757]: Removed session 12.
Dec 02 07:24:38 np0005541913.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 07:24:58 np0005541913.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 07:25:21 np0005541913.localdomain sshd[23617]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:25:21 np0005541913.localdomain sshd[23617]: Accepted publickey for zuul from 38.102.83.114 port 58832 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:25:21 np0005541913.localdomain systemd-logind[757]: New session 13 of user zuul.
Dec 02 07:25:21 np0005541913.localdomain systemd[1]: Started Session 13 of User zuul.
Dec 02 07:25:21 np0005541913.localdomain sshd[23617]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:25:21 np0005541913.localdomain sudo[23634]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plydzuceefprgokuicxvkfdfgjqoqepw ; /usr/bin/python3
Dec 02 07:25:21 np0005541913.localdomain sudo[23634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:25:22 np0005541913.localdomain python3[23636]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:25:25 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:25:25 np0005541913.localdomain systemd-sysv-generator[23679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:25:25 np0005541913.localdomain systemd-rc-local-generator[23674]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:25:26 np0005541913.localdomain systemd-rc-local-generator[23717]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:25:26 np0005541913.localdomain systemd-sysv-generator[23723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:25:26 np0005541913.localdomain systemd-rc-local-generator[23758]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:25:26 np0005541913.localdomain systemd-sysv-generator[23763]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:25:26 np0005541913.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:25:27 np0005541913.localdomain systemd-rc-local-generator[23817]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:25:27 np0005541913.localdomain systemd-sysv-generator[23820]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: run-rdd2e5b96d89644679b2a565c6126d080.service: Deactivated successfully.
Dec 02 07:25:27 np0005541913.localdomain systemd[1]: run-r8216724a560f408db8309d093c592012.service: Deactivated successfully.
Dec 02 07:25:28 np0005541913.localdomain sudo[23634]: pam_unix(sudo:session): session closed for user root
Dec 02 07:26:28 np0005541913.localdomain sshd[23620]: Received disconnect from 38.102.83.114 port 58832:11: disconnected by user
Dec 02 07:26:28 np0005541913.localdomain sshd[23620]: Disconnected from user zuul 38.102.83.114 port 58832
Dec 02 07:26:28 np0005541913.localdomain sshd[23617]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:26:28 np0005541913.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Dec 02 07:26:28 np0005541913.localdomain systemd[1]: session-13.scope: Consumed 4.552s CPU time.
Dec 02 07:26:28 np0005541913.localdomain systemd-logind[757]: Session 13 logged out. Waiting for processes to exit.
Dec 02 07:26:28 np0005541913.localdomain systemd-logind[757]: Removed session 13.
Dec 02 07:42:20 np0005541913.localdomain sshd[24413]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:42:20 np0005541913.localdomain sshd[24413]: Accepted publickey for zuul from 192.168.122.100 port 60882 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:42:20 np0005541913.localdomain systemd-logind[757]: New session 14 of user zuul.
Dec 02 07:42:20 np0005541913.localdomain systemd[1]: Started Session 14 of User zuul.
Dec 02 07:42:20 np0005541913.localdomain sshd[24413]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:42:20 np0005541913.localdomain sudo[24459]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgrokstbxyxjsftjwpodrakadpmxilgv ; /usr/bin/python3
Dec 02 07:42:20 np0005541913.localdomain sudo[24459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:21 np0005541913.localdomain python3[24461]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 07:42:21 np0005541913.localdomain sudo[24459]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:22 np0005541913.localdomain sudo[24546]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czzxogtahsjfmfudbtcmhepdwlweuavd ; /usr/bin/python3
Dec 02 07:42:22 np0005541913.localdomain sudo[24546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:23 np0005541913.localdomain python3[24548]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:42:25 np0005541913.localdomain sudo[24546]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:26 np0005541913.localdomain sudo[24563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjfgvxfvofoplhjcmodkultwphzgictk ; /usr/bin/python3
Dec 02 07:42:26 np0005541913.localdomain sudo[24563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:26 np0005541913.localdomain python3[24565]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:42:26 np0005541913.localdomain sudo[24563]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:26 np0005541913.localdomain sudo[24579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdupakqbfjzfmitshzqecgyeviduqsbq ; /usr/bin/python3
Dec 02 07:42:26 np0005541913.localdomain sudo[24579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:26 np0005541913.localdomain python3[24581]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:42:26 np0005541913.localdomain kernel: loop: module loaded
Dec 02 07:42:26 np0005541913.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Dec 02 07:42:26 np0005541913.localdomain sudo[24579]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:27 np0005541913.localdomain sudo[24604]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyvmqdemptrjjqqjsqklegqwgvtidkcw ; /usr/bin/python3
Dec 02 07:42:27 np0005541913.localdomain sudo[24604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:27 np0005541913.localdomain python3[24606]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:42:27 np0005541913.localdomain lvm[24609]: PV /dev/loop3 not used.
Dec 02 07:42:27 np0005541913.localdomain lvm[24611]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 02 07:42:27 np0005541913.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec 02 07:42:27 np0005541913.localdomain lvm[24620]:   1 logical volume(s) in volume group "ceph_vg0" now active
Dec 02 07:42:27 np0005541913.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec 02 07:42:27 np0005541913.localdomain sudo[24604]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:28 np0005541913.localdomain sudo[24666]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njwrridoretuwiozbuqxkrfcgnfdisdj ; /usr/bin/python3
Dec 02 07:42:28 np0005541913.localdomain sudo[24666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:28 np0005541913.localdomain python3[24668]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:42:28 np0005541913.localdomain sudo[24666]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:28 np0005541913.localdomain sudo[24709]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkdantcldalienchzrwjshkzjchirkcb ; /usr/bin/python3
Dec 02 07:42:28 np0005541913.localdomain sudo[24709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:28 np0005541913.localdomain python3[24711]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764661347.997428-53935-71278920694817/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:42:28 np0005541913.localdomain sudo[24709]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:29 np0005541913.localdomain sudo[24739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zoepjfkajwhzuxdnoquycytdmborwnkn ; /usr/bin/python3
Dec 02 07:42:29 np0005541913.localdomain sudo[24739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:29 np0005541913.localdomain python3[24741]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:42:29 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:42:29 np0005541913.localdomain systemd-rc-local-generator[24767]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:42:29 np0005541913.localdomain systemd-sysv-generator[24771]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:42:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:42:29 np0005541913.localdomain systemd[1]: Starting Ceph OSD losetup...
Dec 02 07:42:29 np0005541913.localdomain bash[24784]: /dev/loop3: [64516]:8402014 (/var/lib/ceph-osd-0.img)
Dec 02 07:42:30 np0005541913.localdomain systemd[1]: Finished Ceph OSD losetup.
Dec 02 07:42:30 np0005541913.localdomain lvm[24785]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 02 07:42:30 np0005541913.localdomain lvm[24785]: VG ceph_vg0 finished
Dec 02 07:42:30 np0005541913.localdomain sudo[24739]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:31 np0005541913.localdomain sudo[24800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjuxmhvlnnwwoblfxieulegcgixnjjfe ; /usr/bin/python3
Dec 02 07:42:31 np0005541913.localdomain sudo[24800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:31 np0005541913.localdomain python3[24802]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:42:34 np0005541913.localdomain sudo[24800]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:34 np0005541913.localdomain sudo[24817]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbzfxzyrpzjmsrxekjyforcibysqztpd ; /usr/bin/python3
Dec 02 07:42:34 np0005541913.localdomain sudo[24817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:34 np0005541913.localdomain python3[24819]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:42:34 np0005541913.localdomain sudo[24817]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:35 np0005541913.localdomain sudo[24833]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psgehrdybdjklxgrjgszkwmlgofijepx ; /usr/bin/python3
Dec 02 07:42:35 np0005541913.localdomain sudo[24833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:35 np0005541913.localdomain python3[24835]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:42:35 np0005541913.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Dec 02 07:42:35 np0005541913.localdomain sudo[24833]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:35 np0005541913.localdomain sudo[24855]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asuomwswjfshdrfctwarhreibulwyguv ; /usr/bin/python3
Dec 02 07:42:35 np0005541913.localdomain sudo[24855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:35 np0005541913.localdomain python3[24857]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:42:36 np0005541913.localdomain lvm[24860]: PV /dev/loop4 not used.
Dec 02 07:42:36 np0005541913.localdomain lvm[24870]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 02 07:42:36 np0005541913.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Dec 02 07:42:36 np0005541913.localdomain sudo[24855]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:36 np0005541913.localdomain lvm[24872]:   1 logical volume(s) in volume group "ceph_vg1" now active
Dec 02 07:42:36 np0005541913.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Dec 02 07:42:36 np0005541913.localdomain sudo[24918]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epetelnysmjzpxzwyiohpznqafuqrlct ; /usr/bin/python3
Dec 02 07:42:36 np0005541913.localdomain sudo[24918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:36 np0005541913.localdomain python3[24920]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:42:36 np0005541913.localdomain sudo[24918]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:37 np0005541913.localdomain sudo[24961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbbkjqynbmtpinglkqkswmlsbxyweodi ; /usr/bin/python3
Dec 02 07:42:37 np0005541913.localdomain sudo[24961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:37 np0005541913.localdomain python3[24963]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764661356.5509481-54105-167353494703947/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:42:37 np0005541913.localdomain sudo[24961]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:37 np0005541913.localdomain sudo[24991]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugeutfmgxrdnlvnaqfhlmojuiublnluu ; /usr/bin/python3
Dec 02 07:42:37 np0005541913.localdomain sudo[24991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:37 np0005541913.localdomain python3[24993]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:42:37 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:42:37 np0005541913.localdomain systemd-rc-local-generator[25020]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:42:37 np0005541913.localdomain systemd-sysv-generator[25026]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:42:37 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:42:38 np0005541913.localdomain systemd[1]: Starting Ceph OSD losetup...
Dec 02 07:42:38 np0005541913.localdomain bash[25034]: /dev/loop4: [64516]:8402047 (/var/lib/ceph-osd-1.img)
Dec 02 07:42:38 np0005541913.localdomain systemd[1]: Finished Ceph OSD losetup.
Dec 02 07:42:38 np0005541913.localdomain lvm[25035]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 02 07:42:38 np0005541913.localdomain lvm[25035]: VG ceph_vg1 finished
Dec 02 07:42:38 np0005541913.localdomain sudo[24991]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:46 np0005541913.localdomain sudo[25078]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwxaypfjrvwmddxduecqgrpzjyliwazh ; /usr/bin/python3
Dec 02 07:42:46 np0005541913.localdomain sudo[25078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:46 np0005541913.localdomain python3[25080]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 07:42:46 np0005541913.localdomain sudo[25078]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:47 np0005541913.localdomain sudo[25098]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azngqtlcfkvuwddkdewzdikuheylhzol ; /usr/bin/python3
Dec 02 07:42:47 np0005541913.localdomain sudo[25098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:47 np0005541913.localdomain python3[25100]: ansible-hostname Invoked with name=np0005541913.localdomain use=None
Dec 02 07:42:48 np0005541913.localdomain systemd[1]: Starting Hostname Service...
Dec 02 07:42:48 np0005541913.localdomain systemd[1]: Started Hostname Service.
Dec 02 07:42:48 np0005541913.localdomain sudo[25098]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:55 np0005541913.localdomain sudo[25121]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgxytshdogoikolgkxuwwytaxxbwplcm ; /usr/bin/python3
Dec 02 07:42:55 np0005541913.localdomain sudo[25121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:55 np0005541913.localdomain python3[25123]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Dec 02 07:42:55 np0005541913.localdomain sudo[25121]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:55 np0005541913.localdomain sudo[25169]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usfcokklfwmegtxqvqztzhpazsgnwvxj ; /usr/bin/python3
Dec 02 07:42:55 np0005541913.localdomain sudo[25169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:56 np0005541913.localdomain python3[25171]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.twl1h7y6tmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:42:56 np0005541913.localdomain sudo[25169]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:56 np0005541913.localdomain sudo[25199]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbydmjfrpkzdimglltouirkfvimpsjlx ; /usr/bin/python3
Dec 02 07:42:56 np0005541913.localdomain sudo[25199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:56 np0005541913.localdomain python3[25201]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.twl1h7y6tmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:42:56 np0005541913.localdomain sudo[25199]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:56 np0005541913.localdomain sudo[25215]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktvughbypiqnmosjuivfcpcmfzuhnixf ; /usr/bin/python3
Dec 02 07:42:56 np0005541913.localdomain sudo[25215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:57 np0005541913.localdomain python3[25217]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.twl1h7y6tmphosts insertbefore=BOF block=192.168.122.106 np0005541912.localdomain np0005541912
                                                         192.168.122.106 np0005541912.ctlplane.localdomain np0005541912.ctlplane
                                                         192.168.122.107 np0005541913.localdomain np0005541913
                                                         192.168.122.107 np0005541913.ctlplane.localdomain np0005541913.ctlplane
                                                         192.168.122.108 np0005541914.localdomain np0005541914
                                                         192.168.122.108 np0005541914.ctlplane.localdomain np0005541914.ctlplane
                                                         192.168.122.103 np0005541909.localdomain np0005541909
                                                         192.168.122.103 np0005541909.ctlplane.localdomain np0005541909.ctlplane
                                                         192.168.122.104 np0005541910.localdomain np0005541910
                                                         192.168.122.104 np0005541910.ctlplane.localdomain np0005541910.ctlplane
                                                         192.168.122.105 np0005541911.localdomain np0005541911
                                                         192.168.122.105 np0005541911.ctlplane.localdomain np0005541911.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:42:57 np0005541913.localdomain sudo[25215]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:57 np0005541913.localdomain sudo[25231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtvkfejztstqlmqkisozwtigqxlkomkx ; /usr/bin/python3
Dec 02 07:42:57 np0005541913.localdomain sudo[25231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:57 np0005541913.localdomain python3[25233]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.twl1h7y6tmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:42:57 np0005541913.localdomain sudo[25231]: pam_unix(sudo:session): session closed for user root
Dec 02 07:42:58 np0005541913.localdomain sudo[25248]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqanglrfgtgjjlmiurfueauszijklwme ; /usr/bin/python3
Dec 02 07:42:58 np0005541913.localdomain sudo[25248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:42:58 np0005541913.localdomain python3[25250]: ansible-file Invoked with path=/tmp/ansible.twl1h7y6tmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:42:58 np0005541913.localdomain sudo[25248]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:00 np0005541913.localdomain sudo[25264]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bugyrsrnznxwuvdppkuldmfckyzsveid ; /usr/bin/python3
Dec 02 07:43:00 np0005541913.localdomain sudo[25264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:00 np0005541913.localdomain python3[25266]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:43:00 np0005541913.localdomain sudo[25264]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:00 np0005541913.localdomain sudo[25282]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftramvdenygdricphwdkcbyghngtjshk ; /usr/bin/python3
Dec 02 07:43:00 np0005541913.localdomain sudo[25282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:01 np0005541913.localdomain python3[25284]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:43:03 np0005541913.localdomain sudo[25282]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:05 np0005541913.localdomain sudo[25331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iftskirxbrdviiieqetinunpycvmfwyj ; /usr/bin/python3
Dec 02 07:43:05 np0005541913.localdomain sudo[25331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:05 np0005541913.localdomain python3[25333]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:43:05 np0005541913.localdomain sudo[25331]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:05 np0005541913.localdomain sudo[25376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfwwiswzokocdffjsfefznzqmkaxjnhq ; /usr/bin/python3
Dec 02 07:43:05 np0005541913.localdomain sudo[25376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:06 np0005541913.localdomain python3[25378]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764661385.0698972-55051-133166024742249/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:43:06 np0005541913.localdomain sudo[25376]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:07 np0005541913.localdomain sudo[25406]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdbtnriqvascokzdgiuqblubymrwxbab ; /usr/bin/python3
Dec 02 07:43:07 np0005541913.localdomain sudo[25406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:07 np0005541913.localdomain python3[25408]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:43:07 np0005541913.localdomain sudo[25406]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:07 np0005541913.localdomain sudo[25424]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkelulrfcwtwwizdooevlqwmxmqtuzwu ; /usr/bin/python3
Dec 02 07:43:07 np0005541913.localdomain sudo[25424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:08 np0005541913.localdomain python3[25426]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 07:43:08 np0005541913.localdomain chronyd[763]: chronyd exiting
Dec 02 07:43:08 np0005541913.localdomain systemd[1]: Stopping NTP client/server...
Dec 02 07:43:08 np0005541913.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 02 07:43:08 np0005541913.localdomain systemd[1]: Stopped NTP client/server.
Dec 02 07:43:08 np0005541913.localdomain systemd[1]: chronyd.service: Consumed 90ms CPU time, read 1.9M from disk, written 0B to disk.
Dec 02 07:43:08 np0005541913.localdomain systemd[1]: Starting NTP client/server...
Dec 02 07:43:08 np0005541913.localdomain chronyd[25433]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 02 07:43:08 np0005541913.localdomain chronyd[25433]: Frequency -26.710 +/- 0.093 ppm read from /var/lib/chrony/drift
Dec 02 07:43:08 np0005541913.localdomain chronyd[25433]: Loaded seccomp filter (level 2)
Dec 02 07:43:08 np0005541913.localdomain systemd[1]: Started NTP client/server.
Dec 02 07:43:08 np0005541913.localdomain sudo[25424]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:08 np0005541913.localdomain sudo[25480]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uieoelxpwfziqesxyjcvmoooxsqyxgna ; /usr/bin/python3
Dec 02 07:43:08 np0005541913.localdomain sudo[25480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:09 np0005541913.localdomain python3[25482]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:43:09 np0005541913.localdomain sudo[25480]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:09 np0005541913.localdomain sudo[25523]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-augxvvwperjggwfxyyhnurufpmwfanmk ; /usr/bin/python3
Dec 02 07:43:09 np0005541913.localdomain sudo[25523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:09 np0005541913.localdomain python3[25525]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764661388.7061102-55195-200133466110311/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:43:09 np0005541913.localdomain sudo[25523]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:09 np0005541913.localdomain sudo[25553]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfvmjntocklfejtotayplhihthirnqxe ; /usr/bin/python3
Dec 02 07:43:09 np0005541913.localdomain sudo[25553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:09 np0005541913.localdomain python3[25555]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:43:09 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:43:10 np0005541913.localdomain systemd-rc-local-generator[25577]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:43:10 np0005541913.localdomain systemd-sysv-generator[25580]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:43:10 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:43:10 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:43:10 np0005541913.localdomain systemd-rc-local-generator[25617]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:43:10 np0005541913.localdomain systemd-sysv-generator[25621]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:43:10 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:43:10 np0005541913.localdomain systemd[1]: Starting chronyd online sources service...
Dec 02 07:43:10 np0005541913.localdomain chronyc[25631]: 200 OK
Dec 02 07:43:10 np0005541913.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Dec 02 07:43:10 np0005541913.localdomain systemd[1]: Finished chronyd online sources service.
Dec 02 07:43:10 np0005541913.localdomain sudo[25553]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:11 np0005541913.localdomain sudo[25645]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckkiaxeirqcykrffblglcqwbfgljmmma ; /usr/bin/python3
Dec 02 07:43:11 np0005541913.localdomain sudo[25645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:11 np0005541913.localdomain python3[25647]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:43:11 np0005541913.localdomain chronyd[25433]: System clock was stepped by 0.000000 seconds
Dec 02 07:43:11 np0005541913.localdomain sudo[25645]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:11 np0005541913.localdomain sudo[25662]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sihqcicftbpwjymynmdiumnyyxcyjyhs ; /usr/bin/python3
Dec 02 07:43:11 np0005541913.localdomain sudo[25662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:11 np0005541913.localdomain python3[25664]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:43:12 np0005541913.localdomain chronyd[25433]: Selected source 167.160.187.12 (pool.ntp.org)
Dec 02 07:43:18 np0005541913.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 07:43:21 np0005541913.localdomain sudo[25662]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:21 np0005541913.localdomain sudo[25683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otgykrgbfvbppqsgqiwnnbujofljphcn ; /usr/bin/python3
Dec 02 07:43:21 np0005541913.localdomain sudo[25683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:22 np0005541913.localdomain python3[25685]: ansible-timezone Invoked with name=UTC hwclock=None
Dec 02 07:43:22 np0005541913.localdomain systemd[1]: Starting Time & Date Service...
Dec 02 07:43:22 np0005541913.localdomain systemd[1]: Started Time & Date Service.
Dec 02 07:43:22 np0005541913.localdomain sudo[25683]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:22 np0005541913.localdomain sudo[25703]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkkzjtbeyqgfngcgasktcazpmugdnkkk ; /usr/bin/python3
Dec 02 07:43:22 np0005541913.localdomain sudo[25703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:23 np0005541913.localdomain python3[25705]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 07:43:23 np0005541913.localdomain chronyd[25433]: chronyd exiting
Dec 02 07:43:23 np0005541913.localdomain systemd[1]: Stopping NTP client/server...
Dec 02 07:43:23 np0005541913.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 02 07:43:23 np0005541913.localdomain systemd[1]: Stopped NTP client/server.
Dec 02 07:43:23 np0005541913.localdomain systemd[1]: Starting NTP client/server...
Dec 02 07:43:23 np0005541913.localdomain chronyd[25712]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 02 07:43:23 np0005541913.localdomain chronyd[25712]: Frequency -26.710 +/- 0.105 ppm read from /var/lib/chrony/drift
Dec 02 07:43:23 np0005541913.localdomain chronyd[25712]: Loaded seccomp filter (level 2)
Dec 02 07:43:23 np0005541913.localdomain systemd[1]: Started NTP client/server.
Dec 02 07:43:23 np0005541913.localdomain sudo[25703]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:27 np0005541913.localdomain chronyd[25712]: Selected source 51.222.12.92 (pool.ntp.org)
Dec 02 07:43:39 np0005541913.localdomain sudo[25727]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiqdeqqrzqmmfzfheutdkdezfswzysfs ; /usr/bin/python3
Dec 02 07:43:39 np0005541913.localdomain sudo[25727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:39 np0005541913.localdomain useradd[25731]: new group: name=ceph-admin, GID=1002
Dec 02 07:43:39 np0005541913.localdomain useradd[25731]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Dec 02 07:43:39 np0005541913.localdomain sudo[25727]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:39 np0005541913.localdomain sudo[25783]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xafsmtaznxxuurchfxlsjflgmplfslgu ; /usr/bin/python3
Dec 02 07:43:39 np0005541913.localdomain sudo[25783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:39 np0005541913.localdomain sudo[25783]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:40 np0005541913.localdomain sudo[25826]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfocnukgfxyuhjjdqgagngjckhvgnyyx ; /usr/bin/python3
Dec 02 07:43:40 np0005541913.localdomain sudo[25826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:40 np0005541913.localdomain sudo[25826]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:40 np0005541913.localdomain sudo[25856]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iepzicaufyfwlnwtzmxrlugfwyhryfxi ; /usr/bin/python3
Dec 02 07:43:40 np0005541913.localdomain sudo[25856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:40 np0005541913.localdomain sudo[25856]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:41 np0005541913.localdomain sudo[25872]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngvdrshqcsibgxsufkxurvauvyfndeyq ; /usr/bin/python3
Dec 02 07:43:41 np0005541913.localdomain sudo[25872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:41 np0005541913.localdomain sudo[25872]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:41 np0005541913.localdomain sudo[25888]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyxeozkhxbsbadrluldupelbtjyjpujh ; /usr/bin/python3
Dec 02 07:43:41 np0005541913.localdomain sudo[25888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:41 np0005541913.localdomain sudo[25888]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:42 np0005541913.localdomain sudo[25904]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gewqwipdvurmalclgdadpbnqbvisjjgv ; /usr/bin/python3
Dec 02 07:43:42 np0005541913.localdomain sudo[25904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:43:42 np0005541913.localdomain sudo[25904]: pam_unix(sudo:session): session closed for user root
Dec 02 07:43:52 np0005541913.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 02 07:44:48 np0005541913.localdomain sshd[25910]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:44:48 np0005541913.localdomain sshd[25910]: Invalid user user from 78.128.112.74 port 41628
Dec 02 07:44:48 np0005541913.localdomain sshd[25910]: Connection closed by invalid user user 78.128.112.74 port 41628 [preauth]
Dec 02 07:45:24 np0005541913.localdomain sshd[25912]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:24 np0005541913.localdomain sshd[25912]: Accepted publickey for ceph-admin from 192.168.122.103 port 54196 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:24 np0005541913.localdomain systemd-logind[757]: New session 15 of user ceph-admin.
Dec 02 07:45:24 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 1002.
Dec 02 07:45:24 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Dec 02 07:45:24 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Dec 02 07:45:24 np0005541913.localdomain systemd[1]: Starting User Manager for UID 1002...
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:24 np0005541913.localdomain sshd[25930]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Queued start job for default target Main User Target.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Created slice User Application Slice.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Reached target Paths.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Reached target Timers.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Starting D-Bus User Message Bus Socket...
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Starting Create User's Volatile Files and Directories...
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Finished Create User's Volatile Files and Directories.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Listening on D-Bus User Message Bus Socket.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Reached target Sockets.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Reached target Basic System.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Reached target Main User Target.
Dec 02 07:45:24 np0005541913.localdomain systemd[25916]: Startup finished in 115ms.
Dec 02 07:45:24 np0005541913.localdomain systemd[1]: Started User Manager for UID 1002.
Dec 02 07:45:24 np0005541913.localdomain systemd[1]: Started Session 15 of User ceph-admin.
Dec 02 07:45:24 np0005541913.localdomain sshd[25912]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:24 np0005541913.localdomain sshd[25930]: Accepted publickey for ceph-admin from 192.168.122.103 port 54198 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:24 np0005541913.localdomain systemd-logind[757]: New session 17 of user ceph-admin.
Dec 02 07:45:24 np0005541913.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Dec 02 07:45:25 np0005541913.localdomain sshd[25930]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:25 np0005541913.localdomain sudo[25937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:45:25 np0005541913.localdomain sudo[25937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:25 np0005541913.localdomain sudo[25937]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:25 np0005541913.localdomain sshd[25952]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:25 np0005541913.localdomain sshd[25952]: Accepted publickey for ceph-admin from 192.168.122.103 port 54214 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:25 np0005541913.localdomain systemd-logind[757]: New session 18 of user ceph-admin.
Dec 02 07:45:25 np0005541913.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Dec 02 07:45:25 np0005541913.localdomain sshd[25952]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:25 np0005541913.localdomain sudo[25956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005541913.localdomain
Dec 02 07:45:25 np0005541913.localdomain sudo[25956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:25 np0005541913.localdomain sudo[25956]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:25 np0005541913.localdomain sshd[25971]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:25 np0005541913.localdomain sshd[25971]: Accepted publickey for ceph-admin from 192.168.122.103 port 54222 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:25 np0005541913.localdomain systemd-logind[757]: New session 19 of user ceph-admin.
Dec 02 07:45:25 np0005541913.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Dec 02 07:45:25 np0005541913.localdomain sshd[25971]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:25 np0005541913.localdomain sudo[25975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Dec 02 07:45:25 np0005541913.localdomain sudo[25975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:25 np0005541913.localdomain sudo[25975]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:25 np0005541913.localdomain sshd[25990]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:26 np0005541913.localdomain sshd[25990]: Accepted publickey for ceph-admin from 192.168.122.103 port 54236 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:26 np0005541913.localdomain systemd-logind[757]: New session 20 of user ceph-admin.
Dec 02 07:45:26 np0005541913.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Dec 02 07:45:26 np0005541913.localdomain sshd[25990]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:26 np0005541913.localdomain sudo[25994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 07:45:26 np0005541913.localdomain sudo[25994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:26 np0005541913.localdomain sudo[25994]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:26 np0005541913.localdomain sshd[26009]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:26 np0005541913.localdomain sshd[26009]: Accepted publickey for ceph-admin from 192.168.122.103 port 54242 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:26 np0005541913.localdomain systemd-logind[757]: New session 21 of user ceph-admin.
Dec 02 07:45:26 np0005541913.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Dec 02 07:45:26 np0005541913.localdomain sshd[26009]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:26 np0005541913.localdomain sudo[26013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 07:45:26 np0005541913.localdomain sudo[26013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:26 np0005541913.localdomain sudo[26013]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:26 np0005541913.localdomain sshd[26028]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:26 np0005541913.localdomain sshd[26028]: Accepted publickey for ceph-admin from 192.168.122.103 port 54250 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:26 np0005541913.localdomain systemd-logind[757]: New session 22 of user ceph-admin.
Dec 02 07:45:26 np0005541913.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Dec 02 07:45:26 np0005541913.localdomain sshd[26028]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:26 np0005541913.localdomain sudo[26032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Dec 02 07:45:26 np0005541913.localdomain sudo[26032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:26 np0005541913.localdomain sudo[26032]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:27 np0005541913.localdomain sshd[26047]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:27 np0005541913.localdomain sshd[26047]: Accepted publickey for ceph-admin from 192.168.122.103 port 54258 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:27 np0005541913.localdomain systemd-logind[757]: New session 23 of user ceph-admin.
Dec 02 07:45:27 np0005541913.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Dec 02 07:45:27 np0005541913.localdomain sshd[26047]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:27 np0005541913.localdomain sudo[26051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 07:45:27 np0005541913.localdomain sudo[26051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:27 np0005541913.localdomain sudo[26051]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:27 np0005541913.localdomain sshd[26066]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:27 np0005541913.localdomain sshd[26066]: Accepted publickey for ceph-admin from 192.168.122.103 port 54268 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:27 np0005541913.localdomain systemd-logind[757]: New session 24 of user ceph-admin.
Dec 02 07:45:27 np0005541913.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Dec 02 07:45:27 np0005541913.localdomain sshd[26066]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:27 np0005541913.localdomain sudo[26070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Dec 02 07:45:27 np0005541913.localdomain sudo[26070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:27 np0005541913.localdomain sudo[26070]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:27 np0005541913.localdomain sshd[26085]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:27 np0005541913.localdomain sshd[26085]: Accepted publickey for ceph-admin from 192.168.122.103 port 54282 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:27 np0005541913.localdomain systemd-logind[757]: New session 25 of user ceph-admin.
Dec 02 07:45:27 np0005541913.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Dec 02 07:45:27 np0005541913.localdomain sshd[26085]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:28 np0005541913.localdomain sshd[26102]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:28 np0005541913.localdomain sshd[26102]: Accepted publickey for ceph-admin from 192.168.122.103 port 54288 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:28 np0005541913.localdomain systemd-logind[757]: New session 26 of user ceph-admin.
Dec 02 07:45:28 np0005541913.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Dec 02 07:45:28 np0005541913.localdomain sshd[26102]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:28 np0005541913.localdomain sudo[26106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Dec 02 07:45:28 np0005541913.localdomain sudo[26106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:28 np0005541913.localdomain sudo[26106]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:28 np0005541913.localdomain sshd[26121]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:45:28 np0005541913.localdomain sshd[26121]: Accepted publickey for ceph-admin from 192.168.122.103 port 54300 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 07:45:28 np0005541913.localdomain systemd-logind[757]: New session 27 of user ceph-admin.
Dec 02 07:45:28 np0005541913.localdomain systemd[1]: Started Session 27 of User ceph-admin.
Dec 02 07:45:28 np0005541913.localdomain sshd[26121]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 07:45:28 np0005541913.localdomain sudo[26125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005541913.localdomain
Dec 02 07:45:28 np0005541913.localdomain sudo[26125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:45:29 np0005541913.localdomain sudo[26125]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:48 np0005541913.localdomain sudo[26160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:45:48 np0005541913.localdomain sudo[26160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:48 np0005541913.localdomain sudo[26160]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:48 np0005541913.localdomain sudo[26175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:45:48 np0005541913.localdomain sudo[26175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:48 np0005541913.localdomain sudo[26175]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:48 np0005541913.localdomain sudo[26190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 07:45:48 np0005541913.localdomain sudo[26190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:45:48 np0005541913.localdomain sudo[26190]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:49 np0005541913.localdomain sudo[26223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:45:49 np0005541913.localdomain sudo[26223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:49 np0005541913.localdomain sudo[26223]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:49 np0005541913.localdomain sudo[26238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 07:45:49 np0005541913.localdomain sudo[26238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:45:49 np0005541913.localdomain sudo[26238]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:49 np0005541913.localdomain sudo[26293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:45:49 np0005541913.localdomain sudo[26293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:49 np0005541913.localdomain sudo[26293]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:49 np0005541913.localdomain sudo[26308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:45:49 np0005541913.localdomain sudo[26308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:45:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:45:50 np0005541913.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26336 (sysctl)
Dec 02 07:45:50 np0005541913.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 02 07:45:50 np0005541913.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 02 07:45:50 np0005541913.localdomain sudo[26308]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:50 np0005541913.localdomain sudo[26358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:45:50 np0005541913.localdomain sudo[26358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:50 np0005541913.localdomain sudo[26358]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:50 np0005541913.localdomain sudo[26373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 07:45:50 np0005541913.localdomain sudo[26373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:45:50 np0005541913.localdomain sudo[26373]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:51 np0005541913.localdomain sudo[26406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:45:51 np0005541913.localdomain sudo[26406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:51 np0005541913.localdomain sudo[26406]: pam_unix(sudo:session): session closed for user root
Dec 02 07:45:51 np0005541913.localdomain sudo[26421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- inventory --format=json-pretty --filter-for-batch
Dec 02 07:45:51 np0005541913.localdomain sudo[26421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:45:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:45:55 np0005541913.localdomain kernel: VFS: idmapped mount is not enabled.
Dec 02 07:46:16 np0005541913.localdomain podman[26473]: 2025-12-02 07:46:16.655947085 +0000 UTC m=+25.075775424 container create 94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_buck, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, release=1763362218, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, architecture=x86_64, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, io.buildah.version=1.41.4)
Dec 02 07:46:16 np0005541913.localdomain systemd[1]: Created slice Slice /machine.
Dec 02 07:46:16 np0005541913.localdomain systemd[1]: Started libpod-conmon-94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2.scope.
Dec 02 07:46:16 np0005541913.localdomain podman[26473]: 2025-12-02 07:45:51.623462588 +0000 UTC m=+0.043290987 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:46:16 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:46:16 np0005541913.localdomain podman[26473]: 2025-12-02 07:46:16.762123109 +0000 UTC m=+25.181951478 container init 94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_buck, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, ceph=True, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.buildah.version=1.41.4, version=7)
Dec 02 07:46:16 np0005541913.localdomain podman[26473]: 2025-12-02 07:46:16.772951905 +0000 UTC m=+25.192780274 container start 94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_buck, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, RELEASE=main, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Dec 02 07:46:16 np0005541913.localdomain podman[26473]: 2025-12-02 07:46:16.773217155 +0000 UTC m=+25.193045534 container attach 94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_buck, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, RELEASE=main, release=1763362218, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 07:46:16 np0005541913.localdomain happy_buck[26825]: 167 167
Dec 02 07:46:16 np0005541913.localdomain systemd[1]: libpod-94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2.scope: Deactivated successfully.
Dec 02 07:46:16 np0005541913.localdomain podman[26473]: 2025-12-02 07:46:16.777949059 +0000 UTC m=+25.197777438 container died 94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_buck, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.component=rhceph-container)
Dec 02 07:46:16 np0005541913.localdomain podman[26830]: 2025-12-02 07:46:16.845179989 +0000 UTC m=+0.060997064 container remove 94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_buck, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, version=7, vendor=Red Hat, Inc., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 07:46:16 np0005541913.localdomain systemd[1]: libpod-conmon-94b86ab6d6359020d2dc138f650f697a3966db4c57d6de8575c6b91eccb16ed2.scope: Deactivated successfully.
Dec 02 07:46:17 np0005541913.localdomain podman[26852]: 2025-12-02 07:46:17.031363708 +0000 UTC m=+0.043739984 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:46:17 np0005541913.localdomain systemd[1]: tmp-crun.hIEeGd.mount: Deactivated successfully.
Dec 02 07:46:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6d44bcb444b7d11fdc7e0e97e7099f04e6d4b16174b6a6be0898ba0d13d240fb-merged.mount: Deactivated successfully.
Dec 02 07:46:20 np0005541913.localdomain podman[26852]: 2025-12-02 07:46:20.283387774 +0000 UTC m=+3.295764060 container create ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_mestorf, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218)
Dec 02 07:46:21 np0005541913.localdomain systemd[1]: Started libpod-conmon-ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b.scope.
Dec 02 07:46:21 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:46:21 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba515e25921f71f7c6ad0cd1c9f745e2567c058902d3018e6b9f8ec89ac9ebe1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:21 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba515e25921f71f7c6ad0cd1c9f745e2567c058902d3018e6b9f8ec89ac9ebe1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:21 np0005541913.localdomain podman[26852]: 2025-12-02 07:46:21.046991223 +0000 UTC m=+4.059367469 container init ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_mestorf, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, name=rhceph, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, RELEASE=main, distribution-scope=public)
Dec 02 07:46:21 np0005541913.localdomain podman[26852]: 2025-12-02 07:46:21.10322402 +0000 UTC m=+4.115600276 container start ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_mestorf, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, distribution-scope=public)
Dec 02 07:46:21 np0005541913.localdomain podman[26852]: 2025-12-02 07:46:21.103426507 +0000 UTC m=+4.115802753 container attach ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_mestorf, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public)
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]: [
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:     {
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         "available": false,
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         "ceph_device": false,
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         "lsm_data": {},
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         "lvs": [],
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         "path": "/dev/sr0",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         "rejected_reasons": [
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "Has a FileSystem",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "Insufficient space (<5GB)"
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         ],
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         "sys_api": {
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "actuators": null,
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "device_nodes": "sr0",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "human_readable_size": "482.00 KB",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "id_bus": "ata",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "model": "QEMU DVD-ROM",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "nr_requests": "2",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "partitions": {},
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "path": "/dev/sr0",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "removable": "1",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "rev": "2.5+",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "ro": "0",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "rotational": "1",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "sas_address": "",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "sas_device_handle": "",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "scheduler_mode": "mq-deadline",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "sectors": 0,
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "sectorsize": "2048",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "size": 493568.0,
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "support_discard": "0",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "type": "disk",
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:             "vendor": "QEMU"
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:         }
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]:     }
Dec 02 07:46:21 np0005541913.localdomain magical_mestorf[27004]: ]
Dec 02 07:46:21 np0005541913.localdomain systemd[1]: libpod-ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b.scope: Deactivated successfully.
Dec 02 07:46:21 np0005541913.localdomain podman[26852]: 2025-12-02 07:46:21.947669473 +0000 UTC m=+4.960045729 container died ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_mestorf, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container)
Dec 02 07:46:21 np0005541913.localdomain systemd[1]: tmp-crun.d8KfNs.mount: Deactivated successfully.
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ba515e25921f71f7c6ad0cd1c9f745e2567c058902d3018e6b9f8ec89ac9ebe1-merged.mount: Deactivated successfully.
Dec 02 07:46:22 np0005541913.localdomain podman[28390]: 2025-12-02 07:46:22.018417454 +0000 UTC m=+0.065282762 container remove ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_mestorf, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, release=1763362218, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.41.4, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: libpod-conmon-ae053d1a0c166c191e92107c230c5743439471357e49b153457d55f3f725317b.scope: Deactivated successfully.
Dec 02 07:46:22 np0005541913.localdomain sudo[26421]: pam_unix(sudo:session): session closed for user root
Dec 02 07:46:22 np0005541913.localdomain sudo[28402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:46:22 np0005541913.localdomain sudo[28402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:46:22 np0005541913.localdomain sudo[28402]: pam_unix(sudo:session): session closed for user root
Dec 02 07:46:22 np0005541913.localdomain sudo[28417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 _orch set-coredump-overrides --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 --coredump-max-size=32G
Dec 02 07:46:22 np0005541913.localdomain sudo[28417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: Closed Process Core Dump Socket.
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: Stopping Process Core Dump Socket...
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: Listening on Process Core Dump Socket.
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:46:22 np0005541913.localdomain systemd-rc-local-generator[28467]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:46:22 np0005541913.localdomain systemd-sysv-generator[28471]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:46:22 np0005541913.localdomain systemd-sysv-generator[28510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:46:22 np0005541913.localdomain systemd-rc-local-generator[28504]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:46:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:46:23 np0005541913.localdomain sudo[28417]: pam_unix(sudo:session): session closed for user root
Dec 02 07:46:51 np0005541913.localdomain sudo[28519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:46:51 np0005541913.localdomain sudo[28519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:46:51 np0005541913.localdomain sudo[28519]: pam_unix(sudo:session): session closed for user root
Dec 02 07:46:51 np0005541913.localdomain sudo[28534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 07:46:51 np0005541913.localdomain sudo[28534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:46:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:46:52 np0005541913.localdomain podman[28591]: 2025-12-02 07:46:52.162474742 +0000 UTC m=+0.070588344 container create aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_poincare, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., version=7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: Started libpod-conmon-aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f.scope.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:46:52 np0005541913.localdomain podman[28591]: 2025-12-02 07:46:52.228267635 +0000 UTC m=+0.136381237 container init aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_poincare, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, release=1763362218, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Dec 02 07:46:52 np0005541913.localdomain podman[28591]: 2025-12-02 07:46:52.133362029 +0000 UTC m=+0.041475631 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:46:52 np0005541913.localdomain podman[28591]: 2025-12-02 07:46:52.239006708 +0000 UTC m=+0.147120320 container start aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_poincare, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, version=7, name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 07:46:52 np0005541913.localdomain podman[28591]: 2025-12-02 07:46:52.239325946 +0000 UTC m=+0.147439618 container attach aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_poincare, architecture=x86_64, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph)
Dec 02 07:46:52 np0005541913.localdomain inspiring_poincare[28606]: 167 167
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: libpod-aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f.scope: Deactivated successfully.
Dec 02 07:46:52 np0005541913.localdomain podman[28591]: 2025-12-02 07:46:52.242643336 +0000 UTC m=+0.150756958 container died aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_poincare, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 07:46:52 np0005541913.localdomain podman[28611]: 2025-12-02 07:46:52.333962274 +0000 UTC m=+0.077475281 container remove aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_poincare, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.expose-services=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z)
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: libpod-conmon-aace7dbcc756d974daf1759eca0fc9b8dfa02b496836b21add4c426d6a83606f.scope: Deactivated successfully.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:46:52 np0005541913.localdomain systemd-rc-local-generator[28654]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:46:52 np0005541913.localdomain systemd-sysv-generator[28658]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:46:52 np0005541913.localdomain systemd-rc-local-generator[28691]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:46:52 np0005541913.localdomain systemd-sysv-generator[28694]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: Reached target All Ceph clusters and services.
Dec 02 07:46:52 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:46:52 np0005541913.localdomain systemd-rc-local-generator[28727]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:46:52 np0005541913.localdomain systemd-sysv-generator[28730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: Reached target Ceph cluster c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:46:53 np0005541913.localdomain systemd-rc-local-generator[28770]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:46:53 np0005541913.localdomain systemd-sysv-generator[28773]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:46:53 np0005541913.localdomain systemd-rc-local-generator[28805]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:46:53 np0005541913.localdomain systemd-sysv-generator[28812]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: Created slice Slice /system/ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: Reached target System Time Set.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: Reached target System Time Synchronized.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: Starting Ceph crash.np0005541913 for c7c8e171-a193-56fb-95fa-8879fcfa7074...
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:46:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 07:46:53 np0005541913.localdomain podman[28869]: 2025-12-02 07:46:53.972059562 +0000 UTC m=+0.073774240 container create 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, version=7, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True)
Dec 02 07:46:54 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acf37503ebf93858839be6ff8621087588dcfd2cd5142a47fc29d2d0ec632bd4/merged/etc/ceph/ceph.client.crash.np0005541913.keyring supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:54 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acf37503ebf93858839be6ff8621087588dcfd2cd5142a47fc29d2d0ec632bd4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:54 np0005541913.localdomain podman[28869]: 2025-12-02 07:46:53.942203759 +0000 UTC m=+0.043918427 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:46:54 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acf37503ebf93858839be6ff8621087588dcfd2cd5142a47fc29d2d0ec632bd4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:54 np0005541913.localdomain podman[28869]: 2025-12-02 07:46:54.071609275 +0000 UTC m=+0.173323943 container init 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 07:46:54 np0005541913.localdomain podman[28869]: 2025-12-02 07:46:54.0865101 +0000 UTC m=+0.188224768 container start 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z)
Dec 02 07:46:54 np0005541913.localdomain bash[28869]: 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467
Dec 02 07:46:54 np0005541913.localdomain systemd[1]: Started Ceph crash.np0005541913 for c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 07:46:54 np0005541913.localdomain sudo[28534]: pam_unix(sudo:session): session closed for user root
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: INFO:ceph-crash:pinging cluster to exercise our key
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: 2025-12-02T07:46:54.258+0000 7efc164c7640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: 2025-12-02T07:46:54.258+0000 7efc164c7640 -1 AuthRegistry(0x7efc100680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: 2025-12-02T07:46:54.259+0000 7efc164c7640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: 2025-12-02T07:46:54.259+0000 7efc164c7640 -1 AuthRegistry(0x7efc164c6000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: 2025-12-02T07:46:54.266+0000 7efc14a3d640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: 2025-12-02T07:46:54.268+0000 7efc0ffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: 2025-12-02T07:46:54.268+0000 7efc0f7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: 2025-12-02T07:46:54.268+0000 7efc164c7640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec 02 07:46:54 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913[28883]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec 02 07:46:54 np0005541913.localdomain sudo[28890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:46:54 np0005541913.localdomain sudo[28890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:46:54 np0005541913.localdomain sudo[28890]: pam_unix(sudo:session): session closed for user root
Dec 02 07:46:54 np0005541913.localdomain sudo[28915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Dec 02 07:46:54 np0005541913.localdomain sudo[28915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:46:54 np0005541913.localdomain podman[28970]: 2025-12-02 07:46:54.88853021 +0000 UTC m=+0.058818583 container create d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, RELEASE=main, vcs-type=git, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218)
Dec 02 07:46:54 np0005541913.localdomain systemd[1]: Started libpod-conmon-d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd.scope.
Dec 02 07:46:54 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:46:54 np0005541913.localdomain podman[28970]: 2025-12-02 07:46:54.948825804 +0000 UTC m=+0.119114177 container init d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 02 07:46:54 np0005541913.localdomain podman[28970]: 2025-12-02 07:46:54.956317428 +0000 UTC m=+0.126605801 container start d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, version=7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 07:46:54 np0005541913.localdomain podman[28970]: 2025-12-02 07:46:54.956488803 +0000 UTC m=+0.126777206 container attach d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, ceph=True, RELEASE=main, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4)
Dec 02 07:46:54 np0005541913.localdomain quizzical_albattani[28985]: 167 167
Dec 02 07:46:54 np0005541913.localdomain systemd[1]: libpod-d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd.scope: Deactivated successfully.
Dec 02 07:46:54 np0005541913.localdomain podman[28970]: 2025-12-02 07:46:54.960771299 +0000 UTC m=+0.131059702 container died d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 07:46:54 np0005541913.localdomain podman[28970]: 2025-12-02 07:46:54.862567513 +0000 UTC m=+0.032855936 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:46:55 np0005541913.localdomain podman[28990]: 2025-12-02 07:46:55.017996888 +0000 UTC m=+0.048354389 container remove d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, io.buildah.version=1.41.4, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, architecture=x86_64, name=rhceph, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Dec 02 07:46:55 np0005541913.localdomain systemd[1]: libpod-conmon-d0edac2ecf6b5e6c4d1b4ad4d62ddb31075f7c9c9e3e7e880f5328be6d1815fd.scope: Deactivated successfully.
Dec 02 07:46:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8e77ead6f51af6fbaea5426585741a435352adfad94e0763689ddff20894ceee-merged.mount: Deactivated successfully.
Dec 02 07:46:55 np0005541913.localdomain podman[29010]: 2025-12-02 07:46:55.204588811 +0000 UTC m=+0.067070578 container create d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_shockley, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=1763362218, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.buildah.version=1.41.4, GIT_BRANCH=main, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 02 07:46:55 np0005541913.localdomain systemd[1]: Started libpod-conmon-d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045.scope.
Dec 02 07:46:55 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:46:55 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57812ead10bf640953eb7ea3baeeef0a816e07be8ab45bada3efc4ed2639414/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:55 np0005541913.localdomain podman[29010]: 2025-12-02 07:46:55.178912802 +0000 UTC m=+0.041394589 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:46:55 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57812ead10bf640953eb7ea3baeeef0a816e07be8ab45bada3efc4ed2639414/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:55 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57812ead10bf640953eb7ea3baeeef0a816e07be8ab45bada3efc4ed2639414/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:55 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57812ead10bf640953eb7ea3baeeef0a816e07be8ab45bada3efc4ed2639414/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:55 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57812ead10bf640953eb7ea3baeeef0a816e07be8ab45bada3efc4ed2639414/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 02 07:46:55 np0005541913.localdomain podman[29010]: 2025-12-02 07:46:55.324756125 +0000 UTC m=+0.187237882 container init d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_shockley, com.redhat.component=rhceph-container, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, ceph=True, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 07:46:55 np0005541913.localdomain podman[29010]: 2025-12-02 07:46:55.333723449 +0000 UTC m=+0.196205206 container start d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_shockley, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, version=7, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, name=rhceph)
Dec 02 07:46:55 np0005541913.localdomain podman[29010]: 2025-12-02 07:46:55.333932355 +0000 UTC m=+0.196414142 container attach d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_shockley, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=1763362218, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7)
Dec 02 07:46:55 np0005541913.localdomain elastic_shockley[29026]: --> passed data devices: 0 physical, 2 LVM
Dec 02 07:46:55 np0005541913.localdomain elastic_shockley[29026]: --> relative data size: 1.0
Dec 02 07:46:55 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 02 07:46:55 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 79866ec3-47a0-4109-900e-7f4b902017d5
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 02 07:46:56 np0005541913.localdomain lvm[29080]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 02 07:46:56 np0005541913.localdomain lvm[29080]: VG ceph_vg0 finished
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]:  stderr: got monmap epoch 3
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: --> Creating keyring file for osd.0
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Dec 02 07:46:56 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 79866ec3-47a0-4109-900e-7f4b902017d5 --setuser ceph --setgroup ceph
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]:  stderr: 2025-12-02T07:46:56.993+0000 7f0daf5f7a80 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]:  stderr: 2025-12-02T07:46:56.993+0000 7f0daf5f7a80 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: --> ceph-volume lvm activate successful for osd ID: 0
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 02 07:46:59 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 580fd654-ce1e-4384-8610-e58c3d508de1
Dec 02 07:47:00 np0005541913.localdomain lvm[30024]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 02 07:47:00 np0005541913.localdomain lvm[30024]: VG ceph_vg1 finished
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-3
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-3/activate.monmap
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]:  stderr: got monmap epoch 3
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: --> Creating keyring file for osd.3
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/keyring
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/
Dec 02 07:47:00 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 3 --monmap /var/lib/ceph/osd/ceph-3/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-3/ --osd-uuid 580fd654-ce1e-4384-8610-e58c3d508de1 --setuser ceph --setgroup ceph
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]:  stderr: 2025-12-02T07:47:00.818+0000 7f9dab183a80 -1 bluestore(/var/lib/ceph/osd/ceph-3//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]:  stderr: 2025-12-02T07:47:00.818+0000 7f9dab183a80 -1 bluestore(/var/lib/ceph/osd/ceph-3/) _read_fsid unparsable uuid
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: --> ceph-volume lvm activate successful for osd ID: 3
Dec 02 07:47:03 np0005541913.localdomain elastic_shockley[29026]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Dec 02 07:47:03 np0005541913.localdomain systemd[1]: libpod-d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045.scope: Deactivated successfully.
Dec 02 07:47:03 np0005541913.localdomain systemd[1]: libpod-d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045.scope: Consumed 3.679s CPU time.
Dec 02 07:47:03 np0005541913.localdomain podman[30938]: 2025-12-02 07:47:03.493280409 +0000 UTC m=+0.051486573 container died d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_shockley, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, version=7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True)
Dec 02 07:47:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a57812ead10bf640953eb7ea3baeeef0a816e07be8ab45bada3efc4ed2639414-merged.mount: Deactivated successfully.
Dec 02 07:47:03 np0005541913.localdomain podman[30938]: 2025-12-02 07:47:03.521134548 +0000 UTC m=+0.079340652 container remove d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_shockley, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, GIT_CLEAN=True, ceph=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, com.redhat.component=rhceph-container)
Dec 02 07:47:03 np0005541913.localdomain systemd[1]: libpod-conmon-d14bc450c67fbc207c1be6d406e9a476cc2a4e3f77b8e2f121434ee3c8c0f045.scope: Deactivated successfully.
Dec 02 07:47:03 np0005541913.localdomain sudo[28915]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:03 np0005541913.localdomain sudo[30951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:47:03 np0005541913.localdomain sudo[30951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:03 np0005541913.localdomain sudo[30951]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:03 np0005541913.localdomain sudo[30966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- lvm list --format json
Dec 02 07:47:03 np0005541913.localdomain sudo[30966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:04 np0005541913.localdomain podman[31020]: 2025-12-02 07:47:04.167788636 +0000 UTC m=+0.061308512 container create d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_mestorf, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: Started libpod-conmon-d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5.scope.
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:04 np0005541913.localdomain podman[31020]: 2025-12-02 07:47:04.226753532 +0000 UTC m=+0.120273398 container init d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_mestorf, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, ceph=True, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, release=1763362218)
Dec 02 07:47:04 np0005541913.localdomain podman[31020]: 2025-12-02 07:47:04.234242695 +0000 UTC m=+0.127762571 container start d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_mestorf, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, RELEASE=main, distribution-scope=public, release=1763362218)
Dec 02 07:47:04 np0005541913.localdomain podman[31020]: 2025-12-02 07:47:04.234523974 +0000 UTC m=+0.128043900 container attach d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_mestorf, RELEASE=main, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218)
Dec 02 07:47:04 np0005541913.localdomain quirky_mestorf[31036]: 167 167
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: libpod-d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5.scope: Deactivated successfully.
Dec 02 07:47:04 np0005541913.localdomain podman[31020]: 2025-12-02 07:47:04.138296312 +0000 UTC m=+0.031816178 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:04 np0005541913.localdomain podman[31020]: 2025-12-02 07:47:04.237802823 +0000 UTC m=+0.131322759 container died d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_mestorf, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=1763362218, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_CLEAN=True, architecture=x86_64)
Dec 02 07:47:04 np0005541913.localdomain podman[31041]: 2025-12-02 07:47:04.30926866 +0000 UTC m=+0.065766883 container remove d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_mestorf, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: libpod-conmon-d2b30f96796c44295bcf2958aaf620a83a9feee9ab7da15fc6f60ac6109169c5.scope: Deactivated successfully.
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-17f193f8448a675432606997dfbb8f973caa4e078b2083d726cda503b6f61a95-merged.mount: Deactivated successfully.
Dec 02 07:47:04 np0005541913.localdomain podman[31062]: 2025-12-02 07:47:04.516966218 +0000 UTC m=+0.074021297 container create 60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_ritchie, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container)
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: Started libpod-conmon-60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af.scope.
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92053208b7194d38b3f2182affca70eaf9aa9a799a24a7136a6eac170a0dcae/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:04 np0005541913.localdomain podman[31062]: 2025-12-02 07:47:04.487908697 +0000 UTC m=+0.044963876 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92053208b7194d38b3f2182affca70eaf9aa9a799a24a7136a6eac170a0dcae/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92053208b7194d38b3f2182affca70eaf9aa9a799a24a7136a6eac170a0dcae/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:04 np0005541913.localdomain podman[31062]: 2025-12-02 07:47:04.612964204 +0000 UTC m=+0.170019283 container init 60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_ritchie, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: tmp-crun.aeqEDh.mount: Deactivated successfully.
Dec 02 07:47:04 np0005541913.localdomain podman[31062]: 2025-12-02 07:47:04.626186604 +0000 UTC m=+0.183241683 container start 60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_ritchie, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, release=1763362218, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Dec 02 07:47:04 np0005541913.localdomain podman[31062]: 2025-12-02 07:47:04.626556514 +0000 UTC m=+0.183611593 container attach 60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_ritchie, build-date=2025-11-26T19:44:28Z, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=1763362218, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7)
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]: {
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:     "0": [
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:         {
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "devices": [
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "/dev/loop3"
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             ],
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_name": "ceph_lv0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_size": "7511998464",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=ZLSqh9-iILz-8uhj-aKI4-uLLc-tJ4e-DU20Ju,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c7c8e171-a193-56fb-95fa-8879fcfa7074,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=79866ec3-47a0-4109-900e-7f4b902017d5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_uuid": "ZLSqh9-iILz-8uhj-aKI4-uLLc-tJ4e-DU20Ju",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "name": "ceph_lv0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "tags": {
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.block_uuid": "ZLSqh9-iILz-8uhj-aKI4-uLLc-tJ4e-DU20Ju",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.cephx_lockbox_secret": "",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.cluster_fsid": "c7c8e171-a193-56fb-95fa-8879fcfa7074",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.cluster_name": "ceph",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.crush_device_class": "",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.encrypted": "0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.osd_fsid": "79866ec3-47a0-4109-900e-7f4b902017d5",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.osd_id": "0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.type": "block",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.vdo": "0"
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             },
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "type": "block",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "vg_name": "ceph_vg0"
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:         }
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:     ],
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:     "3": [
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:         {
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "devices": [
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "/dev/loop4"
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             ],
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_name": "ceph_lv1",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_size": "7511998464",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=uizRyl-UtyY-UzC3-C8WR-dfjh-VTZH-QMxT4X,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c7c8e171-a193-56fb-95fa-8879fcfa7074,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=580fd654-ce1e-4384-8610-e58c3d508de1,ceph.osd_id=3,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "lv_uuid": "uizRyl-UtyY-UzC3-C8WR-dfjh-VTZH-QMxT4X",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "name": "ceph_lv1",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "tags": {
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.block_uuid": "uizRyl-UtyY-UzC3-C8WR-dfjh-VTZH-QMxT4X",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.cephx_lockbox_secret": "",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.cluster_fsid": "c7c8e171-a193-56fb-95fa-8879fcfa7074",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.cluster_name": "ceph",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.crush_device_class": "",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.encrypted": "0",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.osd_fsid": "580fd654-ce1e-4384-8610-e58c3d508de1",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.osd_id": "3",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.type": "block",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:                 "ceph.vdo": "0"
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             },
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "type": "block",
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:             "vg_name": "ceph_vg1"
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:         }
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]:     ]
Dec 02 07:47:04 np0005541913.localdomain pensive_ritchie[31077]: }
Dec 02 07:47:04 np0005541913.localdomain systemd[1]: libpod-60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af.scope: Deactivated successfully.
Dec 02 07:47:04 np0005541913.localdomain podman[31062]: 2025-12-02 07:47:04.952787762 +0000 UTC m=+0.509842881 container died 60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_ritchie, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_BRANCH=main, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True)
Dec 02 07:47:05 np0005541913.localdomain podman[31086]: 2025-12-02 07:47:05.023071297 +0000 UTC m=+0.062583126 container remove 60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_ritchie, io.openshift.expose-services=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7)
Dec 02 07:47:05 np0005541913.localdomain systemd[1]: libpod-conmon-60cb85d7752ac58b3b50b5b080280d52b4755bb5468209588b6d0da06deb12af.scope: Deactivated successfully.
Dec 02 07:47:05 np0005541913.localdomain sudo[30966]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:05 np0005541913.localdomain sudo[31099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:47:05 np0005541913.localdomain sudo[31099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:05 np0005541913.localdomain sudo[31099]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:05 np0005541913.localdomain sudo[31114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 07:47:05 np0005541913.localdomain sudo[31114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b92053208b7194d38b3f2182affca70eaf9aa9a799a24a7136a6eac170a0dcae-merged.mount: Deactivated successfully.
Dec 02 07:47:05 np0005541913.localdomain podman[31168]: 2025-12-02 07:47:05.760903398 +0000 UTC m=+0.062357290 container create 39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_goldstine, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, RELEASE=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, com.redhat.component=rhceph-container, vcs-type=git)
Dec 02 07:47:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f.scope.
Dec 02 07:47:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:05 np0005541913.localdomain podman[31168]: 2025-12-02 07:47:05.830489154 +0000 UTC m=+0.131943056 container init 39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_goldstine, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, ceph=True, release=1763362218, vendor=Red Hat, Inc., version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 07:47:05 np0005541913.localdomain podman[31168]: 2025-12-02 07:47:05.739477155 +0000 UTC m=+0.040931027 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:05 np0005541913.localdomain systemd[1]: tmp-crun.yAdrR3.mount: Deactivated successfully.
Dec 02 07:47:05 np0005541913.localdomain podman[31168]: 2025-12-02 07:47:05.842840551 +0000 UTC m=+0.144294443 container start 39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_goldstine, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, release=1763362218, name=rhceph, version=7, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7)
Dec 02 07:47:05 np0005541913.localdomain podman[31168]: 2025-12-02 07:47:05.843106178 +0000 UTC m=+0.144560050 container attach 39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_goldstine, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=1763362218, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7)
Dec 02 07:47:05 np0005541913.localdomain wonderful_goldstine[31182]: 167 167
Dec 02 07:47:05 np0005541913.localdomain systemd[1]: libpod-39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f.scope: Deactivated successfully.
Dec 02 07:47:05 np0005541913.localdomain podman[31168]: 2025-12-02 07:47:05.846059508 +0000 UTC m=+0.147513440 container died 39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_goldstine, vendor=Red Hat, Inc., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, release=1763362218, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=rhceph)
Dec 02 07:47:05 np0005541913.localdomain podman[31187]: 2025-12-02 07:47:05.941961081 +0000 UTC m=+0.083967739 container remove 39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_goldstine, release=1763362218, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Dec 02 07:47:05 np0005541913.localdomain systemd[1]: libpod-conmon-39bed095e95bda3337abed181f66e22fb92632d4191ffc5171b124a2574fc17f.scope: Deactivated successfully.
Dec 02 07:47:06 np0005541913.localdomain podman[31216]: 2025-12-02 07:47:06.273272727 +0000 UTC m=+0.070116282 container create 94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Dec 02 07:47:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b.scope.
Dec 02 07:47:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c70add2ddcf31ea6a188a20d36e456d3936dd0b7e56854bd41976e8a86b26a69/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c70add2ddcf31ea6a188a20d36e456d3936dd0b7e56854bd41976e8a86b26a69/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:06 np0005541913.localdomain podman[31216]: 2025-12-02 07:47:06.247344311 +0000 UTC m=+0.044187856 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c70add2ddcf31ea6a188a20d36e456d3936dd0b7e56854bd41976e8a86b26a69/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c70add2ddcf31ea6a188a20d36e456d3936dd0b7e56854bd41976e8a86b26a69/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c70add2ddcf31ea6a188a20d36e456d3936dd0b7e56854bd41976e8a86b26a69/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:06 np0005541913.localdomain podman[31216]: 2025-12-02 07:47:06.382202854 +0000 UTC m=+0.179046399 container init 94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Dec 02 07:47:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4800bc8bec066dbb74d2a6401a155f1e928ad0cc52c02cce9a1b4762d710b134-merged.mount: Deactivated successfully.
Dec 02 07:47:06 np0005541913.localdomain podman[31216]: 2025-12-02 07:47:06.532292784 +0000 UTC m=+0.329136339 container start 94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-type=git, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, version=7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main)
Dec 02 07:47:06 np0005541913.localdomain podman[31216]: 2025-12-02 07:47:06.533282631 +0000 UTC m=+0.330126186 container attach 94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, name=rhceph, distribution-scope=public, io.openshift.expose-services=)
Dec 02 07:47:06 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test[31231]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 02 07:47:06 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test[31231]:                             [--no-systemd] [--no-tmpfs]
Dec 02 07:47:06 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test[31231]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 02 07:47:06 np0005541913.localdomain systemd[1]: libpod-94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b.scope: Deactivated successfully.
Dec 02 07:47:06 np0005541913.localdomain podman[31216]: 2025-12-02 07:47:06.672208596 +0000 UTC m=+0.469052171 container died 94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, RELEASE=main)
Dec 02 07:47:06 np0005541913.localdomain systemd[1]: tmp-crun.Rra5J8.mount: Deactivated successfully.
Dec 02 07:47:06 np0005541913.localdomain systemd-journald[619]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Dec 02 07:47:06 np0005541913.localdomain systemd-journald[619]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 07:47:06 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 07:47:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c70add2ddcf31ea6a188a20d36e456d3936dd0b7e56854bd41976e8a86b26a69-merged.mount: Deactivated successfully.
Dec 02 07:47:06 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 07:47:06 np0005541913.localdomain podman[31236]: 2025-12-02 07:47:06.813961937 +0000 UTC m=+0.132040098 container remove 94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate-test, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, release=1763362218, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Dec 02 07:47:06 np0005541913.localdomain systemd[1]: libpod-conmon-94f280b7b4a48469f40b1e6c35e28cd89a4bf269a9e6e5dd935529160e5c559b.scope: Deactivated successfully.
Dec 02 07:47:07 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:47:07 np0005541913.localdomain systemd-sysv-generator[31292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:47:07 np0005541913.localdomain systemd-rc-local-generator[31288]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:47:07 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:47:07 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:47:07 np0005541913.localdomain systemd-rc-local-generator[31335]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:47:07 np0005541913.localdomain systemd-sysv-generator[31341]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:47:07 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:47:07 np0005541913.localdomain systemd[1]: Starting Ceph osd.0 for c7c8e171-a193-56fb-95fa-8879fcfa7074...
Dec 02 07:47:07 np0005541913.localdomain podman[31397]: 
Dec 02 07:47:07 np0005541913.localdomain podman[31397]: 2025-12-02 07:47:07.870912293 +0000 UTC m=+0.071717465 container create be0399117839d0b075122d6dadb60dd3d9308f657472f6ee566b94db47904bd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_BRANCH=main, architecture=x86_64, vendor=Red Hat, Inc., release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Dec 02 07:47:07 np0005541913.localdomain systemd[1]: tmp-crun.aC0Cbp.mount: Deactivated successfully.
Dec 02 07:47:07 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:07 np0005541913.localdomain podman[31397]: 2025-12-02 07:47:07.842486998 +0000 UTC m=+0.043292230 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:07 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f4780a0ba6841bc2e86e32d453138c911a37f7070f864b5253fa4fdb79eaafe/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:07 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f4780a0ba6841bc2e86e32d453138c911a37f7070f864b5253fa4fdb79eaafe/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:07 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f4780a0ba6841bc2e86e32d453138c911a37f7070f864b5253fa4fdb79eaafe/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:07 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f4780a0ba6841bc2e86e32d453138c911a37f7070f864b5253fa4fdb79eaafe/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:07 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f4780a0ba6841bc2e86e32d453138c911a37f7070f864b5253fa4fdb79eaafe/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:07 np0005541913.localdomain podman[31397]: 2025-12-02 07:47:07.993879414 +0000 UTC m=+0.194684586 container init be0399117839d0b075122d6dadb60dd3d9308f657472f6ee566b94db47904bd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Dec 02 07:47:08 np0005541913.localdomain podman[31397]: 2025-12-02 07:47:08.003970068 +0000 UTC m=+0.204775230 container start be0399117839d0b075122d6dadb60dd3d9308f657472f6ee566b94db47904bd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_CLEAN=True, ceph=True, name=rhceph, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 07:47:08 np0005541913.localdomain podman[31397]: 2025-12-02 07:47:08.004267896 +0000 UTC m=+0.205073108 container attach be0399117839d0b075122d6dadb60dd3d9308f657472f6ee566b94db47904bd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate, RELEASE=main, name=rhceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Dec 02 07:47:08 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate[31413]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 02 07:47:08 np0005541913.localdomain bash[31397]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 02 07:47:08 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate[31413]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec 02 07:47:08 np0005541913.localdomain bash[31397]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec 02 07:47:08 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate[31413]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec 02 07:47:08 np0005541913.localdomain bash[31397]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec 02 07:47:08 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate[31413]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 02 07:47:08 np0005541913.localdomain bash[31397]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 02 07:47:08 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate[31413]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:08 np0005541913.localdomain bash[31397]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:08 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate[31413]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 02 07:47:08 np0005541913.localdomain bash[31397]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 02 07:47:08 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate[31413]: --> ceph-volume raw activate successful for osd ID: 0
Dec 02 07:47:08 np0005541913.localdomain bash[31397]: --> ceph-volume raw activate successful for osd ID: 0
Dec 02 07:47:08 np0005541913.localdomain systemd[1]: libpod-be0399117839d0b075122d6dadb60dd3d9308f657472f6ee566b94db47904bd5.scope: Deactivated successfully.
Dec 02 07:47:08 np0005541913.localdomain podman[31397]: 2025-12-02 07:47:08.673221791 +0000 UTC m=+0.874026973 container died be0399117839d0b075122d6dadb60dd3d9308f657472f6ee566b94db47904bd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, distribution-scope=public)
Dec 02 07:47:08 np0005541913.localdomain podman[31543]: 2025-12-02 07:47:08.754358162 +0000 UTC m=+0.071265572 container remove be0399117839d0b075122d6dadb60dd3d9308f657472f6ee566b94db47904bd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0-activate, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7)
Dec 02 07:47:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0f4780a0ba6841bc2e86e32d453138c911a37f7070f864b5253fa4fdb79eaafe-merged.mount: Deactivated successfully.
Dec 02 07:47:09 np0005541913.localdomain podman[31604]: 
Dec 02 07:47:09 np0005541913.localdomain podman[31604]: 2025-12-02 07:47:09.045485123 +0000 UTC m=+0.070514512 container create 3886dff7ff9b490471697e906f326979cbdf63a2e30cddc8480dbb69bd74263a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git, RELEASE=main, GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Dec 02 07:47:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3457f1da0c93fed37fafc473be3799786fad35963b1e4f95fba7b386a56ef4ea/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:09 np0005541913.localdomain podman[31604]: 2025-12-02 07:47:09.016042281 +0000 UTC m=+0.041071720 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3457f1da0c93fed37fafc473be3799786fad35963b1e4f95fba7b386a56ef4ea/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3457f1da0c93fed37fafc473be3799786fad35963b1e4f95fba7b386a56ef4ea/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3457f1da0c93fed37fafc473be3799786fad35963b1e4f95fba7b386a56ef4ea/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3457f1da0c93fed37fafc473be3799786fad35963b1e4f95fba7b386a56ef4ea/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:09 np0005541913.localdomain podman[31604]: 2025-12-02 07:47:09.168952937 +0000 UTC m=+0.193982286 container init 3886dff7ff9b490471697e906f326979cbdf63a2e30cddc8480dbb69bd74263a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, version=7, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=)
Dec 02 07:47:09 np0005541913.localdomain podman[31604]: 2025-12-02 07:47:09.178495687 +0000 UTC m=+0.203525096 container start 3886dff7ff9b490471697e906f326979cbdf63a2e30cddc8480dbb69bd74263a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0, ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main)
Dec 02 07:47:09 np0005541913.localdomain bash[31604]: 3886dff7ff9b490471697e906f326979cbdf63a2e30cddc8480dbb69bd74263a
Dec 02 07:47:09 np0005541913.localdomain systemd[1]: Started Ceph osd.0 for c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 07:47:09 np0005541913.localdomain sudo[31114]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: set uid:gid to 167:167 (ceph:ceph)
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: pidfile_write: ignore empty --pid-file
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) close
Dec 02 07:47:09 np0005541913.localdomain sudo[31635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:47:09 np0005541913.localdomain sudo[31635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:09 np0005541913.localdomain sudo[31635]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:09 np0005541913.localdomain sudo[31650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 07:47:09 np0005541913.localdomain sudo[31650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) close
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: load: jerasure load: lrc 
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 02 07:47:09 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) close
Dec 02 07:47:10 np0005541913.localdomain podman[31714]: 
Dec 02 07:47:10 np0005541913.localdomain podman[31714]: 2025-12-02 07:47:10.017583267 +0000 UTC m=+0.055400330 container create e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_johnson, io.buildah.version=1.41.4, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, release=1763362218, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) close
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: Started libpod-conmon-e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0.scope.
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:10 np0005541913.localdomain podman[31714]: 2025-12-02 07:47:09.995714411 +0000 UTC m=+0.033531514 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:10 np0005541913.localdomain podman[31714]: 2025-12-02 07:47:10.099820007 +0000 UTC m=+0.137637100 container init e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_johnson, name=rhceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vendor=Red Hat, Inc., release=1763362218, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 07:47:10 np0005541913.localdomain podman[31714]: 2025-12-02 07:47:10.108222926 +0000 UTC m=+0.146040029 container start e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_johnson, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, release=1763362218, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main)
Dec 02 07:47:10 np0005541913.localdomain podman[31714]: 2025-12-02 07:47:10.108435732 +0000 UTC m=+0.146252835 container attach e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_johnson, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, name=rhceph, distribution-scope=public, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main)
Dec 02 07:47:10 np0005541913.localdomain peaceful_johnson[31733]: 167 167
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: libpod-e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0.scope: Deactivated successfully.
Dec 02 07:47:10 np0005541913.localdomain podman[31714]: 2025-12-02 07:47:10.113020247 +0000 UTC m=+0.150837340 container died e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_johnson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.expose-services=, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, distribution-scope=public, name=rhceph)
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7fcfaf8331c0e0e17ef8d2ce0e4e33b8252767c22528a3ca07b06254ca1ef80e-merged.mount: Deactivated successfully.
Dec 02 07:47:10 np0005541913.localdomain podman[31738]: 2025-12-02 07:47:10.194947278 +0000 UTC m=+0.069412142 container remove e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_johnson, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1763362218, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, version=7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: libpod-conmon-e5931ac9c15211a2493a684385e23fac84ef4cb392fab3d6f162ba4b3299a9e0.scope: Deactivated successfully.
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3ae00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs mount
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs mount shared_bdev_used = 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: RocksDB version: 7.9.2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Git sha 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: DB SUMMARY
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: DB Session ID:  RL381G0UN127R7VJTA20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: CURRENT file:  CURRENT
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: IDENTITY file:  IDENTITY
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                         Options.error_if_exists: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.create_if_missing: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                         Options.paranoid_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                                     Options.env: 0x5581cb95fc70
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                                Options.info_log: 0x5581cbad6be0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_file_opening_threads: 16
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                              Options.statistics: (nil)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.use_fsync: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.max_log_file_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                         Options.allow_fallocate: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.use_direct_reads: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.create_missing_column_families: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                              Options.db_log_dir: 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                                 Options.wal_dir: db.wal
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.advise_random_on_open: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.write_buffer_manager: 0x5581cab24140
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                            Options.rate_limiter: (nil)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.unordered_write: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.row_cache: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                              Options.wal_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.allow_ingest_behind: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.two_write_queues: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.manual_wal_flush: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.wal_compression: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.atomic_flush: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.log_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.allow_data_in_errors: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.db_host_id: __hostname__
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.max_background_jobs: 4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.max_background_compactions: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.max_subcompactions: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.max_open_files: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.bytes_per_sync: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.max_background_flushes: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Compression algorithms supported:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kZSTD supported: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kXpressCompression supported: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kBZip2Compression supported: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kLZ4Compression supported: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kZlibCompression supported: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kLZ4HCCompression supported: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kSnappyCompression supported: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab12850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab12850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab12850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab12850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab12850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab12850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab12850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6fc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6fc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbad6fc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: a7df2b79-a8f8-4f57-b14e-09b951f22d3a
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661630327127, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661630327393, "job": 1, "event": "recovery_finished"}
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: freelist init
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: freelist _read_cfg
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs umount
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) close
Dec 02 07:47:10 np0005541913.localdomain podman[31962]: 
Dec 02 07:47:10 np0005541913.localdomain podman[31962]: 2025-12-02 07:47:10.507162835 +0000 UTC m=+0.066097992 container create 1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph)
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: Started libpod-conmon-1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d.scope.
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aeb4002e1e42fbaa3cc2086fd25a5a1d9772e0ff1a5838be07ad7ff814184c2e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aeb4002e1e42fbaa3cc2086fd25a5a1d9772e0ff1a5838be07ad7ff814184c2e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aeb4002e1e42fbaa3cc2086fd25a5a1d9772e0ff1a5838be07ad7ff814184c2e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:10 np0005541913.localdomain podman[31962]: 2025-12-02 07:47:10.48458477 +0000 UTC m=+0.043519947 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aeb4002e1e42fbaa3cc2086fd25a5a1d9772e0ff1a5838be07ad7ff814184c2e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bdev(0x5581cab3b180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs mount
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluefs mount shared_bdev_used = 4718592
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: RocksDB version: 7.9.2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Git sha 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: DB SUMMARY
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: DB Session ID:  RL381G0UN127R7VJTA21
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: CURRENT file:  CURRENT
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: IDENTITY file:  IDENTITY
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                         Options.error_if_exists: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.create_if_missing: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                         Options.paranoid_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                                     Options.env: 0x5581cabc6380
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                                Options.info_log: 0x5581cbb4f640
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_file_opening_threads: 16
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                              Options.statistics: (nil)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.use_fsync: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.max_log_file_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                         Options.allow_fallocate: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.use_direct_reads: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.create_missing_column_families: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                              Options.db_log_dir: 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                                 Options.wal_dir: db.wal
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.advise_random_on_open: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.write_buffer_manager: 0x5581cab25540
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                            Options.rate_limiter: (nil)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.unordered_write: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.row_cache: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                              Options.wal_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.allow_ingest_behind: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.two_write_queues: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.manual_wal_flush: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.wal_compression: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.atomic_flush: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.log_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.allow_data_in_errors: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.db_host_id: __hostname__
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.max_background_jobs: 4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.max_background_compactions: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.max_subcompactions: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.max_open_files: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.bytes_per_sync: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 02 07:47:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aeb4002e1e42fbaa3cc2086fd25a5a1d9772e0ff1a5838be07ad7ff814184c2e/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.max_background_flushes: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Compression algorithms supported:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kZSTD supported: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kXpressCompression supported: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kBZip2Compression supported: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kLZ4Compression supported: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kZlibCompression supported: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kLZ4HCCompression supported: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         kSnappyCompression supported: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4e080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: 
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4e080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4e080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4e080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4e080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain podman[31962]: 2025-12-02 07:47:10.609238136 +0000 UTC m=+0.168173313 container init 1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4e080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain podman[31962]: 2025-12-02 07:47:10.619827224 +0000 UTC m=+0.178762381 container start 1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4e080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab122d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain podman[31962]: 2025-12-02 07:47:10.620737239 +0000 UTC m=+0.179672416 container attach 1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test, name=rhceph, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_BRANCH=main, vendor=Red Hat, Inc.)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4f2c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab13610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4f2c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab13610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5581cbb4f2c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5581cab13610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: a7df2b79-a8f8-4f57-b14e-09b951f22d3a
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661630622344, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661630628467, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764661630, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a7df2b79-a8f8-4f57-b14e-09b951f22d3a", "db_session_id": "RL381G0UN127R7VJTA21", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661630632431, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764661630, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a7df2b79-a8f8-4f57-b14e-09b951f22d3a", "db_session_id": "RL381G0UN127R7VJTA21", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661630636350, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764661630, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a7df2b79-a8f8-4f57-b14e-09b951f22d3a", "db_session_id": "RL381G0UN127R7VJTA21", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661630640263, "job": 1, "event": "recovery_finished"}
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5581cabd8380
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: DB pointer 0x5581cba33a00
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: _get_class not permitted to load lua
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: _get_class not permitted to load sdk
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: _get_class not permitted to load test_remote_reads
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: osd.0 0 load_pgs
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: osd.0 0 load_pgs opened 0 pgs
Dec 02 07:47:10 np0005541913.localdomain ceph-osd[31622]: osd.0 0 log_to_monitors true
Dec 02 07:47:10 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0[31618]: 2025-12-02T07:47:10.674+0000 7f80a00e9a80 -1 osd.0 0 log_to_monitors true
Dec 02 07:47:10 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test[31977]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 02 07:47:10 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test[31977]:                             [--no-systemd] [--no-tmpfs]
Dec 02 07:47:10 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test[31977]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: libpod-1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d.scope: Deactivated successfully.
Dec 02 07:47:10 np0005541913.localdomain podman[31962]: 2025-12-02 07:47:10.841838142 +0000 UTC m=+0.400773329 container died 1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=1763362218, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.)
Dec 02 07:47:10 np0005541913.localdomain podman[32198]: 2025-12-02 07:47:10.918380777 +0000 UTC m=+0.065636658 container remove 1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate-test, description=Red Hat Ceph Storage 7, architecture=x86_64, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=rhceph, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 07:47:10 np0005541913.localdomain systemd[1]: libpod-conmon-1ee6409ad0c389872f4246af8672e412adf06162189a0e98e03b6f35d069a11d.scope: Deactivated successfully.
Dec 02 07:47:11 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:47:11 np0005541913.localdomain systemd-sysv-generator[32255]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:47:11 np0005541913.localdomain systemd-rc-local-generator[32251]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:47:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:47:11 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:47:11 np0005541913.localdomain systemd-rc-local-generator[32294]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:47:11 np0005541913.localdomain systemd-sysv-generator[32299]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:47:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:47:11 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 02 07:47:11 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 02 07:47:11 np0005541913.localdomain systemd[1]: Starting Ceph osd.3 for c7c8e171-a193-56fb-95fa-8879fcfa7074...
Dec 02 07:47:12 np0005541913.localdomain podman[32359]: 
Dec 02 07:47:12 np0005541913.localdomain podman[32359]: 2025-12-02 07:47:12.066604669 +0000 UTC m=+0.070488331 container create 3a41e61c6b1328ff22fb256c600bf665043ad9bde1cbbf1766e9a54f0124238e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, version=7)
Dec 02 07:47:12 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74262ca50bbe4ea2feed3107115f357b8ff80a7f451ed5d28b11548cbd9ebcf9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:12 np0005541913.localdomain podman[32359]: 2025-12-02 07:47:12.038293598 +0000 UTC m=+0.042177270 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74262ca50bbe4ea2feed3107115f357b8ff80a7f451ed5d28b11548cbd9ebcf9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74262ca50bbe4ea2feed3107115f357b8ff80a7f451ed5d28b11548cbd9ebcf9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74262ca50bbe4ea2feed3107115f357b8ff80a7f451ed5d28b11548cbd9ebcf9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74262ca50bbe4ea2feed3107115f357b8ff80a7f451ed5d28b11548cbd9ebcf9/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:12 np0005541913.localdomain podman[32359]: 2025-12-02 07:47:12.179900206 +0000 UTC m=+0.183783858 container init 3a41e61c6b1328ff22fb256c600bf665043ad9bde1cbbf1766e9a54f0124238e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, release=1763362218, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 07:47:12 np0005541913.localdomain podman[32359]: 2025-12-02 07:47:12.190136604 +0000 UTC m=+0.194020266 container start 3a41e61c6b1328ff22fb256c600bf665043ad9bde1cbbf1766e9a54f0124238e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vendor=Red Hat, Inc., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph)
Dec 02 07:47:12 np0005541913.localdomain podman[32359]: 2025-12-02 07:47:12.190467114 +0000 UTC m=+0.194350766 container attach 3a41e61c6b1328ff22fb256c600bf665043ad9bde1cbbf1766e9a54f0124238e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z)
Dec 02 07:47:12 np0005541913.localdomain ceph-osd[31622]: osd.0 0 done with init, starting boot process
Dec 02 07:47:12 np0005541913.localdomain ceph-osd[31622]: osd.0 0 start_boot
Dec 02 07:47:12 np0005541913.localdomain ceph-osd[31622]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 02 07:47:12 np0005541913.localdomain ceph-osd[31622]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 02 07:47:12 np0005541913.localdomain ceph-osd[31622]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 02 07:47:12 np0005541913.localdomain ceph-osd[31622]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 02 07:47:12 np0005541913.localdomain ceph-osd[31622]: osd.0 0  bench count 12288000 bsize 4 KiB
Dec 02 07:47:12 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate[32373]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Dec 02 07:47:12 np0005541913.localdomain bash[32359]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Dec 02 07:47:12 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate[32373]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec 02 07:47:12 np0005541913.localdomain bash[32359]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec 02 07:47:12 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate[32373]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec 02 07:47:12 np0005541913.localdomain bash[32359]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec 02 07:47:12 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate[32373]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 02 07:47:12 np0005541913.localdomain bash[32359]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 02 07:47:12 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate[32373]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:12 np0005541913.localdomain bash[32359]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:12 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate[32373]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Dec 02 07:47:12 np0005541913.localdomain bash[32359]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Dec 02 07:47:12 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate[32373]: --> ceph-volume raw activate successful for osd ID: 3
Dec 02 07:47:12 np0005541913.localdomain bash[32359]: --> ceph-volume raw activate successful for osd ID: 3
Dec 02 07:47:12 np0005541913.localdomain systemd[1]: libpod-3a41e61c6b1328ff22fb256c600bf665043ad9bde1cbbf1766e9a54f0124238e.scope: Deactivated successfully.
Dec 02 07:47:12 np0005541913.localdomain podman[32504]: 2025-12-02 07:47:12.914370126 +0000 UTC m=+0.054702562 container died 3a41e61c6b1328ff22fb256c600bf665043ad9bde1cbbf1766e9a54f0124238e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 07:47:13 np0005541913.localdomain podman[32504]: 2025-12-02 07:47:13.044270194 +0000 UTC m=+0.184602600 container remove 3a41e61c6b1328ff22fb256c600bf665043ad9bde1cbbf1766e9a54f0124238e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3-activate, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, RELEASE=main, name=rhceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Dec 02 07:47:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-74262ca50bbe4ea2feed3107115f357b8ff80a7f451ed5d28b11548cbd9ebcf9-merged.mount: Deactivated successfully.
Dec 02 07:47:13 np0005541913.localdomain podman[32564]: 
Dec 02 07:47:13 np0005541913.localdomain podman[32564]: 2025-12-02 07:47:13.281749234 +0000 UTC m=+0.070790949 container create effc5649c674e91178ff79d0d995136974a324018af8217643cd4efac175683e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True)
Dec 02 07:47:13 np0005541913.localdomain podman[32564]: 2025-12-02 07:47:13.242780152 +0000 UTC m=+0.031821907 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:13 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6545d2ae4e904d25466047e0fcd96d182ed72298d5730c999cd5c073a92284b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:13 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6545d2ae4e904d25466047e0fcd96d182ed72298d5730c999cd5c073a92284b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:13 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6545d2ae4e904d25466047e0fcd96d182ed72298d5730c999cd5c073a92284b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:13 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6545d2ae4e904d25466047e0fcd96d182ed72298d5730c999cd5c073a92284b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:13 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6545d2ae4e904d25466047e0fcd96d182ed72298d5730c999cd5c073a92284b/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:13 np0005541913.localdomain podman[32564]: 2025-12-02 07:47:13.449135445 +0000 UTC m=+0.238177160 container init effc5649c674e91178ff79d0d995136974a324018af8217643cd4efac175683e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main)
Dec 02 07:47:13 np0005541913.localdomain podman[32564]: 2025-12-02 07:47:13.481134266 +0000 UTC m=+0.270176061 container start effc5649c674e91178ff79d0d995136974a324018af8217643cd4efac175683e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3, io.buildah.version=1.41.4, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1763362218, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main)
Dec 02 07:47:13 np0005541913.localdomain bash[32564]: effc5649c674e91178ff79d0d995136974a324018af8217643cd4efac175683e
Dec 02 07:47:13 np0005541913.localdomain systemd[1]: Started Ceph osd.3 for c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: set uid:gid to 167:167 (ceph:ceph)
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: pidfile_write: ignore empty --pid-file
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) close
Dec 02 07:47:13 np0005541913.localdomain sudo[31650]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:13 np0005541913.localdomain sudo[32595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:47:13 np0005541913.localdomain sudo[32595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:13 np0005541913.localdomain sudo[32595]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:13 np0005541913.localdomain sudo[32610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- raw list --format json
Dec 02 07:47:13 np0005541913.localdomain sudo[32610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:13 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) close
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: starting osd.3 osd_data /var/lib/ceph/osd/ceph-3 /var/lib/ceph/osd/ceph-3/journal
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: load: jerasure load: lrc 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) close
Dec 02 07:47:14 np0005541913.localdomain podman[32674]: 
Dec 02 07:47:14 np0005541913.localdomain podman[32674]: 2025-12-02 07:47:14.276475864 +0000 UTC m=+0.072009863 container create 732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_ganguly, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, RELEASE=main)
Dec 02 07:47:14 np0005541913.localdomain systemd[1]: Started libpod-conmon-732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d.scope.
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) close
Dec 02 07:47:14 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:14 np0005541913.localdomain podman[32674]: 2025-12-02 07:47:14.249019666 +0000 UTC m=+0.044553635 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:14 np0005541913.localdomain podman[32674]: 2025-12-02 07:47:14.377082945 +0000 UTC m=+0.172616914 container init 732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_ganguly, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, name=rhceph, release=1763362218, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main)
Dec 02 07:47:14 np0005541913.localdomain quirky_ganguly[32687]: 167 167
Dec 02 07:47:14 np0005541913.localdomain systemd[1]: libpod-732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d.scope: Deactivated successfully.
Dec 02 07:47:14 np0005541913.localdomain podman[32674]: 2025-12-02 07:47:14.407777152 +0000 UTC m=+0.203311131 container start 732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_ganguly, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, release=1763362218)
Dec 02 07:47:14 np0005541913.localdomain podman[32674]: 2025-12-02 07:47:14.408338057 +0000 UTC m=+0.203872036 container attach 732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_ganguly, io.buildah.version=1.41.4, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, name=rhceph, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True)
Dec 02 07:47:14 np0005541913.localdomain podman[32674]: 2025-12-02 07:47:14.411077501 +0000 UTC m=+0.206611470 container died 732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_ganguly, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 02 07:47:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b5cfded5dd0d470e991bef72eeebcd268ef284defdab52262bd3a7aa992ff5c8-merged.mount: Deactivated successfully.
Dec 02 07:47:14 np0005541913.localdomain podman[32696]: 2025-12-02 07:47:14.515069054 +0000 UTC m=+0.111717784 container remove 732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_ganguly, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, RELEASE=main, architecture=x86_64)
Dec 02 07:47:14 np0005541913.localdomain systemd[1]: libpod-conmon-732b4cb93825d980d158b91aa225db1c98b5052aebe04cf979d8b639f8c78c3d.scope: Deactivated successfully.
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: osd.3:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfae00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs mount
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs mount shared_bdev_used = 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: RocksDB version: 7.9.2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Git sha 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: DB SUMMARY
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: DB Session ID:  FHL87VSJHB1TI1XONKXI
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: CURRENT file:  CURRENT
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: IDENTITY file:  IDENTITY
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                         Options.error_if_exists: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.create_if_missing: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                         Options.paranoid_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                                     Options.env: 0x56524408ec40
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                                Options.info_log: 0x565244d9c780
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_file_opening_threads: 16
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                              Options.statistics: (nil)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.use_fsync: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.max_log_file_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                         Options.allow_fallocate: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.use_direct_reads: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.create_missing_column_families: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                              Options.db_log_dir: 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                                 Options.wal_dir: db.wal
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.advise_random_on_open: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.write_buffer_manager: 0x565243de4140
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                            Options.rate_limiter: (nil)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.unordered_write: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.row_cache: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                              Options.wal_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.allow_ingest_behind: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.two_write_queues: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.manual_wal_flush: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.wal_compression: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.atomic_flush: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.log_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.allow_data_in_errors: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.db_host_id: __hostname__
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.max_background_jobs: 4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.max_background_compactions: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.max_subcompactions: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.max_open_files: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.bytes_per_sync: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.max_background_flushes: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Compression algorithms supported:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kZSTD supported: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kXpressCompression supported: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kBZip2Compression supported: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kLZ4Compression supported: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kZlibCompression supported: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kLZ4HCCompression supported: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kSnappyCompression supported: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd2850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd2850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd2850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd2850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd2850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd2850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd2850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cb60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cb60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cb60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fce38134-5a74-433d-a8c4-f491f68a5a3b
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661634645380, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661634645555, "job": 1, "event": "recovery_finished"}
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old nid_max 1025
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old blobid_max 10240
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta min_alloc_size 0x1000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: freelist init
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: freelist _read_cfg
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs umount
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) close
Dec 02 07:47:14 np0005541913.localdomain podman[32910]: 
Dec 02 07:47:14 np0005541913.localdomain podman[32910]: 2025-12-02 07:47:14.726471504 +0000 UTC m=+0.069764461 container create 0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_khayyam, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, architecture=x86_64, release=1763362218, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True)
Dec 02 07:47:14 np0005541913.localdomain systemd[1]: Started libpod-conmon-0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68.scope.
Dec 02 07:47:14 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:14 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6d319a04eb84dc57153a6a551fb3dd77eaba1c3870937f82853f3d915426996/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:14 np0005541913.localdomain podman[32910]: 2025-12-02 07:47:14.693741432 +0000 UTC m=+0.037034369 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:14 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6d319a04eb84dc57153a6a551fb3dd77eaba1c3870937f82853f3d915426996/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:14 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6d319a04eb84dc57153a6a551fb3dd77eaba1c3870937f82853f3d915426996/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:14 np0005541913.localdomain podman[32910]: 2025-12-02 07:47:14.829217384 +0000 UTC m=+0.172510361 container init 0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_khayyam, ceph=True, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 07:47:14 np0005541913.localdomain podman[32910]: 2025-12-02 07:47:14.861634346 +0000 UTC m=+0.204927323 container start 0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_khayyam, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=)
Dec 02 07:47:14 np0005541913.localdomain podman[32910]: 2025-12-02 07:47:14.862010676 +0000 UTC m=+0.205303653 container attach 0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_khayyam, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, version=7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bdev(0x565243dfb180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs mount
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluefs mount shared_bdev_used = 4718592
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: RocksDB version: 7.9.2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Git sha 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: DB SUMMARY
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: DB Session ID:  FHL87VSJHB1TI1XONKXJ
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: CURRENT file:  CURRENT
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: IDENTITY file:  IDENTITY
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                         Options.error_if_exists: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.create_if_missing: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                         Options.paranoid_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                                     Options.env: 0x565243f20a80
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                                Options.info_log: 0x565243f0e460
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_file_opening_threads: 16
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                              Options.statistics: (nil)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.use_fsync: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.max_log_file_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                         Options.allow_fallocate: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.use_direct_reads: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.create_missing_column_families: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                              Options.db_log_dir: 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                                 Options.wal_dir: db.wal
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.advise_random_on_open: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.write_buffer_manager: 0x565243de4140
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                            Options.rate_limiter: (nil)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.unordered_write: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.row_cache: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                              Options.wal_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.allow_ingest_behind: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.two_write_queues: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.manual_wal_flush: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.wal_compression: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.atomic_flush: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.log_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.allow_data_in_errors: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.db_host_id: __hostname__
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.max_background_jobs: 4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.max_background_compactions: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.max_subcompactions: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.max_open_files: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.bytes_per_sync: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.max_background_flushes: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Compression algorithms supported:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kZSTD supported: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kXpressCompression supported: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kBZip2Compression supported: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kLZ4Compression supported: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kZlibCompression supported: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kLZ4HCCompression supported: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         kSnappyCompression supported: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cd00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: 
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cd00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cd00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cd00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cd00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cd00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565244d9cd00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd22d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565243f0e820)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd3610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565243f0e820)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd3610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:           Options.merge_operator: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x565243f0e820)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x565243dd3610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.write_buffer_size: 16777216
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.max_write_buffer_number: 64
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.compression: LZ4
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.num_levels: 7
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.bloom_locality: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                               Options.ttl: 2592000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                       Options.enable_blob_files: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                           Options.min_blob_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fce38134-5a74-433d-a8c4-f491f68a5a3b
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661634901542, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661634922225, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764661634, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fce38134-5a74-433d-a8c4-f491f68a5a3b", "db_session_id": "FHL87VSJHB1TI1XONKXJ", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661634926404, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1604, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 463, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764661634, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fce38134-5a74-433d-a8c4-f491f68a5a3b", "db_session_id": "FHL87VSJHB1TI1XONKXJ", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661634955485, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764661634, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fce38134-5a74-433d-a8c4-f491f68a5a3b", "db_session_id": "FHL87VSJHB1TI1XONKXJ", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764661634961041, "job": 1, "event": "recovery_finished"}
Dec 02 07:47:14 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x565243e3a700
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: DB pointer 0x565244cf3a00
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super from 4, latest 4
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super done
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: _get_class not permitted to load lua
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: _get_class not permitted to load sdk
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: _get_class not permitted to load test_remote_reads
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: osd.3 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: osd.3 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: osd.3 0 load_pgs
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: osd.3 0 load_pgs opened 0 pgs
Dec 02 07:47:15 np0005541913.localdomain ceph-osd[32582]: osd.3 0 log_to_monitors true
Dec 02 07:47:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3[32578]: 2025-12-02T07:47:15.055+0000 7f6818a06a80 -1 osd.3 0 log_to_monitors true
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]: {
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:     "580fd654-ce1e-4384-8610-e58c3d508de1": {
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "ceph_fsid": "c7c8e171-a193-56fb-95fa-8879fcfa7074",
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "osd_id": 3,
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "osd_uuid": "580fd654-ce1e-4384-8610-e58c3d508de1",
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "type": "bluestore"
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:     },
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:     "79866ec3-47a0-4109-900e-7f4b902017d5": {
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "ceph_fsid": "c7c8e171-a193-56fb-95fa-8879fcfa7074",
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "osd_id": 0,
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "osd_uuid": "79866ec3-47a0-4109-900e-7f4b902017d5",
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:         "type": "bluestore"
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]:     }
Dec 02 07:47:15 np0005541913.localdomain condescending_khayyam[32926]: }
Dec 02 07:47:15 np0005541913.localdomain systemd[1]: libpod-0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68.scope: Deactivated successfully.
Dec 02 07:47:15 np0005541913.localdomain podman[32910]: 2025-12-02 07:47:15.409434911 +0000 UTC m=+0.752727928 container died 0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_khayyam, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 07:47:15 np0005541913.localdomain systemd[1]: tmp-crun.jIHSbu.mount: Deactivated successfully.
Dec 02 07:47:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f6d319a04eb84dc57153a6a551fb3dd77eaba1c3870937f82853f3d915426996-merged.mount: Deactivated successfully.
Dec 02 07:47:15 np0005541913.localdomain podman[33177]: 2025-12-02 07:47:15.516537039 +0000 UTC m=+0.097457557 container remove 0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_khayyam, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 07:47:15 np0005541913.localdomain systemd[1]: libpod-conmon-0691cc376268e2dcc04c0bc8a30996bde72c3b7924fd317ec7fa2af03d4d7e68.scope: Deactivated successfully.
Dec 02 07:47:15 np0005541913.localdomain sudo[32610]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 25.116 iops: 6429.754 elapsed_sec: 0.467
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [WRN] : OSD bench result of 6429.754338 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 0 waiting for initial osdmap
Dec 02 07:47:16 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0[31618]: 2025-12-02T07:47:16.157+0000 7f809c87d640 -1 osd.0 0 waiting for initial osdmap
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 12 crush map has features 288514050185494528, adjusting msgr requires for clients
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 12 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 12 crush map has features 3314932999778484224, adjusting msgr requires for osds
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 12 check_osdmap_features require_osd_release unknown -> reef
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 12 set_numa_affinity not setting numa affinity
Dec 02 07:47:16 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-0[31618]: 2025-12-02T07:47:16.173+0000 7f8097692640 -1 osd.0 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 02 07:47:16 np0005541913.localdomain ceph-osd[31622]: osd.0 12 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Dec 02 07:47:17 np0005541913.localdomain ceph-osd[31622]: osd.0 13 state: booting -> active
Dec 02 07:47:17 np0005541913.localdomain ceph-osd[32582]: osd.3 0 done with init, starting boot process
Dec 02 07:47:17 np0005541913.localdomain ceph-osd[32582]: osd.3 0 start_boot
Dec 02 07:47:17 np0005541913.localdomain ceph-osd[32582]: osd.3 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 02 07:47:17 np0005541913.localdomain ceph-osd[32582]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 02 07:47:17 np0005541913.localdomain ceph-osd[32582]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 02 07:47:17 np0005541913.localdomain ceph-osd[32582]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 02 07:47:17 np0005541913.localdomain ceph-osd[32582]: osd.3 0  bench count 12288000 bsize 4 KiB
Dec 02 07:47:17 np0005541913.localdomain sudo[33192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:47:17 np0005541913.localdomain sudo[33192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:17 np0005541913.localdomain sudo[33192]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:17 np0005541913.localdomain sudo[33207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:47:17 np0005541913.localdomain sudo[33207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:17 np0005541913.localdomain sudo[33207]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:17 np0005541913.localdomain sudo[33222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 07:47:17 np0005541913.localdomain sudo[33222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:18 np0005541913.localdomain podman[33308]: 2025-12-02 07:47:18.5657752 +0000 UTC m=+0.090248029 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.41.4, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7)
Dec 02 07:47:18 np0005541913.localdomain podman[33308]: 2025-12-02 07:47:18.696750419 +0000 UTC m=+0.221223238 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.41.4, vcs-type=git, release=1763362218, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main)
Dec 02 07:47:18 np0005541913.localdomain sudo[33222]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:19 np0005541913.localdomain sudo[33372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:47:19 np0005541913.localdomain sudo[33372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:19 np0005541913.localdomain sudo[33372]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:19 np0005541913.localdomain sudo[33387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:47:19 np0005541913.localdomain sudo[33387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:19 np0005541913.localdomain sudo[33387]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:20 np0005541913.localdomain sudo[33433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:47:20 np0005541913.localdomain sudo[33433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:20 np0005541913.localdomain sudo[33433]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:20 np0005541913.localdomain sudo[33448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- inventory --format=json-pretty --filter-for-batch
Dec 02 07:47:20 np0005541913.localdomain sudo[33448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[31622]: osd.0 16 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[31622]: osd.0 16 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[31622]: osd.0 16 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 02 07:47:20 np0005541913.localdomain podman[33499]: 
Dec 02 07:47:20 np0005541913.localdomain podman[33499]: 2025-12-02 07:47:20.557432601 +0000 UTC m=+0.057123337 container create c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_nash, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public)
Dec 02 07:47:20 np0005541913.localdomain systemd[1]: Started libpod-conmon-c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8.scope.
Dec 02 07:47:20 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:20 np0005541913.localdomain podman[33499]: 2025-12-02 07:47:20.618285569 +0000 UTC m=+0.117976275 container init c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_nash, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph)
Dec 02 07:47:20 np0005541913.localdomain podman[33499]: 2025-12-02 07:47:20.623912762 +0000 UTC m=+0.123603468 container start c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_nash, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 07:47:20 np0005541913.localdomain podman[33499]: 2025-12-02 07:47:20.624098037 +0000 UTC m=+0.123788793 container attach c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_nash, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, build-date=2025-11-26T19:44:28Z, release=1763362218, io.buildah.version=1.41.4, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Dec 02 07:47:20 np0005541913.localdomain dreamy_nash[33514]: 167 167
Dec 02 07:47:20 np0005541913.localdomain systemd[1]: libpod-c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8.scope: Deactivated successfully.
Dec 02 07:47:20 np0005541913.localdomain podman[33499]: 2025-12-02 07:47:20.628148698 +0000 UTC m=+0.127839464 container died c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_nash, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Dec 02 07:47:20 np0005541913.localdomain podman[33499]: 2025-12-02 07:47:20.53057905 +0000 UTC m=+0.030269756 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:20 np0005541913.localdomain systemd[1]: tmp-crun.tb91cZ.mount: Deactivated successfully.
Dec 02 07:47:20 np0005541913.localdomain podman[33519]: 2025-12-02 07:47:20.706385609 +0000 UTC m=+0.065719681 container remove c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_nash, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 25.541 iops: 6538.433 elapsed_sec: 0.459
Dec 02 07:47:20 np0005541913.localdomain systemd[1]: libpod-conmon-c54287c3e6079dc8824e725de759ac7890775e3173baec7c281b45046bccd6c8.scope: Deactivated successfully.
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [WRN] : OSD bench result of 6538.432602 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.3. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 0 waiting for initial osdmap
Dec 02 07:47:20 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3[32578]: 2025-12-02T07:47:20.707+0000 7f6814985640 -1 osd.3 0 waiting for initial osdmap
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 16 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 16 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 16 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 16 check_osdmap_features require_osd_release unknown -> reef
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 16 set_numa_affinity not setting numa affinity
Dec 02 07:47:20 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-osd-3[32578]: 2025-12-02T07:47:20.727+0000 7f680ffaf640 -1 osd.3 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 02 07:47:20 np0005541913.localdomain ceph-osd[32582]: osd.3 16 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Dec 02 07:47:20 np0005541913.localdomain podman[33540]: 
Dec 02 07:47:20 np0005541913.localdomain podman[33540]: 2025-12-02 07:47:20.867110628 +0000 UTC m=+0.072224899 container create 9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_swanson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1763362218, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True)
Dec 02 07:47:20 np0005541913.localdomain systemd[1]: Started libpod-conmon-9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946.scope.
Dec 02 07:47:20 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 07:47:20 np0005541913.localdomain podman[33540]: 2025-12-02 07:47:20.83783808 +0000 UTC m=+0.042952411 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 07:47:20 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60b3f922765012848beb9047fd6f99789646bd916897a8ada6d5311e94b7a5f7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:20 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60b3f922765012848beb9047fd6f99789646bd916897a8ada6d5311e94b7a5f7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:20 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60b3f922765012848beb9047fd6f99789646bd916897a8ada6d5311e94b7a5f7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 07:47:20 np0005541913.localdomain podman[33540]: 2025-12-02 07:47:20.974510464 +0000 UTC m=+0.179624745 container init 9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_swanson, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_CLEAN=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 07:47:20 np0005541913.localdomain podman[33540]: 2025-12-02 07:47:20.983779746 +0000 UTC m=+0.188894027 container start 9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_swanson, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, release=1763362218, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 07:47:20 np0005541913.localdomain podman[33540]: 2025-12-02 07:47:20.984839306 +0000 UTC m=+0.189953577 container attach 9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_swanson, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, version=7, name=rhceph, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Dec 02 07:47:21 np0005541913.localdomain ceph-osd[32582]: osd.3 17 state: booting -> active
Dec 02 07:47:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8a69e68136ced8377d960a81f292b992785ccdddd6f62f8a188f5c9221a01ccc-merged.mount: Deactivated successfully.
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]: [
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:     {
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         "available": false,
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         "ceph_device": false,
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         "lsm_data": {},
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         "lvs": [],
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         "path": "/dev/sr0",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         "rejected_reasons": [
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "Has a FileSystem",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "Insufficient space (<5GB)"
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         ],
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         "sys_api": {
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "actuators": null,
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "device_nodes": "sr0",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "human_readable_size": "482.00 KB",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "id_bus": "ata",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "model": "QEMU DVD-ROM",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "nr_requests": "2",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "partitions": {},
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "path": "/dev/sr0",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "removable": "1",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "rev": "2.5+",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "ro": "0",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "rotational": "1",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "sas_address": "",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "sas_device_handle": "",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "scheduler_mode": "mq-deadline",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "sectors": 0,
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "sectorsize": "2048",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "size": 493568.0,
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "support_discard": "0",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "type": "disk",
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:             "vendor": "QEMU"
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:         }
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]:     }
Dec 02 07:47:21 np0005541913.localdomain trusting_swanson[33555]: ]
Dec 02 07:47:21 np0005541913.localdomain systemd[1]: libpod-9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946.scope: Deactivated successfully.
Dec 02 07:47:21 np0005541913.localdomain podman[33540]: 2025-12-02 07:47:21.749920729 +0000 UTC m=+0.955035040 container died 9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_swanson, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, build-date=2025-11-26T19:44:28Z, vcs-type=git, version=7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 07:47:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-60b3f922765012848beb9047fd6f99789646bd916897a8ada6d5311e94b7a5f7-merged.mount: Deactivated successfully.
Dec 02 07:47:21 np0005541913.localdomain podman[34852]: 2025-12-02 07:47:21.844775444 +0000 UTC m=+0.080912786 container remove 9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_swanson, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, vcs-type=git, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 02 07:47:21 np0005541913.localdomain systemd[1]: libpod-conmon-9cd05049eef7e9afe599ca51ffb7d2e13c1e405ecc519b3b1d1124396e074946.scope: Deactivated successfully.
Dec 02 07:47:21 np0005541913.localdomain sudo[33448]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:23 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 18 pg[1.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=18) [1,5,3] r=2 lpr=18 pi=[16,18)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 07:47:24 np0005541913.localdomain sudo[34866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:47:24 np0005541913.localdomain sudo[34866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:24 np0005541913.localdomain sudo[34866]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:30 np0005541913.localdomain sudo[34881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:47:30 np0005541913.localdomain sudo[34881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:30 np0005541913.localdomain sudo[34881]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:30 np0005541913.localdomain sudo[34896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 07:47:30 np0005541913.localdomain sudo[34896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:31 np0005541913.localdomain systemd[25916]: Starting Mark boot as successful...
Dec 02 07:47:31 np0005541913.localdomain podman[34980]: 2025-12-02 07:47:31.459043425 +0000 UTC m=+0.118087788 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True)
Dec 02 07:47:31 np0005541913.localdomain systemd[25916]: Finished Mark boot as successful.
Dec 02 07:47:31 np0005541913.localdomain podman[34980]: 2025-12-02 07:47:31.594761912 +0000 UTC m=+0.253806285 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.buildah.version=1.41.4, ceph=True, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-type=git, build-date=2025-11-26T19:44:28Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Dec 02 07:47:31 np0005541913.localdomain sudo[34896]: pam_unix(sudo:session): session closed for user root
Dec 02 07:47:32 np0005541913.localdomain sudo[35044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:47:32 np0005541913.localdomain sudo[35044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:47:32 np0005541913.localdomain sudo[35044]: pam_unix(sudo:session): session closed for user root
Dec 02 07:48:32 np0005541913.localdomain sudo[35059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:48:32 np0005541913.localdomain sudo[35059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:48:32 np0005541913.localdomain sudo[35059]: pam_unix(sudo:session): session closed for user root
Dec 02 07:48:32 np0005541913.localdomain sudo[35074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 07:48:32 np0005541913.localdomain sudo[35074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:48:33 np0005541913.localdomain systemd[1]: tmp-crun.T4K8gV.mount: Deactivated successfully.
Dec 02 07:48:33 np0005541913.localdomain podman[35157]: 2025-12-02 07:48:33.435885848 +0000 UTC m=+0.093363104 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, RELEASE=main, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Dec 02 07:48:33 np0005541913.localdomain podman[35157]: 2025-12-02 07:48:33.566335473 +0000 UTC m=+0.223812719 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph)
Dec 02 07:48:33 np0005541913.localdomain sudo[35074]: pam_unix(sudo:session): session closed for user root
Dec 02 07:48:34 np0005541913.localdomain sudo[35226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:48:34 np0005541913.localdomain sudo[35226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:48:34 np0005541913.localdomain sudo[35226]: pam_unix(sudo:session): session closed for user root
Dec 02 07:48:34 np0005541913.localdomain sudo[35241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:48:34 np0005541913.localdomain sudo[35241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:48:34 np0005541913.localdomain sudo[35241]: pam_unix(sudo:session): session closed for user root
Dec 02 07:48:35 np0005541913.localdomain sudo[35287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:48:35 np0005541913.localdomain sudo[35287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:48:35 np0005541913.localdomain sudo[35287]: pam_unix(sudo:session): session closed for user root
Dec 02 07:48:42 np0005541913.localdomain sshd[24416]: Received disconnect from 192.168.122.100 port 60882:11: disconnected by user
Dec 02 07:48:42 np0005541913.localdomain sshd[24416]: Disconnected from user zuul 192.168.122.100 port 60882
Dec 02 07:48:42 np0005541913.localdomain sshd[24413]: pam_unix(sshd:session): session closed for user zuul
Dec 02 07:48:42 np0005541913.localdomain systemd-logind[757]: Session 14 logged out. Waiting for processes to exit.
Dec 02 07:48:42 np0005541913.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Dec 02 07:48:42 np0005541913.localdomain systemd[1]: session-14.scope: Consumed 21.488s CPU time.
Dec 02 07:48:42 np0005541913.localdomain systemd-logind[757]: Removed session 14.
Dec 02 07:49:02 np0005541913.localdomain anacron[18350]: Job `cron.daily' started
Dec 02 07:49:02 np0005541913.localdomain anacron[18350]: Job `cron.daily' terminated
Dec 02 07:49:35 np0005541913.localdomain sudo[35305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:49:35 np0005541913.localdomain sudo[35305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:49:35 np0005541913.localdomain sudo[35305]: pam_unix(sudo:session): session closed for user root
Dec 02 07:49:35 np0005541913.localdomain sudo[35320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:49:35 np0005541913.localdomain sudo[35320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:49:36 np0005541913.localdomain sudo[35320]: pam_unix(sudo:session): session closed for user root
Dec 02 07:49:36 np0005541913.localdomain sudo[35364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:49:36 np0005541913.localdomain sudo[35364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:49:36 np0005541913.localdomain sudo[35364]: pam_unix(sudo:session): session closed for user root
Dec 02 07:50:36 np0005541913.localdomain sudo[35379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:50:36 np0005541913.localdomain sudo[35379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:50:36 np0005541913.localdomain sudo[35379]: pam_unix(sudo:session): session closed for user root
Dec 02 07:50:37 np0005541913.localdomain sudo[35394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:50:37 np0005541913.localdomain sudo[35394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:50:37 np0005541913.localdomain sudo[35394]: pam_unix(sudo:session): session closed for user root
Dec 02 07:50:38 np0005541913.localdomain sudo[35441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:50:38 np0005541913.localdomain sudo[35441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:50:38 np0005541913.localdomain sudo[35441]: pam_unix(sudo:session): session closed for user root
Dec 02 07:51:16 np0005541913.localdomain systemd[25916]: Created slice User Background Tasks Slice.
Dec 02 07:51:16 np0005541913.localdomain systemd[25916]: Starting Cleanup of User's Temporary Files and Directories...
Dec 02 07:51:16 np0005541913.localdomain systemd[25916]: Finished Cleanup of User's Temporary Files and Directories.
Dec 02 07:51:38 np0005541913.localdomain sudo[35457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:51:38 np0005541913.localdomain sudo[35457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:51:38 np0005541913.localdomain sudo[35457]: pam_unix(sudo:session): session closed for user root
Dec 02 07:51:38 np0005541913.localdomain sudo[35472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:51:38 np0005541913.localdomain sudo[35472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:51:39 np0005541913.localdomain sudo[35472]: pam_unix(sudo:session): session closed for user root
Dec 02 07:51:39 np0005541913.localdomain sudo[35519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:51:39 np0005541913.localdomain sudo[35519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:51:39 np0005541913.localdomain sudo[35519]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:06 np0005541913.localdomain sshd[35534]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:52:06 np0005541913.localdomain sshd[35534]: Accepted publickey for zuul from 192.168.122.100 port 45012 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:52:06 np0005541913.localdomain systemd-logind[757]: New session 28 of user zuul.
Dec 02 07:52:06 np0005541913.localdomain systemd[1]: Started Session 28 of User zuul.
Dec 02 07:52:06 np0005541913.localdomain sshd[35534]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 07:52:06 np0005541913.localdomain sudo[35580]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-andtklmawimykwtirkjrqoldegwwjqsp ; /usr/bin/python3
Dec 02 07:52:06 np0005541913.localdomain sudo[35580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:07 np0005541913.localdomain python3[35582]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 02 07:52:07 np0005541913.localdomain sudo[35580]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:07 np0005541913.localdomain sudo[35625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abumkhwgfzksecyzwezapucqexnnpthw ; /usr/bin/python3
Dec 02 07:52:07 np0005541913.localdomain sudo[35625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:07 np0005541913.localdomain python3[35627]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 07:52:07 np0005541913.localdomain sudo[35625]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:08 np0005541913.localdomain sudo[35645]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbfpvqpewjbzvrnzslytnycybrbfzfff ; /usr/bin/python3
Dec 02 07:52:08 np0005541913.localdomain sudo[35645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:08 np0005541913.localdomain python3[35647]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005541913.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 02 07:52:08 np0005541913.localdomain useradd[35649]: new group: name=tripleo-admin, GID=1003
Dec 02 07:52:08 np0005541913.localdomain useradd[35649]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Dec 02 07:52:08 np0005541913.localdomain sudo[35645]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:08 np0005541913.localdomain sudo[35701]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psjawltumhhygkdttnxnvknsnxuhpiet ; /usr/bin/python3
Dec 02 07:52:08 np0005541913.localdomain sudo[35701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:08 np0005541913.localdomain python3[35703]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:52:08 np0005541913.localdomain sudo[35701]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:09 np0005541913.localdomain sudo[35744]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynwxbzwhinkioxftmiczyemktanzfysm ; /usr/bin/python3
Dec 02 07:52:09 np0005541913.localdomain sudo[35744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:09 np0005541913.localdomain python3[35746]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764661928.636323-65475-121280229714750/source _original_basename=tmpgqowd7s2 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:09 np0005541913.localdomain sudo[35744]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:09 np0005541913.localdomain sudo[35774]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcbcyxssvjdjiqlrudqqdqwtxsjbwqpw ; /usr/bin/python3
Dec 02 07:52:09 np0005541913.localdomain sudo[35774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:09 np0005541913.localdomain python3[35776]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:09 np0005541913.localdomain sudo[35774]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:09 np0005541913.localdomain sudo[35790]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xeyowngmukjrdyxtscxxpobzttfqrvtu ; /usr/bin/python3
Dec 02 07:52:09 np0005541913.localdomain sudo[35790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:10 np0005541913.localdomain python3[35792]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:10 np0005541913.localdomain sudo[35790]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:10 np0005541913.localdomain sudo[35806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzcfoifeyukxhainhtjqbbtpxwxbuptu ; /usr/bin/python3
Dec 02 07:52:10 np0005541913.localdomain sudo[35806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:10 np0005541913.localdomain python3[35808]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:10 np0005541913.localdomain sudo[35806]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:11 np0005541913.localdomain sudo[35822]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juoxviapacxozjjmvybxjedjjgqudkki ; /usr/bin/python3
Dec 02 07:52:11 np0005541913.localdomain sudo[35822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 07:52:11 np0005541913.localdomain python3[35824]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCfcGXFPS+XIPHLw+7WTk1crQnJj1F7l/bATNqEM8HqdPREfaSIeF883HXh8Bv+rj9cjcgSPu+200+1SEsq35V+19mPwwkoxgdhfQu8jGk7vv17tL7k61zl9rWne61hn/7PnFptl+SBaMvOq/9ZdnPuMzb1YBTWbKm6kC3RPkgDUOa/BER5PJh1E6x6wYj1wRGMwVREczSSv+66aA5tTRelsFh16OXZXpq4ddoi7OeuimE3lWuMAHorxzJwF5AN+gPTgKYRkMwbMMHU4nPx7TXt5G3zjqWhmos08Xgdl+lPNHY5i463T96l4hGiycZKO4FOCq0ZMzldYkovXnyZi1CjSYUDcEn+EHIRJyZaK9ZJlJ1no5HVdwv1rwVMw4KkpZvH7HBh/iX47Wsi4qxK+L3X5hwZ7s6iSpNWeEMT5CLZsiDCkrdideFnZ8kW2jgnNIV0h+pUPISFfl1j03bjS9fHJjgl4BndVBxRJZJQf8Szyjx5WcIyBUidtYPnHzSLbmk= zuul-build-sshkey
                                                          regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:11 np0005541913.localdomain sudo[35822]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:11 np0005541913.localdomain python3[35838]: ansible-ping Invoked with data=pong
Dec 02 07:52:22 np0005541913.localdomain sshd[35839]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:52:22 np0005541913.localdomain sshd[35839]: Accepted publickey for tripleo-admin from 192.168.122.100 port 57858 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 07:52:22 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 02 07:52:22 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 02 07:52:22 np0005541913.localdomain systemd-logind[757]: New session 29 of user tripleo-admin.
Dec 02 07:52:22 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 02 07:52:22 np0005541913.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Queued start job for default target Main User Target.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Created slice User Application Slice.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Reached target Paths.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Reached target Timers.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Starting D-Bus User Message Bus Socket...
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Starting Create User's Volatile Files and Directories...
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Listening on D-Bus User Message Bus Socket.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Finished Create User's Volatile Files and Directories.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Reached target Sockets.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Reached target Basic System.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Reached target Main User Target.
Dec 02 07:52:22 np0005541913.localdomain systemd[35843]: Startup finished in 136ms.
Dec 02 07:52:22 np0005541913.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 02 07:52:22 np0005541913.localdomain systemd[1]: Started Session 29 of User tripleo-admin.
Dec 02 07:52:22 np0005541913.localdomain sshd[35839]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 02 07:52:23 np0005541913.localdomain sudo[35902]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlofiotlexifbcrcygmwijuoynnjouca ; /usr/bin/python3
Dec 02 07:52:23 np0005541913.localdomain sudo[35902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:23 np0005541913.localdomain python3[35904]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 07:52:23 np0005541913.localdomain sudo[35902]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:28 np0005541913.localdomain sudo[35922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-megtryrpndgnnrsrxlsacdslxjevmqgw ; /usr/bin/python3
Dec 02 07:52:28 np0005541913.localdomain sudo[35922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:28 np0005541913.localdomain python3[35924]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Dec 02 07:52:28 np0005541913.localdomain sudo[35922]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:29 np0005541913.localdomain sudo[35938]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwnjkrshrozrgxcztpubfzjizibkjfeb ; /usr/bin/python3
Dec 02 07:52:29 np0005541913.localdomain sudo[35938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:29 np0005541913.localdomain python3[35940]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Dec 02 07:52:29 np0005541913.localdomain sudo[35938]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:29 np0005541913.localdomain sudo[35986]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjstgwlyoepwwqeajbdbkndilnrkoygf ; /usr/bin/python3
Dec 02 07:52:29 np0005541913.localdomain sudo[35986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:29 np0005541913.localdomain python3[35988]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.b2zqkx1htmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:29 np0005541913.localdomain sudo[35986]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:30 np0005541913.localdomain sudo[36016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upwjpawirmfubeonzzyxwusrgvsczpfo ; /usr/bin/python3
Dec 02 07:52:30 np0005541913.localdomain sudo[36016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:30 np0005541913.localdomain python3[36018]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.b2zqkx1htmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:30 np0005541913.localdomain sudo[36016]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:31 np0005541913.localdomain sudo[36032]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bassrxwmtibsgwjhzsvhssyyvsxwfvhy ; /usr/bin/python3
Dec 02 07:52:31 np0005541913.localdomain sudo[36032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:31 np0005541913.localdomain python3[36034]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.b2zqkx1htmphosts insertbefore=BOF block=172.17.0.106 np0005541912.localdomain np0005541912
                                                         172.18.0.106 np0005541912.storage.localdomain np0005541912.storage
                                                         172.20.0.106 np0005541912.storagemgmt.localdomain np0005541912.storagemgmt
                                                         172.17.0.106 np0005541912.internalapi.localdomain np0005541912.internalapi
                                                         172.19.0.106 np0005541912.tenant.localdomain np0005541912.tenant
                                                         192.168.122.106 np0005541912.ctlplane.localdomain np0005541912.ctlplane
                                                         172.17.0.107 np0005541913.localdomain np0005541913
                                                         172.18.0.107 np0005541913.storage.localdomain np0005541913.storage
                                                         172.20.0.107 np0005541913.storagemgmt.localdomain np0005541913.storagemgmt
                                                         172.17.0.107 np0005541913.internalapi.localdomain np0005541913.internalapi
                                                         172.19.0.107 np0005541913.tenant.localdomain np0005541913.tenant
                                                         192.168.122.107 np0005541913.ctlplane.localdomain np0005541913.ctlplane
                                                         172.17.0.108 np0005541914.localdomain np0005541914
                                                         172.18.0.108 np0005541914.storage.localdomain np0005541914.storage
                                                         172.20.0.108 np0005541914.storagemgmt.localdomain np0005541914.storagemgmt
                                                         172.17.0.108 np0005541914.internalapi.localdomain np0005541914.internalapi
                                                         172.19.0.108 np0005541914.tenant.localdomain np0005541914.tenant
                                                         192.168.122.108 np0005541914.ctlplane.localdomain np0005541914.ctlplane
                                                         172.17.0.103 np0005541909.localdomain np0005541909
                                                         172.18.0.103 np0005541909.storage.localdomain np0005541909.storage
                                                         172.20.0.103 np0005541909.storagemgmt.localdomain np0005541909.storagemgmt
                                                         172.17.0.103 np0005541909.internalapi.localdomain np0005541909.internalapi
                                                         172.19.0.103 np0005541909.tenant.localdomain np0005541909.tenant
                                                         192.168.122.103 np0005541909.ctlplane.localdomain np0005541909.ctlplane
                                                         172.17.0.104 np0005541910.localdomain np0005541910
                                                         172.18.0.104 np0005541910.storage.localdomain np0005541910.storage
                                                         172.20.0.104 np0005541910.storagemgmt.localdomain np0005541910.storagemgmt
                                                         172.17.0.104 np0005541910.internalapi.localdomain np0005541910.internalapi
                                                         172.19.0.104 np0005541910.tenant.localdomain np0005541910.tenant
                                                         192.168.122.104 np0005541910.ctlplane.localdomain np0005541910.ctlplane
                                                         172.17.0.105 np0005541911.localdomain np0005541911
                                                         172.18.0.105 np0005541911.storage.localdomain np0005541911.storage
                                                         172.20.0.105 np0005541911.storagemgmt.localdomain np0005541911.storagemgmt
                                                         172.17.0.105 np0005541911.internalapi.localdomain np0005541911.internalapi
                                                         172.19.0.105 np0005541911.tenant.localdomain np0005541911.tenant
                                                         192.168.122.105 np0005541911.ctlplane.localdomain np0005541911.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.121  overcloud.storage.localdomain
                                                         172.20.0.222  overcloud.storagemgmt.localdomain
                                                         172.17.0.136  overcloud.internalapi.localdomain
                                                         172.21.0.241  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:31 np0005541913.localdomain sudo[36032]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:31 np0005541913.localdomain sudo[36049]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxcqbktqjipcxefnguffpnksbwwbbmjk ; /usr/bin/python3
Dec 02 07:52:31 np0005541913.localdomain sudo[36049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:31 np0005541913.localdomain python3[36051]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.b2zqkx1htmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:52:31 np0005541913.localdomain sudo[36049]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:32 np0005541913.localdomain sudo[36066]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awkgerkgoldviifyodptnqpxoekxjbwz ; /usr/bin/python3
Dec 02 07:52:32 np0005541913.localdomain sudo[36066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:32 np0005541913.localdomain python3[36068]: ansible-file Invoked with path=/tmp/ansible.b2zqkx1htmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:52:32 np0005541913.localdomain sudo[36066]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:33 np0005541913.localdomain sudo[36082]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sozlegxxcakiscoommoeoxinlijjoayk ; /usr/bin/python3
Dec 02 07:52:33 np0005541913.localdomain sudo[36082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:33 np0005541913.localdomain python3[36084]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:52:33 np0005541913.localdomain sudo[36082]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:33 np0005541913.localdomain sudo[36099]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqxpobfdgdqhklneaweughfkuuverdlf ; /usr/bin/python3
Dec 02 07:52:33 np0005541913.localdomain sudo[36099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:34 np0005541913.localdomain python3[36101]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:52:37 np0005541913.localdomain sudo[36099]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:38 np0005541913.localdomain sudo[36118]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmppdlqfdlmxydcldnqlmgwafhodtirc ; /usr/bin/python3
Dec 02 07:52:38 np0005541913.localdomain sudo[36118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:38 np0005541913.localdomain python3[36120]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:52:38 np0005541913.localdomain sudo[36118]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:38 np0005541913.localdomain sudo[36135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oeskygdzvossqmcpxnwscjxsxebfjbux ; /usr/bin/python3
Dec 02 07:52:38 np0005541913.localdomain sudo[36135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:52:39 np0005541913.localdomain python3[36137]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:52:39 np0005541913.localdomain sudo[36139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:52:39 np0005541913.localdomain sudo[36139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:52:39 np0005541913.localdomain sudo[36139]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:39 np0005541913.localdomain sudo[36154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:52:39 np0005541913.localdomain sudo[36154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:52:40 np0005541913.localdomain sudo[36154]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:41 np0005541913.localdomain sudo[36201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:52:41 np0005541913.localdomain sudo[36201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:52:41 np0005541913.localdomain sudo[36201]: pam_unix(sudo:session): session closed for user root
Dec 02 07:52:55 np0005541913.localdomain groupadd[36386]: group added to /etc/group: name=puppet, GID=52
Dec 02 07:52:55 np0005541913.localdomain groupadd[36386]: group added to /etc/gshadow: name=puppet
Dec 02 07:52:55 np0005541913.localdomain groupadd[36386]: new group: name=puppet, GID=52
Dec 02 07:52:55 np0005541913.localdomain useradd[36393]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Dec 02 07:53:41 np0005541913.localdomain sudo[36965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:53:41 np0005541913.localdomain sudo[36965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:53:41 np0005541913.localdomain sudo[36965]: pam_unix(sudo:session): session closed for user root
Dec 02 07:53:41 np0005541913.localdomain sudo[36980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:53:41 np0005541913.localdomain sudo[36980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:53:42 np0005541913.localdomain sudo[36980]: pam_unix(sudo:session): session closed for user root
Dec 02 07:53:42 np0005541913.localdomain sudo[37031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:53:42 np0005541913.localdomain sudo[37031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:53:42 np0005541913.localdomain sudo[37031]: pam_unix(sudo:session): session closed for user root
Dec 02 07:53:53 np0005541913.localdomain kernel: SELinux:  Converting 2700 SID table entries...
Dec 02 07:53:53 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 07:53:53 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 07:53:53 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 07:53:53 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 07:53:53 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 07:53:53 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 07:53:53 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 07:53:53 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 02 07:53:53 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:53:53 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 07:53:53 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:53:53 np0005541913.localdomain systemd-rc-local-generator[37209]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:53:53 np0005541913.localdomain systemd-sysv-generator[37213]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:53:53 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:53:54 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 07:53:54 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 07:53:54 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 07:53:54 np0005541913.localdomain systemd[1]: run-r9bfe09271b094052ac972abeb413283b.service: Deactivated successfully.
Dec 02 07:53:55 np0005541913.localdomain sudo[36135]: pam_unix(sudo:session): session closed for user root
Dec 02 07:53:55 np0005541913.localdomain sudo[37642]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaovajhhvawtuilmzcuafagmjbmveyif ; /usr/bin/python3
Dec 02 07:53:55 np0005541913.localdomain sudo[37642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:53:55 np0005541913.localdomain python3[37644]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:53:56 np0005541913.localdomain sudo[37642]: pam_unix(sudo:session): session closed for user root
Dec 02 07:53:56 np0005541913.localdomain sudo[37781]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiaysggkplhrrbkcnmyfjltaydejywwk ; /usr/bin/python3
Dec 02 07:53:56 np0005541913.localdomain sudo[37781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:53:57 np0005541913.localdomain python3[37783]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:53:57 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:53:57 np0005541913.localdomain systemd-rc-local-generator[37813]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:53:57 np0005541913.localdomain systemd-sysv-generator[37817]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:53:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:53:57 np0005541913.localdomain sudo[37781]: pam_unix(sudo:session): session closed for user root
Dec 02 07:53:57 np0005541913.localdomain sudo[37835]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txtchjaomyycsueceqcdyqatgewhhisr ; /usr/bin/python3
Dec 02 07:53:57 np0005541913.localdomain sudo[37835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:53:57 np0005541913.localdomain python3[37837]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:53:57 np0005541913.localdomain sudo[37835]: pam_unix(sudo:session): session closed for user root
Dec 02 07:53:58 np0005541913.localdomain sudo[37851]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxwudrotbxqnjjcockxscafrrycramfy ; /usr/bin/python3
Dec 02 07:53:58 np0005541913.localdomain sudo[37851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:53:58 np0005541913.localdomain python3[37853]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:53:58 np0005541913.localdomain sudo[37851]: pam_unix(sudo:session): session closed for user root
Dec 02 07:53:59 np0005541913.localdomain sudo[37868]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kunfhorkudlgrdqnpcscuigbxesyvwbm ; /usr/bin/python3
Dec 02 07:53:59 np0005541913.localdomain sudo[37868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:53:59 np0005541913.localdomain python3[37870]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 07:53:59 np0005541913.localdomain sudo[37868]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:01 np0005541913.localdomain sudo[37886]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbejviezeacrferxsnhtcqhbfqrflrlz ; /usr/bin/python3
Dec 02 07:54:01 np0005541913.localdomain sudo[37886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:01 np0005541913.localdomain python3[37888]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:01 np0005541913.localdomain sudo[37886]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:01 np0005541913.localdomain sudo[37904]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhtwyaitmcqvwunematwvuhvpiaipjvo ; /usr/bin/python3
Dec 02 07:54:01 np0005541913.localdomain sudo[37904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:01 np0005541913.localdomain python3[37906]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:01 np0005541913.localdomain sudo[37904]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:02 np0005541913.localdomain sudo[37922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqjzdgehwtjypfrdnsbleodwfypnndnf ; /usr/bin/python3
Dec 02 07:54:02 np0005541913.localdomain sudo[37922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:02 np0005541913.localdomain python3[37924]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 07:54:02 np0005541913.localdomain systemd[1]: Reloading Network Manager...
Dec 02 07:54:02 np0005541913.localdomain NetworkManager[5965]: <info>  [1764662042.4569] audit: op="reload" arg="0" pid=37927 uid=0 result="success"
Dec 02 07:54:02 np0005541913.localdomain NetworkManager[5965]: <info>  [1764662042.4581] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Dec 02 07:54:02 np0005541913.localdomain NetworkManager[5965]: <info>  [1764662042.4582] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Dec 02 07:54:02 np0005541913.localdomain systemd[1]: Reloaded Network Manager.
Dec 02 07:54:02 np0005541913.localdomain sudo[37922]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:02 np0005541913.localdomain sudo[37941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abkpnxuudujyimhphjmhoyihwicfhhgm ; /usr/bin/python3
Dec 02 07:54:02 np0005541913.localdomain sudo[37941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:02 np0005541913.localdomain python3[37943]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:02 np0005541913.localdomain sudo[37941]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:04 np0005541913.localdomain sudo[37958]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjbqsewliafnelozzychonqqzmxhbeor ; /usr/bin/python3
Dec 02 07:54:04 np0005541913.localdomain sudo[37958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:04 np0005541913.localdomain python3[37960]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:54:04 np0005541913.localdomain sudo[37958]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:04 np0005541913.localdomain sudo[37976]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrhfsuwcjbqwiufappypftkhsizjmmot ; /usr/bin/python3
Dec 02 07:54:04 np0005541913.localdomain sudo[37976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:04 np0005541913.localdomain python3[37978]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:54:04 np0005541913.localdomain sudo[37976]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:04 np0005541913.localdomain sudo[37992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpacejewmqysdxtnirvxyclztkxiamhi ; /usr/bin/python3
Dec 02 07:54:04 np0005541913.localdomain sudo[37992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:05 np0005541913.localdomain python3[37994]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:05 np0005541913.localdomain sudo[37992]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:06 np0005541913.localdomain sudo[38008]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edbdeuhvngogdfmkxhmbsbhdccozmdci ; /usr/bin/python3
Dec 02 07:54:06 np0005541913.localdomain sudo[38008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:06 np0005541913.localdomain python3[38010]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 02 07:54:06 np0005541913.localdomain sudo[38008]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:06 np0005541913.localdomain sudo[38024]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kevvkyyxukozfqjbzoqwturbqfooosxd ; /usr/bin/python3
Dec 02 07:54:06 np0005541913.localdomain sudo[38024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:06 np0005541913.localdomain python3[38026]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:54:06 np0005541913.localdomain sudo[38024]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:07 np0005541913.localdomain sudo[38040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyoqqvvsxibhirwwohwhfjjpyiziwelh ; /usr/bin/python3
Dec 02 07:54:07 np0005541913.localdomain sudo[38040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:07 np0005541913.localdomain python3[38042]: ansible-blockinfile Invoked with path=/tmp/ansible.wl_m9jvj block=[192.168.122.106]*,[np0005541912.ctlplane.localdomain]*,[172.17.0.106]*,[np0005541912.internalapi.localdomain]*,[172.18.0.106]*,[np0005541912.storage.localdomain]*,[172.20.0.106]*,[np0005541912.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005541912.tenant.localdomain]*,[np0005541912.localdomain]*,[np0005541912]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDKgyHtHHKWFdaOqx5AsvOJPmNsbjVxvzh05A7Hy02rgbdg4zBUd/E0mqG+tYVGg12fIdbRNgjUfM+PEGJznZdEQnZCtLgMhbpRC33IbCXMw7Ev/tRfkffpP+H8VdyGL83zCFFnMIMD2IDWU+MjTf/ais63Zv/UiBL24pkZ18u3nypjN3uN2FdeDF4JNtnSVK6i1a+wE6wLmdSAfX8ovFbLhZMgAAPU3I3Fu5D/pSa6OjKshEcNy0m6KCKwQoT6cbDGsnMjd2sdE1Vc+KgkrBN3fMmrChdgi2Ig7CpkdGvQF0G/t53cwNatjp78FrNCHjpLcIAFw3QgfepiTiXQbXQ/jC5xkdM+5wIcSmB3rf3GKaUgaxnjk55GAXxrHwAFwOi+ltxSNPszH9vfIBLluThUdmQmvtCOCvEFZ5uuVuu94A5frS9BzOIzz7ylrqau3nHGaPjbT80XubnqZsHlOahsovbk1mu3ewvoitAVb0E+BBroNWeHT9BbA8Igh+sxwGM=
                                                         [192.168.122.107]*,[np0005541913.ctlplane.localdomain]*,[172.17.0.107]*,[np0005541913.internalapi.localdomain]*,[172.18.0.107]*,[np0005541913.storage.localdomain]*,[172.20.0.107]*,[np0005541913.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005541913.tenant.localdomain]*,[np0005541913.localdomain]*,[np0005541913]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDYXeXWwxJkeR9i2V9hYiVGqEGSbkwFIKUbTm3m8em9m5o380jUORSYXOITLm0CAl/waSYEc4fiPu2sAYDISig1zqAItfAODEdayFoKK63ui7vq92ZPKayhmjahj2jNo3KMAZ5aFzNBcowsRooRqLNJ7R9BAQ4H8kdqL9xdRjy5bvfWJHGrm8PvWcUaRYebCQ35j+7nHq4RFRYsd964NKjrq+FxkjyOSs2AxE+SHYOVgAAd8Jp2uyr3dR56IzWy8WqQzPj6tlsER8+/Kt1lASATcuMFeteA0M7tbjZxEIAPyfktPVQOq9mgeFOFmTf8oTbt94Rk2QmyNI4oE7sQHFWo9UWrvZd9LpDDartUls5uHunn4SzvgvtRimO3e1hNXn0VQLGNfSUwGij0R3iOYJpACHgly3J7sbX3tROvwRpawZlGIGZY46vaYRMXGClXz+lUCa6ZZO+f6BX6bEt0VfYWX8IVmnH2oJXEJBYJPVXZML+OcczJc8zEfHxBylpZn4k=
                                                         [192.168.122.108]*,[np0005541914.ctlplane.localdomain]*,[172.17.0.108]*,[np0005541914.internalapi.localdomain]*,[172.18.0.108]*,[np0005541914.storage.localdomain]*,[172.20.0.108]*,[np0005541914.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005541914.tenant.localdomain]*,[np0005541914.localdomain]*,[np0005541914]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCHh7115UF/t7QzqWY1fk2wHPOuHuMPRhaYTC/yfMWr+nqJ5/TNZTuFxq0aW/1gHanB2usmC0wpWf4c1KsPZ71Ehs/j5nV1wfGtNVEq5Zj7uhs0ea/SQToF2RS406RoIzJW6ogv4Kl3nxGEK6c44WCu8+Ki98dCQ4wesh5kSBkqgiSq2IZkL2gjoAKeXdracGRJ596gTB0yfsMl/qdJDneVHMq/rptlFhabLeiEN+7C0o0gsZwYsxCd2oSB+DD9KfXhWIBeXRr1B7mFcMZpGNG7pG0d1IjYOUmqjvVpECHrLvjiitS3800ZEFwygU4sbM/DWHelobjtJB/fxxPTtGNlbH4MK/OGFh2mm5jB1LMqWSsifA/ZAHASAAffWDwKtF+xJ06OHRDT6gjzOd7VJpc8kR9Jn9pT7UnjypnrM12GtrO0CH8Lf3rin71kf9iZRIphqWXhiLN3G/mdJC2XPIxJp7NQ1Mqc5IhHciCv80bvsGrzLCtAr16/b+cPYo7vIGU=
                                                         [192.168.122.103]*,[np0005541909.ctlplane.localdomain]*,[172.17.0.103]*,[np0005541909.internalapi.localdomain]*,[172.18.0.103]*,[np0005541909.storage.localdomain]*,[172.20.0.103]*,[np0005541909.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005541909.tenant.localdomain]*,[np0005541909.localdomain]*,[np0005541909]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0b4xecJ9cZa0s7FCPYSs6kLrfHyBh8YL/KS+tj3DrfUU03KCcmbHQesHBBcRxB6PDYjueAsvx5rGXzjMojO5Jz2DlZoSPaBM9tm/HAKWhaiL+seTfrRsNLFvxfWyxU/x0FUSOTf01ZThrT/IJ5WkfJD4UgZQSzUPucffImwFt4y2oERfa96sAwSwE4o5RuLzRdKuWB3npxcApj2/3+pyWR59yubokMiU506MI37Hbg8xCaC5qn4ISKB8WBJObICoNQoatrbcqSOrrUEFv/vcWANDYUEw6XzTTwkuIu6dJPJiJh8j5TzDnnvKSK+f3eEG7OCiz814F+o82tDo7U6k5ERO0xmElXdOlPYsiuM5+CTQmmm6xmFN2L3HIvZlyPn3oF26oV+INAd3XsF5MIFcfpGUXH5b04gE7LhpdVLVfLGGYSVWjZhzxl/Wa0OiHoMaDUYoN2bPG0h5SPUDIyDv2jW3FDxhOWANR/9ITUCQpz3gSwl/1AVN3HCWf+RUeLuE=
                                                         [192.168.122.104]*,[np0005541910.ctlplane.localdomain]*,[172.17.0.104]*,[np0005541910.internalapi.localdomain]*,[172.18.0.104]*,[np0005541910.storage.localdomain]*,[172.20.0.104]*,[np0005541910.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005541910.tenant.localdomain]*,[np0005541910.localdomain]*,[np0005541910]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDOmh2HMG9Y5+9VA8Ap3pHIOQhG/GfAsIqnmfJJuGwKb8N2T9r1Yd+kmoP7Xs41cto4h6Fw1f4Pa6Tw050y3LmwpXvDN+2Qq1qYI0rT4pqOiYBkyMbOQhqLF5tA+MNYGdibQj/fWkG+gKa8wwzkTgCEAn6PgEZiqR9LFJrqr4RfQDxaWCLmXM96+AVGG5/SXWx5u6T3lanUnpcfISvB2yx4HifsINAHPgLR4weEzra/b7e0QNyxItxvlDseasPyeYHD3Hdi2PNuUmoZC+zWEoWoU3BMAQeXR7lmEcdtyK5wr0pIBmf0CKFdvGrdVWrzAUbDc8ZHXmWyKlWHHZvHch1V2r/S4J2983UsG3sJwM8954Tj325LgS1nldIYBSjwMGfhZFYzmy9obAN7ZSV5qwD0h+rxt/I9RNdXS3SRu9tOZI+AN59De44cF23OJS5MfrfnB7JUnBOv4ScVML4rPjPx9L4/omOlfbBVJx42b1RlboXEk52J7Aa3xRseA4Elvuk=
                                                         [192.168.122.105]*,[np0005541911.ctlplane.localdomain]*,[172.17.0.105]*,[np0005541911.internalapi.localdomain]*,[172.18.0.105]*,[np0005541911.storage.localdomain]*,[172.20.0.105]*,[np0005541911.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005541911.tenant.localdomain]*,[np0005541911.localdomain]*,[np0005541911]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzI5YTDMvj8zBlKqeNplIMBQQJ43gcDfB5cRE7DwwpHBRcqOuhSoIm7r0C3h5ABQJYkTXEGRY0i5HC5eMErD7SKRJJ3q9aZ+uv4VvUGagr7M9S/JGUjZej2+ACXZ7L+d9MLt389xVtIuuNh5Cy3U8muIBEAS1b4mXOJ95eiW3M5b2hxmol0DTjUMX/bLtJU/MQ09wE72pj6Uqz/CCFsUwDBZlQ3jcVK74fYwgItCNkLJ+D2E4wTl4Ei8XOlEY9cV8B1E+aK6iUKesiya0Vfi/Ant77ONQDeCsI21AJDbi5wtUXg4qXBu3Z/zObZiEmedzqWj7K46Nv8lDlQoeoKuxzTCwxgn0PaorQgkUvUdAyk5Qo4BaUOv8ojICiZvRy9QZ3jblr1dCM/Jy3g4Sz6Hz4QHxtV21nUw//sBN2X6jCHQVGTJeZrbVvgGNcGiqcCzQTW/4NoiOB0ho7RVNtD+oYb5UE+Lh+Ibua3bv7zfnLjsw1GiyclsCgrQTKBl8Netc=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:07 np0005541913.localdomain sudo[38040]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:07 np0005541913.localdomain sudo[38056]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyqztyqfjdnnxdiutmzrmajwnvomwofw ; /usr/bin/python3
Dec 02 07:54:07 np0005541913.localdomain sudo[38056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:07 np0005541913.localdomain python3[38058]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.wl_m9jvj' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:07 np0005541913.localdomain sudo[38056]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:08 np0005541913.localdomain sudo[38074]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmgiwbrpmeblgfuayyopmwzkakekmpkz ; /usr/bin/python3
Dec 02 07:54:08 np0005541913.localdomain sudo[38074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:08 np0005541913.localdomain python3[38076]: ansible-file Invoked with path=/tmp/ansible.wl_m9jvj state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:08 np0005541913.localdomain sudo[38074]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:08 np0005541913.localdomain sudo[38090]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efqmdhngonmqayfhpbimwxxbbchvovib ; /usr/bin/python3
Dec 02 07:54:08 np0005541913.localdomain sudo[38090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:09 np0005541913.localdomain python3[38092]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:54:09 np0005541913.localdomain sudo[38090]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:09 np0005541913.localdomain sudo[38106]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubvqlatgqlfnlyxrrfuxrtilsvdvwjzm ; /usr/bin/python3
Dec 02 07:54:09 np0005541913.localdomain sudo[38106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:09 np0005541913.localdomain python3[38108]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:09 np0005541913.localdomain sudo[38106]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:09 np0005541913.localdomain sudo[38124]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdkrnrajwetnwixhlzpqcqsmruynvhrr ; /usr/bin/python3
Dec 02 07:54:09 np0005541913.localdomain sudo[38124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:09 np0005541913.localdomain python3[38126]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:09 np0005541913.localdomain sudo[38124]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:10 np0005541913.localdomain sudo[38143]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrdojwprrblywwxuozkrzxotkwsbxghv ; /usr/bin/python3
Dec 02 07:54:10 np0005541913.localdomain sudo[38143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:10 np0005541913.localdomain python3[38145]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Dec 02 07:54:10 np0005541913.localdomain sudo[38143]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:10 np0005541913.localdomain sudo[38159]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlyedcsllnkuamidwpmwsgdeskyihrxo ; /usr/bin/python3
Dec 02 07:54:10 np0005541913.localdomain sudo[38159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:10 np0005541913.localdomain sudo[38159]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:10 np0005541913.localdomain sudo[38207]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shagybnhoqalahikgiivjszapxmupknq ; /usr/bin/python3
Dec 02 07:54:10 np0005541913.localdomain sudo[38207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:11 np0005541913.localdomain sudo[38207]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:11 np0005541913.localdomain sudo[38250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcltsbxsnkoamxopknfpxlwlrcmuchjo ; /usr/bin/python3
Dec 02 07:54:11 np0005541913.localdomain sudo[38250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:11 np0005541913.localdomain sudo[38250]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:12 np0005541913.localdomain sudo[38280]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wddrykdkzzbetuivxdszwyzacynuzhhg ; /usr/bin/python3
Dec 02 07:54:12 np0005541913.localdomain sudo[38280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:12 np0005541913.localdomain python3[38282]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:12 np0005541913.localdomain sudo[38280]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:12 np0005541913.localdomain sudo[38297]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqtahikbzcunsxgbxmcigdhkxpwclezs ; /usr/bin/python3
Dec 02 07:54:12 np0005541913.localdomain sudo[38297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:13 np0005541913.localdomain python3[38299]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:54:16 np0005541913.localdomain dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Dec 02 07:54:16 np0005541913.localdomain dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Dec 02 07:54:16 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:54:17 np0005541913.localdomain systemd-rc-local-generator[38368]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:54:17 np0005541913.localdomain systemd-sysv-generator[38374]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: tuned.service: Consumed 2.317s CPU time.
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 07:54:17 np0005541913.localdomain systemd[1]: run-r0c64b10bcab6453d9b822e7452915af9.service: Deactivated successfully.
Dec 02 07:54:18 np0005541913.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 02 07:54:18 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:54:18 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 07:54:18 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 07:54:18 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 07:54:18 np0005541913.localdomain systemd[1]: run-rd9c8c7bcc4064a7c8926f3cb3baa1060.service: Deactivated successfully.
Dec 02 07:54:19 np0005541913.localdomain sudo[38297]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:19 np0005541913.localdomain sudo[38733]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwnrgplgwpzlsvhhbhbqksyltqyptxqp ; /usr/bin/python3
Dec 02 07:54:19 np0005541913.localdomain sudo[38733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:19 np0005541913.localdomain python3[38735]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:54:19 np0005541913.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 02 07:54:20 np0005541913.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 02 07:54:20 np0005541913.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 02 07:54:20 np0005541913.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 02 07:54:21 np0005541913.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 02 07:54:21 np0005541913.localdomain sudo[38733]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:21 np0005541913.localdomain sudo[38928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bevehfhhtqsrqdqkqwwtioafgfbovmvv ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 02 07:54:21 np0005541913.localdomain sudo[38928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:21 np0005541913.localdomain python3[38930]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:21 np0005541913.localdomain sudo[38928]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:22 np0005541913.localdomain sudo[38945]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tehzaxqqkmmxkclgedlffospbvvmbrgv ; /usr/bin/python3
Dec 02 07:54:22 np0005541913.localdomain sudo[38945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:22 np0005541913.localdomain python3[38947]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Dec 02 07:54:22 np0005541913.localdomain sudo[38945]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:22 np0005541913.localdomain sudo[38961]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neigyedmgzcnrpxqkcddasmyaykxjvlm ; /usr/bin/python3
Dec 02 07:54:22 np0005541913.localdomain sudo[38961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:22 np0005541913.localdomain python3[38963]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:54:22 np0005541913.localdomain sudo[38961]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:23 np0005541913.localdomain sudo[38977]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drwnkoqpgxerealujcvrrgeibmzbqghk ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 02 07:54:23 np0005541913.localdomain sudo[38977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:23 np0005541913.localdomain python3[38979]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:24 np0005541913.localdomain sudo[38977]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:24 np0005541913.localdomain sudo[38997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwovzxyofcwsnadtcbxnozmdbvtwuklv ; /usr/bin/python3
Dec 02 07:54:24 np0005541913.localdomain sudo[38997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:25 np0005541913.localdomain python3[38999]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:25 np0005541913.localdomain sudo[38997]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:25 np0005541913.localdomain sudo[39014]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyzmcxhczeyovluaqzkgxzwohpbvprro ; /usr/bin/python3
Dec 02 07:54:25 np0005541913.localdomain sudo[39014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:25 np0005541913.localdomain python3[39016]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:54:25 np0005541913.localdomain sudo[39014]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:28 np0005541913.localdomain sudo[39030]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzjuvsennqunqbhatwwoqxuisrxptneq ; /usr/bin/python3
Dec 02 07:54:28 np0005541913.localdomain sudo[39030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:28 np0005541913.localdomain python3[39032]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:28 np0005541913.localdomain sudo[39030]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:32 np0005541913.localdomain sudo[39046]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arlzoycxekaypfonecmbdiirkuvsqver ; /usr/bin/python3
Dec 02 07:54:32 np0005541913.localdomain sudo[39046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:33 np0005541913.localdomain python3[39048]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:33 np0005541913.localdomain sudo[39046]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:33 np0005541913.localdomain sudo[39094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnowjimzkddekvvecpqbpzfkdkyewrid ; /usr/bin/python3
Dec 02 07:54:33 np0005541913.localdomain sudo[39094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:33 np0005541913.localdomain python3[39096]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:33 np0005541913.localdomain sudo[39094]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:33 np0005541913.localdomain sudo[39139]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-appduxolrlklkjwaghrwobdnqugbxtbm ; /usr/bin/python3
Dec 02 07:54:33 np0005541913.localdomain sudo[39139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:33 np0005541913.localdomain python3[39141]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662073.255895-70027-258494704750880/source _original_basename=tmpqyal53v0 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:33 np0005541913.localdomain sudo[39139]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:34 np0005541913.localdomain sudo[39169]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gahjhxlrfisnyhrvbcumhumygppipehc ; /usr/bin/python3
Dec 02 07:54:34 np0005541913.localdomain sudo[39169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:34 np0005541913.localdomain python3[39171]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:34 np0005541913.localdomain sudo[39169]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:34 np0005541913.localdomain sudo[39217]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxnbvzlvbefysakuytnmhdyhgejtshgb ; /usr/bin/python3
Dec 02 07:54:34 np0005541913.localdomain sudo[39217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:34 np0005541913.localdomain python3[39219]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:34 np0005541913.localdomain sudo[39217]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:34 np0005541913.localdomain sudo[39260]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmewuonfhxhhivjgtwugxdrhwsgtohtg ; /usr/bin/python3
Dec 02 07:54:34 np0005541913.localdomain sudo[39260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:35 np0005541913.localdomain python3[39262]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662074.462108-70101-256517337408385/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=303a9e8dd06eeb9157c66bb31355109aa4c872ae backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:35 np0005541913.localdomain sudo[39260]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:35 np0005541913.localdomain sudo[39322]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlepercnopaivupdqdaihayfemfbhgqm ; /usr/bin/python3
Dec 02 07:54:35 np0005541913.localdomain sudo[39322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:35 np0005541913.localdomain python3[39324]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:35 np0005541913.localdomain sudo[39322]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:35 np0005541913.localdomain sudo[39365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpzkemwivaqhflccxvuwnjvdmfhcmduv ; /usr/bin/python3
Dec 02 07:54:35 np0005541913.localdomain sudo[39365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:35 np0005541913.localdomain python3[39367]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662075.3088977-70338-118411139602545/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=da1c3b8584bf2231cac158ee0d91c3ea69fbb742 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:35 np0005541913.localdomain sudo[39365]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:36 np0005541913.localdomain sudo[39427]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwlgqlwjgcftavcvnqpvawfejxvesclw ; /usr/bin/python3
Dec 02 07:54:36 np0005541913.localdomain sudo[39427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:36 np0005541913.localdomain python3[39429]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:36 np0005541913.localdomain sudo[39427]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:36 np0005541913.localdomain sudo[39470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btfjyymdbvmklabrdczmmmlrebnzfiqz ; /usr/bin/python3
Dec 02 07:54:36 np0005541913.localdomain sudo[39470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:36 np0005541913.localdomain python3[39472]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662076.0728672-70338-200263392274181/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=cefd5bd69caea640bd56356af0b9c6878752d6a2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:36 np0005541913.localdomain sudo[39470]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:37 np0005541913.localdomain sudo[39532]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohsdxnxyesyudtwgtxjxcbhnvkfjiloe ; /usr/bin/python3
Dec 02 07:54:37 np0005541913.localdomain sudo[39532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:37 np0005541913.localdomain python3[39534]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:37 np0005541913.localdomain sudo[39532]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:37 np0005541913.localdomain sudo[39575]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlgjcmtkddgvjmupoyhyejvpsvewuerc ; /usr/bin/python3
Dec 02 07:54:37 np0005541913.localdomain sudo[39575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:37 np0005541913.localdomain python3[39577]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662076.8803103-70338-74319233140249/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=175c760950d63a47f443f25b58088dba962f090b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:37 np0005541913.localdomain sudo[39575]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:37 np0005541913.localdomain sudo[39637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbcmkhgeuuodvygaoyasapaojxcvpsdr ; /usr/bin/python3
Dec 02 07:54:37 np0005541913.localdomain sudo[39637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:38 np0005541913.localdomain python3[39639]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:38 np0005541913.localdomain sudo[39637]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:38 np0005541913.localdomain sudo[39680]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niaxrzefcxdafmzfjllwyjhcmvzxoqtc ; /usr/bin/python3
Dec 02 07:54:38 np0005541913.localdomain sudo[39680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:38 np0005541913.localdomain python3[39682]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662077.7541177-70338-239947092356002/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:38 np0005541913.localdomain sudo[39680]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:38 np0005541913.localdomain sudo[39742]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbmtauedcgganmroqhrwlxjffzcznlxt ; /usr/bin/python3
Dec 02 07:54:38 np0005541913.localdomain sudo[39742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:38 np0005541913.localdomain python3[39744]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:38 np0005541913.localdomain sudo[39742]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:39 np0005541913.localdomain sudo[39785]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sludjcusyqryotocshymbycsstqhgwzx ; /usr/bin/python3
Dec 02 07:54:39 np0005541913.localdomain sudo[39785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:39 np0005541913.localdomain python3[39787]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662078.5195305-70338-224258781882864/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=ee812c4410e77888a2aa029c6a63e712c30d05b7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:39 np0005541913.localdomain sudo[39785]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:39 np0005541913.localdomain sudo[39847]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmpktykkqxhpqfimddzltxayqaebshem ; /usr/bin/python3
Dec 02 07:54:39 np0005541913.localdomain sudo[39847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:39 np0005541913.localdomain python3[39849]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:39 np0005541913.localdomain sudo[39847]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:39 np0005541913.localdomain sudo[39890]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmnuyayxzfihbaqeyynfdwsjreeuzmtf ; /usr/bin/python3
Dec 02 07:54:39 np0005541913.localdomain sudo[39890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:40 np0005541913.localdomain python3[39892]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662079.3448932-70338-67530307605258/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:40 np0005541913.localdomain sudo[39890]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:40 np0005541913.localdomain sudo[39952]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiohnhergivpgkzrptfldlheficwbaaw ; /usr/bin/python3
Dec 02 07:54:40 np0005541913.localdomain sudo[39952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:40 np0005541913.localdomain python3[39954]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:40 np0005541913.localdomain sudo[39952]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:40 np0005541913.localdomain sudo[39995]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnycshiriywtxshfswanncpmakygayok ; /usr/bin/python3
Dec 02 07:54:40 np0005541913.localdomain sudo[39995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:40 np0005541913.localdomain python3[39997]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662080.2452297-70338-90429950494491/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=c605747c28ed219c21bc7a334ba3c66112b9a2b8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:40 np0005541913.localdomain sudo[39995]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:41 np0005541913.localdomain sudo[40057]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzbobamnpsbefcpjvyjhikgtkawcvoer ; /usr/bin/python3
Dec 02 07:54:41 np0005541913.localdomain sudo[40057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:41 np0005541913.localdomain python3[40059]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:41 np0005541913.localdomain sudo[40057]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:41 np0005541913.localdomain sudo[40100]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqfoidzzvputnsylnypfixucwwpxukad ; /usr/bin/python3
Dec 02 07:54:41 np0005541913.localdomain sudo[40100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:41 np0005541913.localdomain python3[40102]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662081.0901384-70338-232170586210173/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:41 np0005541913.localdomain sudo[40100]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:42 np0005541913.localdomain sudo[40162]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdculaygswobbsmnbknbkfskwpjzsxot ; /usr/bin/python3
Dec 02 07:54:42 np0005541913.localdomain sudo[40162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:42 np0005541913.localdomain python3[40164]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:42 np0005541913.localdomain sudo[40162]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:42 np0005541913.localdomain sudo[40205]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zngwswmycfjqfegdzhduoomxszbwxrwd ; /usr/bin/python3
Dec 02 07:54:42 np0005541913.localdomain sudo[40205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:42 np0005541913.localdomain python3[40207]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662081.939774-70338-72770872888350/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:42 np0005541913.localdomain sudo[40205]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:42 np0005541913.localdomain sudo[40208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:54:42 np0005541913.localdomain sudo[40208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:54:42 np0005541913.localdomain sudo[40208]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:42 np0005541913.localdomain sudo[40236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:54:42 np0005541913.localdomain sudo[40236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:54:42 np0005541913.localdomain sudo[40297]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjuvwafwfblljwqhjjlyuhpiyzpohsxv ; /usr/bin/python3
Dec 02 07:54:42 np0005541913.localdomain sudo[40297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:43 np0005541913.localdomain python3[40299]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:43 np0005541913.localdomain sudo[40297]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:43 np0005541913.localdomain sudo[40360]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgqodyrivrgytjjrltosveqpupuuyouj ; /usr/bin/python3
Dec 02 07:54:43 np0005541913.localdomain sudo[40360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:43 np0005541913.localdomain sudo[40236]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:43 np0005541913.localdomain python3[40368]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662082.7777872-70338-275549495189120/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=10edb31dfbca94f943eb45361d83d805daa0e00e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:43 np0005541913.localdomain sudo[40360]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:43 np0005541913.localdomain sudo[40402]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmhxujqamayryskricidewhkraphphlg ; /usr/bin/python3
Dec 02 07:54:43 np0005541913.localdomain sudo[40402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:43 np0005541913.localdomain python3[40404]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:54:43 np0005541913.localdomain sudo[40402]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:44 np0005541913.localdomain sudo[40405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:54:44 np0005541913.localdomain sudo[40405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:54:44 np0005541913.localdomain sudo[40405]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:44 np0005541913.localdomain sudo[40465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dztkiunsjgbhvfwueivijiinkicnzvhd ; /usr/bin/python3
Dec 02 07:54:44 np0005541913.localdomain sudo[40465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:44 np0005541913.localdomain python3[40467]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:54:44 np0005541913.localdomain sudo[40465]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:44 np0005541913.localdomain sudo[40508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eoeefukqthkfpofrsxwmmpcxxezdhlvl ; /usr/bin/python3
Dec 02 07:54:44 np0005541913.localdomain sudo[40508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:44 np0005541913.localdomain python3[40510]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662084.2868075-70928-241374715003322/source _original_basename=tmprees3h1u follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:54:44 np0005541913.localdomain sudo[40508]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:50 np0005541913.localdomain sudo[40538]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nakcvviizqfudckmbedzbhviicgfvwmk ; /usr/bin/python3
Dec 02 07:54:50 np0005541913.localdomain sudo[40538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:50 np0005541913.localdomain python3[40540]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 07:54:50 np0005541913.localdomain sudo[40538]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:51 np0005541913.localdomain sudo[40599]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zueskfftughyyvuliytcwzsgjhxzqtmx ; /usr/bin/python3
Dec 02 07:54:51 np0005541913.localdomain sudo[40599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:51 np0005541913.localdomain python3[40601]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:54:56 np0005541913.localdomain sudo[40599]: pam_unix(sudo:session): session closed for user root
Dec 02 07:54:56 np0005541913.localdomain sudo[40616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etqiufttuxjhndgwbnjfdrtnqwpyqmyb ; /usr/bin/python3
Dec 02 07:54:56 np0005541913.localdomain sudo[40616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:54:56 np0005541913.localdomain python3[40618]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:00 np0005541913.localdomain sudo[40616]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:01 np0005541913.localdomain sudo[40633]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evpkjnlnmvognfedmdtcbyuhfjpuqagg ; /usr/bin/python3
Dec 02 07:55:01 np0005541913.localdomain sudo[40633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:01 np0005541913.localdomain python3[40635]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:01 np0005541913.localdomain sudo[40633]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:01 np0005541913.localdomain sudo[40656]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kobcctiauicuggaymluwvalokfjosriy ; /usr/bin/python3
Dec 02 07:55:01 np0005541913.localdomain sudo[40656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:02 np0005541913.localdomain python3[40658]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:06 np0005541913.localdomain sudo[40656]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:06 np0005541913.localdomain sudo[40673]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrjuuzgmwzqtzqfetxwryixinmvsduri ; /usr/bin/python3
Dec 02 07:55:06 np0005541913.localdomain sudo[40673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:06 np0005541913.localdomain python3[40675]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:06 np0005541913.localdomain sudo[40673]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:07 np0005541913.localdomain sudo[40696]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knfbayihijtceqvmxsksgfvnlbbxbhtq ; /usr/bin/python3
Dec 02 07:55:07 np0005541913.localdomain sudo[40696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:07 np0005541913.localdomain python3[40698]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:11 np0005541913.localdomain sudo[40696]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:11 np0005541913.localdomain sudo[40713]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dattpyzqcwysfdeyoekvkvlexyzchkvv ; /usr/bin/python3
Dec 02 07:55:11 np0005541913.localdomain sudo[40713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:11 np0005541913.localdomain python3[40715]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:15 np0005541913.localdomain sudo[40713]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:16 np0005541913.localdomain sudo[40730]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yilufuhoaoldvjntlxzozvinfcwiygjw ; /usr/bin/python3
Dec 02 07:55:16 np0005541913.localdomain sudo[40730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:16 np0005541913.localdomain python3[40732]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:16 np0005541913.localdomain systemd[35843]: Starting Mark boot as successful...
Dec 02 07:55:16 np0005541913.localdomain systemd[35843]: Finished Mark boot as successful.
Dec 02 07:55:17 np0005541913.localdomain sudo[40730]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:17 np0005541913.localdomain sudo[40754]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvnpbgyqifjlgijwpqbtxmqbwtdtzxpf ; /usr/bin/python3
Dec 02 07:55:17 np0005541913.localdomain sudo[40754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:17 np0005541913.localdomain python3[40756]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:21 np0005541913.localdomain sudo[40754]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:21 np0005541913.localdomain sudo[40771]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icjrazpawwsppclqvifpdtsvqkgkoohq ; /usr/bin/python3
Dec 02 07:55:21 np0005541913.localdomain sudo[40771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:22 np0005541913.localdomain python3[40773]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:26 np0005541913.localdomain sudo[40771]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:26 np0005541913.localdomain sudo[40788]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahdrorigtnstwxrivzysurxwnxnxmtbc ; /usr/bin/python3
Dec 02 07:55:26 np0005541913.localdomain sudo[40788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:26 np0005541913.localdomain python3[40790]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:26 np0005541913.localdomain sudo[40788]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:26 np0005541913.localdomain sudo[40811]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snjgwaqxkhrfyqxcwenkocitocujtxim ; /usr/bin/python3
Dec 02 07:55:26 np0005541913.localdomain sudo[40811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:26 np0005541913.localdomain python3[40813]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:31 np0005541913.localdomain sudo[40811]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:31 np0005541913.localdomain sudo[40828]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iquwcllgpanfczzmbzbdwakwvpgkzgqz ; /usr/bin/python3
Dec 02 07:55:31 np0005541913.localdomain sudo[40828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:31 np0005541913.localdomain python3[40830]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:35 np0005541913.localdomain sudo[40828]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:35 np0005541913.localdomain sudo[40845]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcngvuvdzdeefyuuicoqclpjmgqhezwi ; /usr/bin/python3
Dec 02 07:55:35 np0005541913.localdomain sudo[40845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:35 np0005541913.localdomain python3[40847]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:35 np0005541913.localdomain sudo[40845]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:36 np0005541913.localdomain sudo[40868]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kogegkxgrudutsojnrfnrhzbbsrydxai ; /usr/bin/python3
Dec 02 07:55:36 np0005541913.localdomain sudo[40868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:36 np0005541913.localdomain python3[40870]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:40 np0005541913.localdomain sudo[40868]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:40 np0005541913.localdomain sudo[40885]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iddvwshwlbsyuttmbsgyrkxqpmlloaai ; /usr/bin/python3
Dec 02 07:55:40 np0005541913.localdomain sudo[40885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:40 np0005541913.localdomain python3[40887]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:55:44 np0005541913.localdomain sudo[40889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:55:44 np0005541913.localdomain sudo[40889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:55:44 np0005541913.localdomain sudo[40889]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:44 np0005541913.localdomain sudo[40904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:55:44 np0005541913.localdomain sudo[40904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:55:44 np0005541913.localdomain sudo[40885]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:44 np0005541913.localdomain sudo[40904]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:45 np0005541913.localdomain sudo[40964]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xswdpbxmykwvaaeluhjaaskpcdnlegra ; /usr/bin/python3
Dec 02 07:55:45 np0005541913.localdomain sudo[40964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:45 np0005541913.localdomain python3[40966]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:45 np0005541913.localdomain sudo[40964]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:46 np0005541913.localdomain sudo[41012]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtjlfpnvzchkecoijxognybtnfuweiow ; /usr/bin/python3
Dec 02 07:55:46 np0005541913.localdomain sudo[41012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:46 np0005541913.localdomain python3[41014]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:46 np0005541913.localdomain sudo[41012]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:46 np0005541913.localdomain sudo[41030]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysdxbihlsyicyugxrwsheylsehmklaje ; /usr/bin/python3
Dec 02 07:55:46 np0005541913.localdomain sudo[41030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:46 np0005541913.localdomain python3[41032]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpel0hwj_s recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:46 np0005541913.localdomain sudo[41030]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:46 np0005541913.localdomain sudo[41060]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wetcmwfycymjahxxtiehoqsmbkkfvtxn ; /usr/bin/python3
Dec 02 07:55:46 np0005541913.localdomain sudo[41060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:47 np0005541913.localdomain python3[41062]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:47 np0005541913.localdomain sudo[41060]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:47 np0005541913.localdomain sudo[41063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:55:47 np0005541913.localdomain sudo[41063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:55:47 np0005541913.localdomain sudo[41063]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:47 np0005541913.localdomain sudo[41123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvxybjvehxyrqbcktjeosdazpbamkyoc ; /usr/bin/python3
Dec 02 07:55:47 np0005541913.localdomain sudo[41123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:47 np0005541913.localdomain python3[41125]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:47 np0005541913.localdomain sudo[41123]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:47 np0005541913.localdomain sudo[41141]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cebjgbfpebzxdromrppowyslnluruhwp ; /usr/bin/python3
Dec 02 07:55:47 np0005541913.localdomain sudo[41141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:47 np0005541913.localdomain python3[41143]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:47 np0005541913.localdomain sudo[41141]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:48 np0005541913.localdomain sudo[41203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izogmzkokwmoutgikytpdyfncxgitumq ; /usr/bin/python3
Dec 02 07:55:48 np0005541913.localdomain sudo[41203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:48 np0005541913.localdomain python3[41205]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:48 np0005541913.localdomain sudo[41203]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:48 np0005541913.localdomain sudo[41221]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crslsqaazgijvcsuhvzoyzzqwditlgif ; /usr/bin/python3
Dec 02 07:55:48 np0005541913.localdomain sudo[41221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:48 np0005541913.localdomain python3[41223]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:48 np0005541913.localdomain sudo[41221]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:49 np0005541913.localdomain sudo[41283]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxunsesssrglwwhlsyfwctbrxvsfmeql ; /usr/bin/python3
Dec 02 07:55:49 np0005541913.localdomain sudo[41283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:49 np0005541913.localdomain python3[41285]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:49 np0005541913.localdomain sudo[41283]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:49 np0005541913.localdomain sudo[41301]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyxvdsdbwvxxonfssygyqzvozzimjyup ; /usr/bin/python3
Dec 02 07:55:49 np0005541913.localdomain sudo[41301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:49 np0005541913.localdomain python3[41303]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:49 np0005541913.localdomain sudo[41301]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:49 np0005541913.localdomain sudo[41363]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shplpzonskjzjvoccdjtlwdhrjoenlgf ; /usr/bin/python3
Dec 02 07:55:49 np0005541913.localdomain sudo[41363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:50 np0005541913.localdomain python3[41365]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:50 np0005541913.localdomain sudo[41363]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:50 np0005541913.localdomain sudo[41381]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfpmachmuavwyrgdoohopconmgaimwoj ; /usr/bin/python3
Dec 02 07:55:50 np0005541913.localdomain sudo[41381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:50 np0005541913.localdomain python3[41383]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:50 np0005541913.localdomain sudo[41381]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:50 np0005541913.localdomain sudo[41443]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhidtlcqyvsyhplgiovylnmjrfxvzmbh ; /usr/bin/python3
Dec 02 07:55:50 np0005541913.localdomain sudo[41443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:50 np0005541913.localdomain python3[41445]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:50 np0005541913.localdomain sudo[41443]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:50 np0005541913.localdomain sudo[41461]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wydjqjzjklvtjnlcpafhodcsojyrurod ; /usr/bin/python3
Dec 02 07:55:50 np0005541913.localdomain sudo[41461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:50 np0005541913.localdomain python3[41463]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:50 np0005541913.localdomain sudo[41461]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:51 np0005541913.localdomain sudo[41523]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eieqlrowxnylxsblbuqjyrrqmrprapah ; /usr/bin/python3
Dec 02 07:55:51 np0005541913.localdomain sudo[41523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:51 np0005541913.localdomain python3[41525]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:51 np0005541913.localdomain sudo[41523]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:51 np0005541913.localdomain sudo[41541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycjeutnsxkybupfhlyavjgbmzxlwwjmv ; /usr/bin/python3
Dec 02 07:55:51 np0005541913.localdomain sudo[41541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:51 np0005541913.localdomain python3[41543]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:51 np0005541913.localdomain sudo[41541]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:52 np0005541913.localdomain sudo[41603]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnpewvbsijisjioejvvcxryyfavaotrr ; /usr/bin/python3
Dec 02 07:55:52 np0005541913.localdomain sudo[41603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:52 np0005541913.localdomain python3[41605]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:52 np0005541913.localdomain sudo[41603]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:52 np0005541913.localdomain sudo[41621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fucllfdpuidbrwrcyrmzqvnflisngztw ; /usr/bin/python3
Dec 02 07:55:52 np0005541913.localdomain sudo[41621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:52 np0005541913.localdomain python3[41623]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:52 np0005541913.localdomain sudo[41621]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:52 np0005541913.localdomain sudo[41683]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npmtjhfyehxgtipnklowuhehqsyrgkrr ; /usr/bin/python3
Dec 02 07:55:52 np0005541913.localdomain sudo[41683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:53 np0005541913.localdomain python3[41685]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:53 np0005541913.localdomain sudo[41683]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:53 np0005541913.localdomain sudo[41701]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqhntliktwcsafhdmituugwlorrquhit ; /usr/bin/python3
Dec 02 07:55:53 np0005541913.localdomain sudo[41701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:53 np0005541913.localdomain python3[41703]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:53 np0005541913.localdomain sudo[41701]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:53 np0005541913.localdomain sudo[41763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzvfdtrcvzbzfaatcecfkexgbphznpom ; /usr/bin/python3
Dec 02 07:55:53 np0005541913.localdomain sudo[41763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:53 np0005541913.localdomain python3[41765]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:53 np0005541913.localdomain sudo[41763]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:53 np0005541913.localdomain sudo[41781]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzbcouuzpvaondzlglgmvirrdumgpeeb ; /usr/bin/python3
Dec 02 07:55:53 np0005541913.localdomain sudo[41781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:53 np0005541913.localdomain python3[41783]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:54 np0005541913.localdomain sudo[41781]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:54 np0005541913.localdomain sudo[41843]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtxjdgmqutqvyvwinlhimcgthnaejizs ; /usr/bin/python3
Dec 02 07:55:54 np0005541913.localdomain sudo[41843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:54 np0005541913.localdomain python3[41845]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:54 np0005541913.localdomain sudo[41843]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:54 np0005541913.localdomain sudo[41861]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wevhxlwddevevlnjreclgjkvqhuppmtu ; /usr/bin/python3
Dec 02 07:55:54 np0005541913.localdomain sudo[41861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:54 np0005541913.localdomain python3[41863]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:54 np0005541913.localdomain sudo[41861]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:55 np0005541913.localdomain sudo[41923]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-girftkjbokxhnubqjusnlgtmuknuqfek ; /usr/bin/python3
Dec 02 07:55:55 np0005541913.localdomain sudo[41923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:55 np0005541913.localdomain python3[41925]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:55 np0005541913.localdomain sudo[41923]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:55 np0005541913.localdomain sudo[41941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgoirbsxmrcxubufzcrurrmmxhrjwwjy ; /usr/bin/python3
Dec 02 07:55:55 np0005541913.localdomain sudo[41941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:55 np0005541913.localdomain python3[41943]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:55 np0005541913.localdomain sudo[41941]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:55 np0005541913.localdomain sudo[41971]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awacvpygczjmpijbfgrhcivclayjddpc ; /usr/bin/python3
Dec 02 07:55:55 np0005541913.localdomain sudo[41971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:55 np0005541913.localdomain python3[41973]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:55:55 np0005541913.localdomain sudo[41971]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:56 np0005541913.localdomain sudo[42019]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jghieiyqcpytmjrozyzyzhblprzeexcw ; /usr/bin/python3
Dec 02 07:55:56 np0005541913.localdomain sudo[42019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:56 np0005541913.localdomain python3[42021]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:55:56 np0005541913.localdomain sudo[42019]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:56 np0005541913.localdomain sudo[42037]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spiqflbpzgrusdmvaxwpeuhvqcniurtn ; /usr/bin/python3
Dec 02 07:55:56 np0005541913.localdomain sudo[42037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:56 np0005541913.localdomain python3[42039]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpnai8sik4 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:55:56 np0005541913.localdomain sudo[42037]: pam_unix(sudo:session): session closed for user root
Dec 02 07:55:59 np0005541913.localdomain sudo[42067]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfhlsmkyzgxqclqkuvyiqlanshfppuep ; /usr/bin/python3
Dec 02 07:55:59 np0005541913.localdomain sudo[42067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:55:59 np0005541913.localdomain python3[42069]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:56:02 np0005541913.localdomain sudo[42067]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:04 np0005541913.localdomain sudo[42084]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqkfzgiopkkbsioytkutbiufblfpibzs ; /usr/bin/python3
Dec 02 07:56:04 np0005541913.localdomain sudo[42084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:04 np0005541913.localdomain python3[42086]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:56:04 np0005541913.localdomain sudo[42084]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:04 np0005541913.localdomain sudo[42102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsjnxnyvkuwhxikvrumivgezfgcblfyl ; /usr/bin/python3
Dec 02 07:56:04 np0005541913.localdomain sudo[42102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:05 np0005541913.localdomain python3[42104]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:56:05 np0005541913.localdomain sudo[42102]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:05 np0005541913.localdomain sudo[42120]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtvewilndzlzqgwwnwmncdlydzvrfkus ; /usr/bin/python3
Dec 02 07:56:05 np0005541913.localdomain sudo[42120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:05 np0005541913.localdomain python3[42122]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:56:05 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:56:05 np0005541913.localdomain systemd-rc-local-generator[42145]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:56:05 np0005541913.localdomain systemd-sysv-generator[42150]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:56:05 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:56:06 np0005541913.localdomain systemd[1]: Starting Netfilter Tables...
Dec 02 07:56:06 np0005541913.localdomain systemd[1]: Finished Netfilter Tables.
Dec 02 07:56:06 np0005541913.localdomain sudo[42120]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:06 np0005541913.localdomain sudo[42209]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxcrlounfqqffsapkwzzgadkpxbblono ; /usr/bin/python3
Dec 02 07:56:06 np0005541913.localdomain sudo[42209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:06 np0005541913.localdomain python3[42211]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:06 np0005541913.localdomain sudo[42209]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:07 np0005541913.localdomain sudo[42252]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arrbditkpzwhaugglfqmfshoiyyibnai ; /usr/bin/python3
Dec 02 07:56:07 np0005541913.localdomain sudo[42252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:07 np0005541913.localdomain python3[42254]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662166.5338497-73746-239159711976096/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:07 np0005541913.localdomain sudo[42252]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:07 np0005541913.localdomain sudo[42282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxekuxsxzkxugwpshdgrihzkfndfheyp ; /usr/bin/python3
Dec 02 07:56:07 np0005541913.localdomain sudo[42282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:07 np0005541913.localdomain python3[42284]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:07 np0005541913.localdomain sudo[42282]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:07 np0005541913.localdomain sudo[42300]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mouqpfihbsnrxfpcccibvjztuemgxnyc ; /usr/bin/python3
Dec 02 07:56:07 np0005541913.localdomain sudo[42300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:08 np0005541913.localdomain python3[42302]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:08 np0005541913.localdomain sudo[42300]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:08 np0005541913.localdomain sudo[42349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eujmuzdememvzelersjyceclejmskccp ; /usr/bin/python3
Dec 02 07:56:08 np0005541913.localdomain sudo[42349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:08 np0005541913.localdomain python3[42351]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:08 np0005541913.localdomain sudo[42349]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:08 np0005541913.localdomain sudo[42392]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypcvuyokxclbrgbmtwhxokvcjmnbbyqp ; /usr/bin/python3
Dec 02 07:56:08 np0005541913.localdomain sudo[42392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:08 np0005541913.localdomain python3[42394]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662168.185676-73982-262951498292517/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:08 np0005541913.localdomain sudo[42392]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:09 np0005541913.localdomain sudo[42454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-levkytlkeyakwtllmhbuwdbypixcvwgo ; /usr/bin/python3
Dec 02 07:56:09 np0005541913.localdomain sudo[42454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:09 np0005541913.localdomain python3[42456]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:09 np0005541913.localdomain sudo[42454]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:09 np0005541913.localdomain sudo[42497]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsddintybongpoauqgqsuqjbzypbdztu ; /usr/bin/python3
Dec 02 07:56:09 np0005541913.localdomain sudo[42497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:09 np0005541913.localdomain python3[42499]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662169.0877914-74135-28281127510382/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:09 np0005541913.localdomain sudo[42497]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:10 np0005541913.localdomain sudo[42559]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzrbsvgnwvnpvouncwukuvcabnrbgzoi ; /usr/bin/python3
Dec 02 07:56:10 np0005541913.localdomain sudo[42559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:10 np0005541913.localdomain python3[42561]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:10 np0005541913.localdomain sudo[42559]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:10 np0005541913.localdomain sudo[42602]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwxflkktwyaagvjtxtbyuuwvsgcgjyzh ; /usr/bin/python3
Dec 02 07:56:10 np0005541913.localdomain sudo[42602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:10 np0005541913.localdomain python3[42604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662170.1694684-74200-173283403494308/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:10 np0005541913.localdomain sudo[42602]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:11 np0005541913.localdomain sudo[42664]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uasqmwewfpggezolasaziiywsthkvukq ; /usr/bin/python3
Dec 02 07:56:11 np0005541913.localdomain sudo[42664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:11 np0005541913.localdomain python3[42666]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:11 np0005541913.localdomain sudo[42664]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:11 np0005541913.localdomain sudo[42707]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxsfcyqcgjriouqiqtntuudpqfoeqatu ; /usr/bin/python3
Dec 02 07:56:11 np0005541913.localdomain sudo[42707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:11 np0005541913.localdomain python3[42709]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662171.1695676-74264-270578668711725/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:11 np0005541913.localdomain sudo[42707]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:12 np0005541913.localdomain sudo[42769]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jffalbuuaivegyxsirnakuzxxlnhbane ; /usr/bin/python3
Dec 02 07:56:12 np0005541913.localdomain sudo[42769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:12 np0005541913.localdomain python3[42771]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:12 np0005541913.localdomain sudo[42769]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:13 np0005541913.localdomain sudo[42812]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvoawujcbubevrkoadogfssbwlzedtxu ; /usr/bin/python3
Dec 02 07:56:13 np0005541913.localdomain sudo[42812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:13 np0005541913.localdomain python3[42814]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662172.1352103-74315-62201974635096/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:13 np0005541913.localdomain sudo[42812]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:13 np0005541913.localdomain sudo[42842]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blbbtvpnzhwbdanlaoocnbjooynoonjh ; /usr/bin/python3
Dec 02 07:56:13 np0005541913.localdomain sudo[42842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:13 np0005541913.localdomain python3[42844]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:14 np0005541913.localdomain sudo[42842]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:14 np0005541913.localdomain sudo[42907]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yljgvuxmuigqljwirhdhlpxikveweems ; /usr/bin/python3
Dec 02 07:56:14 np0005541913.localdomain sudo[42907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:14 np0005541913.localdomain python3[42909]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:14 np0005541913.localdomain sudo[42907]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:14 np0005541913.localdomain sudo[42924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvuubyexwxxwuporblzydcyrvhriptej ; /usr/bin/python3
Dec 02 07:56:14 np0005541913.localdomain sudo[42924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:14 np0005541913.localdomain python3[42926]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:14 np0005541913.localdomain sudo[42924]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:15 np0005541913.localdomain sudo[42941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixvprlsugnlgfnzizqelgzneeffxxxsd ; /usr/bin/python3
Dec 02 07:56:15 np0005541913.localdomain sudo[42941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:15 np0005541913.localdomain python3[42943]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:15 np0005541913.localdomain sudo[42941]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:15 np0005541913.localdomain sudo[42960]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdmpswnijphjscclnrmioqsssblpyrut ; /usr/bin/python3
Dec 02 07:56:15 np0005541913.localdomain sudo[42960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:15 np0005541913.localdomain python3[42962]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:15 np0005541913.localdomain sudo[42960]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:15 np0005541913.localdomain sudo[42976]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mplsdjrbonelirkspyohcszsqorlhinn ; /usr/bin/python3
Dec 02 07:56:15 np0005541913.localdomain sudo[42976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:15 np0005541913.localdomain python3[42978]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:15 np0005541913.localdomain sudo[42976]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:16 np0005541913.localdomain sudo[42992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vebszmadqhgcsnlgdjxfacfinbcrivuf ; /usr/bin/python3
Dec 02 07:56:16 np0005541913.localdomain sudo[42992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:16 np0005541913.localdomain python3[42994]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:16 np0005541913.localdomain sudo[42992]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:16 np0005541913.localdomain sudo[43008]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfruqmiqcnghzijrgddkyxvegdluvqwh ; /usr/bin/python3
Dec 02 07:56:16 np0005541913.localdomain sudo[43008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:16 np0005541913.localdomain python3[43010]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 02 07:56:17 np0005541913.localdomain sudo[43008]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:17 np0005541913.localdomain sudo[43028]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqqrbfjeqdxigjwtabgioujimwdwdwlz ; /usr/bin/python3
Dec 02 07:56:17 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Dec 02 07:56:17 np0005541913.localdomain sudo[43028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:18 np0005541913.localdomain python3[43030]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 02 07:56:18 np0005541913.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Dec 02 07:56:18 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 07:56:18 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 07:56:18 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 07:56:18 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 07:56:18 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 07:56:18 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 07:56:18 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 07:56:18 np0005541913.localdomain sudo[43028]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:19 np0005541913.localdomain sudo[43049]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dubxcwjfgfhqgvegszfomyvnlodpzebx ; /usr/bin/python3
Dec 02 07:56:19 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 02 07:56:19 np0005541913.localdomain sudo[43049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:19 np0005541913.localdomain python3[43051]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 02 07:56:20 np0005541913.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Dec 02 07:56:20 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 07:56:20 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 07:56:20 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 07:56:20 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 07:56:20 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 07:56:20 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 07:56:20 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 07:56:20 np0005541913.localdomain sudo[43049]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:20 np0005541913.localdomain sudo[43070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeghgwpslpecpbynvdowjjapwmdpvzko ; /usr/bin/python3
Dec 02 07:56:20 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 02 07:56:20 np0005541913.localdomain sudo[43070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:20 np0005541913.localdomain python3[43072]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 02 07:56:21 np0005541913.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Dec 02 07:56:21 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 07:56:21 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 07:56:21 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 07:56:21 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 07:56:21 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 07:56:21 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 07:56:21 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 07:56:21 np0005541913.localdomain sudo[43070]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:21 np0005541913.localdomain sudo[43091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avhyhvgaoksabmxcuiarnnfzrejpgavc ; /usr/bin/python3
Dec 02 07:56:21 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 02 07:56:21 np0005541913.localdomain sudo[43091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:21 np0005541913.localdomain python3[43093]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:21 np0005541913.localdomain sudo[43091]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:21 np0005541913.localdomain sudo[43107]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-majrmhwhprwrkiksphzwfnoxdqfbjxyq ; /usr/bin/python3
Dec 02 07:56:21 np0005541913.localdomain sudo[43107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:22 np0005541913.localdomain python3[43109]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:22 np0005541913.localdomain sudo[43107]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:22 np0005541913.localdomain sudo[43123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbmaununyieohiglqedvetddeeishxib ; /usr/bin/python3
Dec 02 07:56:22 np0005541913.localdomain sudo[43123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:22 np0005541913.localdomain python3[43125]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:22 np0005541913.localdomain sudo[43123]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:22 np0005541913.localdomain sudo[43139]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvdmcjwyxfaquyjzzxnqilhuqgebiyov ; /usr/bin/python3
Dec 02 07:56:22 np0005541913.localdomain sudo[43139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:22 np0005541913.localdomain python3[43141]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:56:22 np0005541913.localdomain sudo[43139]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:22 np0005541913.localdomain sudo[43155]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huwnlyelugjgqphgcyqgkahhwblntnag ; /usr/bin/python3
Dec 02 07:56:22 np0005541913.localdomain sudo[43155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:23 np0005541913.localdomain python3[43157]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:23 np0005541913.localdomain sudo[43155]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:23 np0005541913.localdomain sudo[43172]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrubfnamdrbbsjnjtqznnyqgfnsipbap ; /usr/bin/python3
Dec 02 07:56:23 np0005541913.localdomain sudo[43172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:24 np0005541913.localdomain python3[43174]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:56:27 np0005541913.localdomain sudo[43172]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:27 np0005541913.localdomain sudo[43189]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiphnurxxtltlimxndnmetwllpyuxkdg ; /usr/bin/python3
Dec 02 07:56:27 np0005541913.localdomain sudo[43189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:28 np0005541913.localdomain python3[43191]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:28 np0005541913.localdomain sudo[43189]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:28 np0005541913.localdomain sudo[43237]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvwdhtkbnnkntchmdfhdpykyblfcvuww ; /usr/bin/python3
Dec 02 07:56:28 np0005541913.localdomain sudo[43237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:28 np0005541913.localdomain python3[43239]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:28 np0005541913.localdomain sudo[43237]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:28 np0005541913.localdomain sudo[43280]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmwahuhkuvweubvyzckpqjplaqdlagmp ; /usr/bin/python3
Dec 02 07:56:28 np0005541913.localdomain sudo[43280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:28 np0005541913.localdomain python3[43282]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662188.2161925-75162-51963711957140/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:28 np0005541913.localdomain sudo[43280]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:29 np0005541913.localdomain sudo[43310]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzkfnhkelnvykwqmxnzmmdaogovkckzq ; /usr/bin/python3
Dec 02 07:56:29 np0005541913.localdomain sudo[43310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:29 np0005541913.localdomain python3[43312]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 07:56:29 np0005541913.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 07:56:29 np0005541913.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 02 07:56:29 np0005541913.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 02 07:56:29 np0005541913.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 02 07:56:29 np0005541913.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 02 07:56:29 np0005541913.localdomain kernel: Bridge firewalling registered
Dec 02 07:56:29 np0005541913.localdomain systemd-modules-load[43315]: Inserted module 'br_netfilter'
Dec 02 07:56:29 np0005541913.localdomain systemd-modules-load[43315]: Module 'msr' is built in
Dec 02 07:56:29 np0005541913.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 02 07:56:29 np0005541913.localdomain sudo[43310]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:29 np0005541913.localdomain sudo[43364]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtembhzahneryzercjmvxkdvbefplrsz ; /usr/bin/python3
Dec 02 07:56:29 np0005541913.localdomain sudo[43364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:30 np0005541913.localdomain python3[43366]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:30 np0005541913.localdomain sudo[43364]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:30 np0005541913.localdomain sudo[43407]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uerlehvajcnlvhrrqtmomhcsfxugwulv ; /usr/bin/python3
Dec 02 07:56:30 np0005541913.localdomain sudo[43407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:30 np0005541913.localdomain python3[43409]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662189.6984463-75228-114654614750421/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:30 np0005541913.localdomain sudo[43407]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:30 np0005541913.localdomain sudo[43437]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dewvqjhshovylgdyikpmeqdmabuujcty ; /usr/bin/python3
Dec 02 07:56:30 np0005541913.localdomain sudo[43437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:30 np0005541913.localdomain python3[43439]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:30 np0005541913.localdomain sudo[43437]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:30 np0005541913.localdomain sudo[43454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srgfmxfagalvjaautvovsrmstjvmcawd ; /usr/bin/python3
Dec 02 07:56:30 np0005541913.localdomain sudo[43454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:31 np0005541913.localdomain python3[43456]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:31 np0005541913.localdomain sudo[43454]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:31 np0005541913.localdomain sudo[43472]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnqgondsodkmckzokjhdroaagmuafnqz ; /usr/bin/python3
Dec 02 07:56:31 np0005541913.localdomain sudo[43472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:31 np0005541913.localdomain python3[43474]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:31 np0005541913.localdomain sudo[43472]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:31 np0005541913.localdomain sudo[43490]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvlrswhhurbkfjljnbehrxzjirwojust ; /usr/bin/python3
Dec 02 07:56:31 np0005541913.localdomain sudo[43490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:31 np0005541913.localdomain python3[43492]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:31 np0005541913.localdomain sudo[43490]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:31 np0005541913.localdomain sudo[43507]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emdjqpgmkvkhhdtsptiducnlpbttnhvy ; /usr/bin/python3
Dec 02 07:56:31 np0005541913.localdomain sudo[43507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:32 np0005541913.localdomain python3[43509]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:32 np0005541913.localdomain sudo[43507]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:32 np0005541913.localdomain sudo[43524]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyplqqgjqxvnluxybkweyxpzjzqoehxl ; /usr/bin/python3
Dec 02 07:56:32 np0005541913.localdomain sudo[43524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:32 np0005541913.localdomain python3[43526]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:32 np0005541913.localdomain sudo[43524]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:32 np0005541913.localdomain sudo[43541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmyhxyprrzztlgfwwqelrmjlnpjejxbm ; /usr/bin/python3
Dec 02 07:56:32 np0005541913.localdomain sudo[43541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:32 np0005541913.localdomain python3[43543]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:32 np0005541913.localdomain sudo[43541]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:32 np0005541913.localdomain sudo[43559]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppgldvxpqhfteciuoykcxwenejjfthgg ; /usr/bin/python3
Dec 02 07:56:32 np0005541913.localdomain sudo[43559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:33 np0005541913.localdomain python3[43561]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:33 np0005541913.localdomain sudo[43559]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:33 np0005541913.localdomain sudo[43577]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhqpltedqvqmtthoeoxgtfgisascttku ; /usr/bin/python3
Dec 02 07:56:33 np0005541913.localdomain sudo[43577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:33 np0005541913.localdomain python3[43579]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:33 np0005541913.localdomain sudo[43577]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:33 np0005541913.localdomain sudo[43595]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-noqalctoypncgbwmligbbxertvgdffwh ; /usr/bin/python3
Dec 02 07:56:33 np0005541913.localdomain sudo[43595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:33 np0005541913.localdomain python3[43597]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:33 np0005541913.localdomain sudo[43595]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:33 np0005541913.localdomain sudo[43613]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twwcblacaqgxpmlweetjcvlkzhfybekj ; /usr/bin/python3
Dec 02 07:56:33 np0005541913.localdomain sudo[43613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:34 np0005541913.localdomain python3[43615]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:34 np0005541913.localdomain sudo[43613]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:34 np0005541913.localdomain sudo[43631]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtumudlmuicbpitvjcvyipnuiflxztpr ; /usr/bin/python3
Dec 02 07:56:34 np0005541913.localdomain sudo[43631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:34 np0005541913.localdomain python3[43633]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:34 np0005541913.localdomain sudo[43631]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:34 np0005541913.localdomain sudo[43649]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwadtmmybqanoiyvcprhleucwfwuefcg ; /usr/bin/python3
Dec 02 07:56:34 np0005541913.localdomain sudo[43649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:34 np0005541913.localdomain python3[43651]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:34 np0005541913.localdomain sudo[43649]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:34 np0005541913.localdomain sudo[43667]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdulnberqhollooyuumquionkuwgnzlg ; /usr/bin/python3
Dec 02 07:56:34 np0005541913.localdomain sudo[43667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:34 np0005541913.localdomain python3[43669]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:34 np0005541913.localdomain sudo[43667]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:35 np0005541913.localdomain sudo[43684]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvqeyyujssknxjkofqxtdirinpiubkxl ; /usr/bin/python3
Dec 02 07:56:35 np0005541913.localdomain sudo[43684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:35 np0005541913.localdomain python3[43686]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:35 np0005541913.localdomain sudo[43684]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:35 np0005541913.localdomain sudo[43701]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iskfxdzzrrriyatyzndtsskzgvjmydcm ; /usr/bin/python3
Dec 02 07:56:35 np0005541913.localdomain sudo[43701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:35 np0005541913.localdomain python3[43703]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:35 np0005541913.localdomain sudo[43701]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:35 np0005541913.localdomain sudo[43718]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abieyxghnkafoyehupekvvhydkrivnud ; /usr/bin/python3
Dec 02 07:56:35 np0005541913.localdomain sudo[43718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:35 np0005541913.localdomain python3[43720]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:35 np0005541913.localdomain sudo[43718]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:36 np0005541913.localdomain sudo[43735]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thzfbdtyrcsksrehixssvqfsbpbyuzhb ; /usr/bin/python3
Dec 02 07:56:36 np0005541913.localdomain sudo[43735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:36 np0005541913.localdomain python3[43737]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 02 07:56:36 np0005541913.localdomain sudo[43735]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:36 np0005541913.localdomain sudo[43753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjcyizqnniukmiwdsxhiagdhdbxyixgw ; /usr/bin/python3
Dec 02 07:56:36 np0005541913.localdomain sudo[43753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:36 np0005541913.localdomain python3[43755]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 07:56:36 np0005541913.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 02 07:56:36 np0005541913.localdomain systemd[1]: Stopped Apply Kernel Variables.
Dec 02 07:56:36 np0005541913.localdomain systemd[1]: Stopping Apply Kernel Variables...
Dec 02 07:56:36 np0005541913.localdomain systemd[1]: Starting Apply Kernel Variables...
Dec 02 07:56:36 np0005541913.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 02 07:56:36 np0005541913.localdomain systemd[1]: Finished Apply Kernel Variables.
Dec 02 07:56:36 np0005541913.localdomain sudo[43753]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:37 np0005541913.localdomain sudo[43773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgoylbdijeqsffssubcyohzaroabxwvi ; /usr/bin/python3
Dec 02 07:56:37 np0005541913.localdomain sudo[43773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:37 np0005541913.localdomain python3[43775]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:37 np0005541913.localdomain sudo[43773]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:37 np0005541913.localdomain sudo[43789]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdkkvehdlqiilswhjaisjrcbzyrnbukr ; /usr/bin/python3
Dec 02 07:56:37 np0005541913.localdomain sudo[43789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:37 np0005541913.localdomain python3[43791]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:37 np0005541913.localdomain sudo[43789]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:37 np0005541913.localdomain sudo[43805]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maxmimeiyougifocspdjxuhqottgkxis ; /usr/bin/python3
Dec 02 07:56:37 np0005541913.localdomain sudo[43805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:37 np0005541913.localdomain python3[43807]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:37 np0005541913.localdomain sudo[43805]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:38 np0005541913.localdomain sudo[43821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eospvvxtmlgodreqnnldroaptpewsirr ; /usr/bin/python3
Dec 02 07:56:38 np0005541913.localdomain sudo[43821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:38 np0005541913.localdomain python3[43823]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:56:38 np0005541913.localdomain sudo[43821]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:38 np0005541913.localdomain sudo[43837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viwdrqruwqbgilcluryaxtxixekaucwd ; /usr/bin/python3
Dec 02 07:56:38 np0005541913.localdomain sudo[43837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:38 np0005541913.localdomain python3[43839]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:38 np0005541913.localdomain sudo[43837]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:38 np0005541913.localdomain sudo[43853]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adwynirtwqyvznbhegtgvbfodgrapizd ; /usr/bin/python3
Dec 02 07:56:38 np0005541913.localdomain sudo[43853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:38 np0005541913.localdomain python3[43855]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:39 np0005541913.localdomain sudo[43853]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:39 np0005541913.localdomain sudo[43869]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvkbuejdyzkihensoqzgdeiuthaawvju ; /usr/bin/python3
Dec 02 07:56:39 np0005541913.localdomain sudo[43869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:39 np0005541913.localdomain python3[43871]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:39 np0005541913.localdomain sudo[43869]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:39 np0005541913.localdomain sudo[43885]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xovllbkxbhirtkiceegtmdciubdqtbtn ; /usr/bin/python3
Dec 02 07:56:39 np0005541913.localdomain sudo[43885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:39 np0005541913.localdomain python3[43887]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:39 np0005541913.localdomain sudo[43885]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:39 np0005541913.localdomain sudo[43901]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wabrkpwsquyljntuetncwqzjfanmfawe ; /usr/bin/python3
Dec 02 07:56:39 np0005541913.localdomain sudo[43901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:39 np0005541913.localdomain python3[43903]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:39 np0005541913.localdomain sudo[43901]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:40 np0005541913.localdomain sudo[43949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwhfvkmkbwvbhhipblhrfufdihtxmqpl ; /usr/bin/python3
Dec 02 07:56:40 np0005541913.localdomain sudo[43949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:40 np0005541913.localdomain python3[43951]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:40 np0005541913.localdomain sudo[43949]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:40 np0005541913.localdomain sudo[43992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quohjttfvspfmbpjhnkqlymjuvglvjbt ; /usr/bin/python3
Dec 02 07:56:40 np0005541913.localdomain sudo[43992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:40 np0005541913.localdomain python3[43994]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662200.0371897-75713-15013321711854/source _original_basename=tmprx53x7m_ follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:40 np0005541913.localdomain sudo[43992]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:40 np0005541913.localdomain sudo[44022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rerfysnrmhwluwikmbayzaeseiniwnam ; /usr/bin/python3
Dec 02 07:56:40 np0005541913.localdomain sudo[44022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:41 np0005541913.localdomain python3[44024]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:41 np0005541913.localdomain sudo[44022]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:42 np0005541913.localdomain sudo[44039]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwrlzepqbnpigtvcxwoplbwmddwgdydd ; /usr/bin/python3
Dec 02 07:56:42 np0005541913.localdomain sudo[44039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:42 np0005541913.localdomain python3[44041]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:42 np0005541913.localdomain sudo[44039]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:43 np0005541913.localdomain sudo[44087]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iubxygljldrcewkexvfnqvqsslqcfbmm ; /usr/bin/python3
Dec 02 07:56:43 np0005541913.localdomain sudo[44087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:43 np0005541913.localdomain python3[44089]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:43 np0005541913.localdomain sudo[44087]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:43 np0005541913.localdomain sudo[44130]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djsivxegsiyfzojwtjrenouocgbusisc ; /usr/bin/python3
Dec 02 07:56:43 np0005541913.localdomain sudo[44130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:43 np0005541913.localdomain python3[44132]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662202.987473-75875-263441585664043/source _original_basename=tmpaxtbu9uv follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:43 np0005541913.localdomain sudo[44130]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:44 np0005541913.localdomain sudo[44160]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icuneosbguljbcdlnjaacnwxzlxvmxmw ; /usr/bin/python3
Dec 02 07:56:44 np0005541913.localdomain sudo[44160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:44 np0005541913.localdomain python3[44162]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:44 np0005541913.localdomain sudo[44160]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:44 np0005541913.localdomain sudo[44176]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivbetyrwpbdpdtvqbcghwzjgwwlnmdxb ; /usr/bin/python3
Dec 02 07:56:44 np0005541913.localdomain sudo[44176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:44 np0005541913.localdomain python3[44178]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:44 np0005541913.localdomain sudo[44176]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:44 np0005541913.localdomain sudo[44192]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgmjsagyomcqxpsffjcnrzaardfzhmpg ; /usr/bin/python3
Dec 02 07:56:44 np0005541913.localdomain sudo[44192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:44 np0005541913.localdomain python3[44194]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:44 np0005541913.localdomain sudo[44192]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:45 np0005541913.localdomain sudo[44208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nywpxxbfibjintjkiuwkalwphznhwwjc ; /usr/bin/python3
Dec 02 07:56:45 np0005541913.localdomain sudo[44208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:45 np0005541913.localdomain python3[44210]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:45 np0005541913.localdomain sudo[44208]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:45 np0005541913.localdomain sudo[44224]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldzqflagodtfvslczxvfwoykqybzzpym ; /usr/bin/python3
Dec 02 07:56:45 np0005541913.localdomain sudo[44224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:45 np0005541913.localdomain python3[44226]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:45 np0005541913.localdomain sudo[44224]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:45 np0005541913.localdomain sudo[44240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dujihaqmpfxpagzunhqysmcjrxcitpwj ; /usr/bin/python3
Dec 02 07:56:45 np0005541913.localdomain sudo[44240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:45 np0005541913.localdomain python3[44242]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:45 np0005541913.localdomain sudo[44240]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:46 np0005541913.localdomain sudo[44256]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnlhumfuqzgtdwwbrzcbpfndbmjxtpte ; /usr/bin/python3
Dec 02 07:56:46 np0005541913.localdomain sudo[44256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:46 np0005541913.localdomain python3[44258]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:46 np0005541913.localdomain sudo[44256]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:46 np0005541913.localdomain sudo[44272]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bddnptdpaibtthnjbymubsaarvhnxlgd ; /usr/bin/python3
Dec 02 07:56:46 np0005541913.localdomain sudo[44272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:46 np0005541913.localdomain python3[44274]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:46 np0005541913.localdomain sudo[44272]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:46 np0005541913.localdomain sudo[44288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grqikdnhueoysjxbivggvxwwtbtznufg ; /usr/bin/python3
Dec 02 07:56:46 np0005541913.localdomain sudo[44288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:46 np0005541913.localdomain python3[44290]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:46 np0005541913.localdomain sudo[44288]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:47 np0005541913.localdomain sudo[44304]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgprkdcrhcbsxaafuejoxjnjahmaddbe ; /usr/bin/python3
Dec 02 07:56:47 np0005541913.localdomain sudo[44304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:47 np0005541913.localdomain sudo[44307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:56:47 np0005541913.localdomain sudo[44307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:56:47 np0005541913.localdomain sudo[44307]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:47 np0005541913.localdomain python3[44306]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Dec 02 07:56:47 np0005541913.localdomain groupadd[44327]: group added to /etc/group: name=qemu, GID=107
Dec 02 07:56:47 np0005541913.localdomain groupadd[44327]: group added to /etc/gshadow: name=qemu
Dec 02 07:56:47 np0005541913.localdomain sudo[44322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 07:56:47 np0005541913.localdomain groupadd[44327]: new group: name=qemu, GID=107
Dec 02 07:56:47 np0005541913.localdomain sudo[44322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:56:47 np0005541913.localdomain sudo[44304]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:47 np0005541913.localdomain sudo[44376]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibxjwpbhllkdhhgousbtgrezjicehibf ; /usr/bin/python3
Dec 02 07:56:47 np0005541913.localdomain sudo[44322]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:47 np0005541913.localdomain sudo[44376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:47 np0005541913.localdomain sudo[44379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:56:47 np0005541913.localdomain sudo[44379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:56:47 np0005541913.localdomain sudo[44379]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:48 np0005541913.localdomain python3[44378]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005541913.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 02 07:56:48 np0005541913.localdomain useradd[44395]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Dec 02 07:56:48 np0005541913.localdomain sudo[44396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:56:48 np0005541913.localdomain sudo[44396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:56:48 np0005541913.localdomain sudo[44376]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:48 np0005541913.localdomain sudo[44430]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rravltkjfzmtlrwcjdtovuhvaofqvbfk ; /usr/bin/python3
Dec 02 07:56:48 np0005541913.localdomain sudo[44430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:48 np0005541913.localdomain python3[44432]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Dec 02 07:56:48 np0005541913.localdomain sudo[44430]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:48 np0005541913.localdomain sudo[44466]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpjxutvendmttxugysoyvzckduqivcad ; /usr/bin/python3
Dec 02 07:56:48 np0005541913.localdomain sudo[44466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:48 np0005541913.localdomain sudo[44396]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:48 np0005541913.localdomain python3[44478]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:48 np0005541913.localdomain sudo[44466]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:49 np0005541913.localdomain sudo[44527]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxscxvgamevgvjaimhqnogqwlykcunmp ; /usr/bin/python3
Dec 02 07:56:49 np0005541913.localdomain sudo[44527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:49 np0005541913.localdomain sudo[44528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:56:49 np0005541913.localdomain sudo[44528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:56:49 np0005541913.localdomain sudo[44528]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:49 np0005541913.localdomain python3[44542]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:49 np0005541913.localdomain sudo[44527]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:49 np0005541913.localdomain sudo[44585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oihepdtdfnxlpteyrwnhhsrkplbojmhc ; /usr/bin/python3
Dec 02 07:56:49 np0005541913.localdomain sudo[44585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:49 np0005541913.localdomain python3[44587]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662209.1391532-76182-222739817819908/source _original_basename=tmpoc3bs4fq follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:49 np0005541913.localdomain sudo[44585]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:50 np0005541913.localdomain sudo[44615]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfetqjevnkgqhrbmtpnxtxzafgmgaokq ; /usr/bin/python3
Dec 02 07:56:50 np0005541913.localdomain sudo[44615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:50 np0005541913.localdomain python3[44617]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 02 07:56:50 np0005541913.localdomain sudo[44615]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:51 np0005541913.localdomain sudo[44639]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xktttgfkmxbnunsiosyzbedopfgrnkje ; /usr/bin/python3
Dec 02 07:56:51 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 02 07:56:51 np0005541913.localdomain sudo[44639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:51 np0005541913.localdomain python3[44641]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:51 np0005541913.localdomain sudo[44639]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:51 np0005541913.localdomain sudo[44655]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfeqrfvahscivbeizeuicngpyolcqtqy ; /usr/bin/python3
Dec 02 07:56:51 np0005541913.localdomain sudo[44655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:51 np0005541913.localdomain python3[44657]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:56:51 np0005541913.localdomain sudo[44655]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:51 np0005541913.localdomain sudo[44671]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnnuvealzhwytcyexrvijorjvgoszvtv ; /usr/bin/python3
Dec 02 07:56:51 np0005541913.localdomain sudo[44671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:52 np0005541913.localdomain python3[44673]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Dec 02 07:56:52 np0005541913.localdomain sudo[44671]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:53 np0005541913.localdomain sudo[44691]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghorjojxlukschkhamikdtmccollgqje ; /usr/bin/python3
Dec 02 07:56:53 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec 02 07:56:53 np0005541913.localdomain sudo[44691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:53 np0005541913.localdomain python3[44693]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:56:56 np0005541913.localdomain sudo[44691]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:56 np0005541913.localdomain sudo[44708]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uacspeqqwaytvvldvtbtjsutkndtopfl ; /usr/bin/python3
Dec 02 07:56:56 np0005541913.localdomain sudo[44708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:57 np0005541913.localdomain python3[44710]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 07:56:57 np0005541913.localdomain sudo[44708]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:57 np0005541913.localdomain sudo[44769]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpybtomhoqlwptsuzkxdcpcpsfjvxucj ; /usr/bin/python3
Dec 02 07:56:57 np0005541913.localdomain sudo[44769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:57 np0005541913.localdomain python3[44771]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:57 np0005541913.localdomain sudo[44769]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:57 np0005541913.localdomain sudo[44785]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nchlfpmbigactwunequzigzdeobqoylr ; /usr/bin/python3
Dec 02 07:56:57 np0005541913.localdomain sudo[44785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:58 np0005541913.localdomain python3[44787]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:56:58 np0005541913.localdomain sudo[44785]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:58 np0005541913.localdomain sudo[44844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkfixrqousdokhmleononagwzhqywpog ; /usr/bin/python3
Dec 02 07:56:58 np0005541913.localdomain sudo[44844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:58 np0005541913.localdomain python3[44846]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:58 np0005541913.localdomain sudo[44844]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:58 np0005541913.localdomain sudo[44887]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ricsnjqvfxwnjfxmnzjridsegqyezsqs ; /usr/bin/python3
Dec 02 07:56:58 np0005541913.localdomain sudo[44887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:59 np0005541913.localdomain python3[44889]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662218.2124321-76572-115944801792356/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=c2417559b11b6be9524eb43292c609dbba924ea1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:56:59 np0005541913.localdomain sudo[44887]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:59 np0005541913.localdomain sudo[44949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siysuxmekyqhpinorlalxlovfcmdbatd ; /usr/bin/python3
Dec 02 07:56:59 np0005541913.localdomain sudo[44949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:56:59 np0005541913.localdomain python3[44951]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:56:59 np0005541913.localdomain sudo[44949]: pam_unix(sudo:session): session closed for user root
Dec 02 07:56:59 np0005541913.localdomain sudo[44994]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhvdvdqyybniasgzijsdsbcwjugegeyz ; /usr/bin/python3
Dec 02 07:56:59 np0005541913.localdomain sudo[44994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:00 np0005541913.localdomain python3[44996]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662219.2084208-76682-13346576187072/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:00 np0005541913.localdomain sudo[44994]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:00 np0005541913.localdomain sudo[45024]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lddbnpbirghckgnvhbzwflmymsapqjjz ; /usr/bin/python3
Dec 02 07:57:00 np0005541913.localdomain sudo[45024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:00 np0005541913.localdomain python3[45026]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:00 np0005541913.localdomain sudo[45024]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:00 np0005541913.localdomain sudo[45040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edyqlwpwsaslmsietznlcufjdgsejeco ; /usr/bin/python3
Dec 02 07:57:00 np0005541913.localdomain sudo[45040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:00 np0005541913.localdomain python3[45042]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:00 np0005541913.localdomain sudo[45040]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:00 np0005541913.localdomain sudo[45056]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjhxjsgzgysjxzcovosybthfuzznjhcx ; /usr/bin/python3
Dec 02 07:57:00 np0005541913.localdomain sudo[45056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:00 np0005541913.localdomain python3[45058]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:00 np0005541913.localdomain sudo[45056]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:01 np0005541913.localdomain sudo[45072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmgmvixikduhyaiainebksbehnkontlw ; /usr/bin/python3
Dec 02 07:57:01 np0005541913.localdomain sudo[45072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:01 np0005541913.localdomain python3[45074]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:01 np0005541913.localdomain sudo[45072]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:01 np0005541913.localdomain sudo[45120]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlctniwwxpknimccmoescajhealknnim ; /usr/bin/python3
Dec 02 07:57:01 np0005541913.localdomain sudo[45120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:01 np0005541913.localdomain python3[45122]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:57:01 np0005541913.localdomain sudo[45120]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:02 np0005541913.localdomain sudo[45163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbziinsgyxrmgpegznulvefhzszinxbj ; /usr/bin/python3
Dec 02 07:57:02 np0005541913.localdomain sudo[45163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:02 np0005541913.localdomain python3[45165]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662221.6137989-76793-260819696414042/source _original_basename=tmp6ut5pp_4 follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:02 np0005541913.localdomain sudo[45163]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:02 np0005541913.localdomain sudo[45193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufsxsjtytrbdvzpbeiudgfsqwzhimzxg ; /usr/bin/python3
Dec 02 07:57:02 np0005541913.localdomain sudo[45193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:02 np0005541913.localdomain python3[45195]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:02 np0005541913.localdomain sudo[45193]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:02 np0005541913.localdomain sudo[45209]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alkmoquayeziuaxbgjrruvbgluejjzin ; /usr/bin/python3
Dec 02 07:57:02 np0005541913.localdomain sudo[45209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:03 np0005541913.localdomain python3[45211]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:03 np0005541913.localdomain sudo[45209]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:03 np0005541913.localdomain sudo[45225]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkbqealcglyntnzzunsuxpgofldkpenh ; /usr/bin/python3
Dec 02 07:57:03 np0005541913.localdomain sudo[45225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:03 np0005541913.localdomain python3[45227]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:57:06 np0005541913.localdomain sudo[45225]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:07 np0005541913.localdomain sudo[45274]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjuywwvrrnsoyhqcrxopmimrizcwqbmr ; /usr/bin/python3
Dec 02 07:57:07 np0005541913.localdomain sudo[45274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:07 np0005541913.localdomain python3[45276]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:57:07 np0005541913.localdomain sudo[45274]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:07 np0005541913.localdomain sudo[45319]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiyhcbeajfknspyxbtgovzzwoiwqvjtk ; /usr/bin/python3
Dec 02 07:57:07 np0005541913.localdomain sudo[45319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:07 np0005541913.localdomain python3[45321]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662227.1881635-77016-264859561248747/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:07 np0005541913.localdomain sudo[45319]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:08 np0005541913.localdomain sudo[45350]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuikktxoqjzvlqvtznxfjrhquasyugqg ; /usr/bin/python3
Dec 02 07:57:08 np0005541913.localdomain sudo[45350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:08 np0005541913.localdomain python3[45352]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 02 07:57:08 np0005541913.localdomain sshd[1130]: Received signal 15; terminating.
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: sshd.service: Consumed 2.042s CPU time, read 2.1M from disk, written 8.0K to disk.
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 02 07:57:08 np0005541913.localdomain sshd[45356]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 07:57:08 np0005541913.localdomain sshd[45356]: Server listening on 0.0.0.0 port 22.
Dec 02 07:57:08 np0005541913.localdomain sshd[45356]: Server listening on :: port 22.
Dec 02 07:57:08 np0005541913.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 02 07:57:08 np0005541913.localdomain sudo[45350]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:08 np0005541913.localdomain sudo[45370]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgidoprmdwwoelmebkxcdmgrvheohttc ; /usr/bin/python3
Dec 02 07:57:08 np0005541913.localdomain sudo[45370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:08 np0005541913.localdomain python3[45372]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:08 np0005541913.localdomain sudo[45370]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:09 np0005541913.localdomain sudo[45388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddgwhqhwprceifxoibjnexpgyzyxisrg ; /usr/bin/python3
Dec 02 07:57:09 np0005541913.localdomain sudo[45388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:09 np0005541913.localdomain python3[45390]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:09 np0005541913.localdomain sudo[45388]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:10 np0005541913.localdomain sudo[45406]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mulhuktndojaqklqttueqclakvebqrjm ; /usr/bin/python3
Dec 02 07:57:10 np0005541913.localdomain sudo[45406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:10 np0005541913.localdomain python3[45408]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:57:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 07:57:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3259 writes, 16K keys, 3259 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3259 writes, 145 syncs, 22.48 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3259 writes, 16K keys, 3259 commit groups, 1.0 writes per commit group, ingest: 14.69 MB, 0.02 MB/s
                                                          Interval WAL: 3259 writes, 145 syncs, 22.48 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 02 07:57:13 np0005541913.localdomain sudo[45406]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:14 np0005541913.localdomain sudo[45455]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vagrxqeebqkrukejxmkpuwcdkvmmqiwb ; /usr/bin/python3
Dec 02 07:57:14 np0005541913.localdomain sudo[45455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:14 np0005541913.localdomain python3[45457]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:57:14 np0005541913.localdomain sudo[45455]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:14 np0005541913.localdomain sudo[45473]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygsdpkozoimbyttuhlghrtkwtwezqpya ; /usr/bin/python3
Dec 02 07:57:14 np0005541913.localdomain sudo[45473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:14 np0005541913.localdomain python3[45475]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:14 np0005541913.localdomain sudo[45473]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:14 np0005541913.localdomain sudo[45503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdmrwiquzackxmtbbhhzbvwdaeusbyyz ; /usr/bin/python3
Dec 02 07:57:14 np0005541913.localdomain sudo[45503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 07:57:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Cumulative writes: 3388 writes, 16K keys, 3388 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3388 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3388 writes, 16K keys, 3388 commit groups, 1.0 writes per commit group, ingest: 15.29 MB, 0.03 MB/s
                                                          Interval WAL: 3388 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 02 07:57:15 np0005541913.localdomain python3[45505]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:57:15 np0005541913.localdomain sudo[45503]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:15 np0005541913.localdomain sudo[45553]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cynuywglimzheaafxdruqcjfebbeshvx ; /usr/bin/python3
Dec 02 07:57:15 np0005541913.localdomain sudo[45553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:15 np0005541913.localdomain python3[45555]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:57:15 np0005541913.localdomain sudo[45553]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:16 np0005541913.localdomain sudo[45571]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnrlmepipkuadkybroopdeodsxmbxpfy ; /usr/bin/python3
Dec 02 07:57:16 np0005541913.localdomain sudo[45571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:16 np0005541913.localdomain python3[45573]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:16 np0005541913.localdomain sudo[45571]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:16 np0005541913.localdomain sudo[45601]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lctyqvjiqnoyzqyhvbwgpedblfmxhxld ; /usr/bin/python3
Dec 02 07:57:16 np0005541913.localdomain sudo[45601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:16 np0005541913.localdomain python3[45603]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 07:57:16 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:57:16 np0005541913.localdomain systemd-rc-local-generator[45624]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:57:16 np0005541913.localdomain systemd-sysv-generator[45628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:57:16 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:57:16 np0005541913.localdomain systemd[1]: Starting chronyd online sources service...
Dec 02 07:57:16 np0005541913.localdomain chronyc[45642]: 200 OK
Dec 02 07:57:16 np0005541913.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Dec 02 07:57:16 np0005541913.localdomain systemd[1]: Finished chronyd online sources service.
Dec 02 07:57:17 np0005541913.localdomain sudo[45601]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:17 np0005541913.localdomain sudo[45656]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtkyiqskzkpzlhactquknhveziyjcymf ; /usr/bin/python3
Dec 02 07:57:17 np0005541913.localdomain sudo[45656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:17 np0005541913.localdomain python3[45658]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:17 np0005541913.localdomain chronyd[25712]: System clock was stepped by -0.000069 seconds
Dec 02 07:57:17 np0005541913.localdomain sudo[45656]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:17 np0005541913.localdomain sudo[45673]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkeqelehywhbcgsqtdfpqddaqulycxps ; /usr/bin/python3
Dec 02 07:57:17 np0005541913.localdomain sudo[45673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:17 np0005541913.localdomain python3[45675]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:17 np0005541913.localdomain sudo[45673]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:18 np0005541913.localdomain sudo[45690]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmqinddnrtzjgtvxdcmgzelarscovryr ; /usr/bin/python3
Dec 02 07:57:18 np0005541913.localdomain sudo[45690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:18 np0005541913.localdomain python3[45692]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:18 np0005541913.localdomain chronyd[25712]: System clock was stepped by 0.000000 seconds
Dec 02 07:57:18 np0005541913.localdomain sudo[45690]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:18 np0005541913.localdomain sudo[45707]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfvjiaolgofujjtqezrxrkjgetxxyaec ; /usr/bin/python3
Dec 02 07:57:18 np0005541913.localdomain sudo[45707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:18 np0005541913.localdomain python3[45709]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:18 np0005541913.localdomain sudo[45707]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:18 np0005541913.localdomain sudo[45724]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiszafcibwqxfvpgwbkeyrzrevzqretg ; /usr/bin/python3
Dec 02 07:57:18 np0005541913.localdomain sudo[45724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:18 np0005541913.localdomain python3[45726]: ansible-timezone Invoked with name=UTC hwclock=None
Dec 02 07:57:19 np0005541913.localdomain systemd[1]: Starting Time & Date Service...
Dec 02 07:57:19 np0005541913.localdomain systemd[1]: Started Time & Date Service.
Dec 02 07:57:20 np0005541913.localdomain sudo[45724]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:20 np0005541913.localdomain sudo[45744]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfmnnzfwnqqlenrpuhwivklqnjvgfwdo ; /usr/bin/python3
Dec 02 07:57:20 np0005541913.localdomain sudo[45744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:21 np0005541913.localdomain python3[45746]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:21 np0005541913.localdomain sudo[45744]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:21 np0005541913.localdomain sudo[45761]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyujwbybjsvreapydyholnkuhbzvutyr ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 02 07:57:21 np0005541913.localdomain sudo[45761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:21 np0005541913.localdomain python3[45763]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:21 np0005541913.localdomain sudo[45761]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:22 np0005541913.localdomain sudo[45778]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbhkavfdvsikhvdwtsbbiqephyimqbpq ; /usr/bin/python3
Dec 02 07:57:22 np0005541913.localdomain sudo[45778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:22 np0005541913.localdomain python3[45780]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Dec 02 07:57:22 np0005541913.localdomain sudo[45778]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:22 np0005541913.localdomain sudo[45794]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuyqjzvurfhzcaozqfiexgbjzgzaxwnv ; /usr/bin/python3
Dec 02 07:57:22 np0005541913.localdomain sudo[45794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:22 np0005541913.localdomain python3[45796]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:57:22 np0005541913.localdomain sudo[45794]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:22 np0005541913.localdomain sudo[45810]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlmfbutzdzrmublaayfyacwdoooqvhsw ; /usr/bin/python3
Dec 02 07:57:22 np0005541913.localdomain sudo[45810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:23 np0005541913.localdomain python3[45812]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:23 np0005541913.localdomain sudo[45810]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:23 np0005541913.localdomain sudo[45826]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akcmhfwwksiiuahzncdrjmzcpvuuykpo ; /usr/bin/python3
Dec 02 07:57:23 np0005541913.localdomain sudo[45826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:23 np0005541913.localdomain python3[45828]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:23 np0005541913.localdomain sudo[45826]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:23 np0005541913.localdomain sudo[45874]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydhsexjxzgnrjyfhfkevawbzgwynjjsv ; /usr/bin/python3
Dec 02 07:57:23 np0005541913.localdomain sudo[45874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:23 np0005541913.localdomain python3[45876]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:57:23 np0005541913.localdomain sudo[45874]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:24 np0005541913.localdomain sudo[45917]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utlblgqlckhihbocpxgdjlgjapngvkws ; /usr/bin/python3
Dec 02 07:57:24 np0005541913.localdomain sudo[45917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:24 np0005541913.localdomain python3[45919]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662243.5780742-78161-8054668052950/source _original_basename=tmp24xzf3zx follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:24 np0005541913.localdomain sudo[45917]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:24 np0005541913.localdomain sudo[45979]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkgyyizhqbqdkbjzxacbmidyreztsdwg ; /usr/bin/python3
Dec 02 07:57:24 np0005541913.localdomain sudo[45979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:24 np0005541913.localdomain python3[45981]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:57:24 np0005541913.localdomain sudo[45979]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:24 np0005541913.localdomain sudo[46022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykuyrkoalwdnepifqhfhdidvgwdbtnen ; /usr/bin/python3
Dec 02 07:57:24 np0005541913.localdomain sudo[46022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:25 np0005541913.localdomain python3[46024]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662244.4650989-78213-258433896603807/source _original_basename=tmpakzmg5ne follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:25 np0005541913.localdomain sudo[46022]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:25 np0005541913.localdomain sudo[46052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwtoxpchwkudvsbujekwqymvjewyxfbt ; /usr/bin/python3
Dec 02 07:57:25 np0005541913.localdomain sudo[46052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:25 np0005541913.localdomain python3[46054]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 07:57:25 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:57:25 np0005541913.localdomain systemd-sysv-generator[46087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:57:25 np0005541913.localdomain systemd-rc-local-generator[46080]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:57:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:57:26 np0005541913.localdomain sudo[46052]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:26 np0005541913.localdomain sudo[46106]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abbfcldxlcpkurpkpbhvdzrgbxttwanj ; /usr/bin/python3
Dec 02 07:57:26 np0005541913.localdomain sudo[46106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:26 np0005541913.localdomain python3[46108]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:26 np0005541913.localdomain sudo[46106]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:26 np0005541913.localdomain sudo[46122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgsfkpivcpaewpjifvljrdsokntscuvn ; /usr/bin/python3
Dec 02 07:57:26 np0005541913.localdomain sudo[46122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:26 np0005541913.localdomain python3[46124]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:26 np0005541913.localdomain systemd[35843]: Created slice User Background Tasks Slice.
Dec 02 07:57:26 np0005541913.localdomain systemd[35843]: Starting Cleanup of User's Temporary Files and Directories...
Dec 02 07:57:26 np0005541913.localdomain sudo[46122]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:26 np0005541913.localdomain systemd[35843]: Finished Cleanup of User's Temporary Files and Directories.
Dec 02 07:57:26 np0005541913.localdomain sudo[46140]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blzcmirctuzwfpzftsgdkoqwblkuzxvn ; /usr/bin/python3
Dec 02 07:57:26 np0005541913.localdomain sudo[46140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:27 np0005541913.localdomain python3[46142]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:57:27 np0005541913.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Dec 02 07:57:27 np0005541913.localdomain sudo[46140]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:27 np0005541913.localdomain sudo[46157]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utoaozgbrmyrogugwtrisdxocqqlxtkl ; /usr/bin/python3
Dec 02 07:57:27 np0005541913.localdomain sudo[46157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:27 np0005541913.localdomain python3[46159]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:27 np0005541913.localdomain sudo[46157]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:27 np0005541913.localdomain sudo[46173]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rciubgiaehrapcdgjwioletmdzrpqnya ; /usr/bin/python3
Dec 02 07:57:27 np0005541913.localdomain sudo[46173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:27 np0005541913.localdomain python3[46175]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:27 np0005541913.localdomain sudo[46173]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:28 np0005541913.localdomain sudo[46221]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oypwlcgatwspvbabljibbridgksqmdcc ; /usr/bin/python3
Dec 02 07:57:28 np0005541913.localdomain sudo[46221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:28 np0005541913.localdomain python3[46223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:57:28 np0005541913.localdomain sudo[46221]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:28 np0005541913.localdomain sudo[46264]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhyzmmpraktitddjfsfvfyfzjxihtmhd ; /usr/bin/python3
Dec 02 07:57:28 np0005541913.localdomain sudo[46264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:28 np0005541913.localdomain python3[46266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662247.886411-78380-122593191027078/source _original_basename=tmp9y49mo2w follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:28 np0005541913.localdomain sudo[46264]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:49 np0005541913.localdomain sudo[46281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:57:49 np0005541913.localdomain sudo[46281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:57:49 np0005541913.localdomain sudo[46281]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:49 np0005541913.localdomain sudo[46296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:57:49 np0005541913.localdomain sudo[46296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:57:49 np0005541913.localdomain sudo[46324]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juwfjvjgpxhuofrtvmazdvfjeycbcfik ; /usr/bin/python3
Dec 02 07:57:49 np0005541913.localdomain sudo[46324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:49 np0005541913.localdomain python3[46326]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 07:57:49 np0005541913.localdomain sudo[46324]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:50 np0005541913.localdomain sudo[46354]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-webvsquiacpaspydapnuizerbkrxkekz ; /usr/bin/python3
Dec 02 07:57:50 np0005541913.localdomain sudo[46354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:50 np0005541913.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 02 07:57:50 np0005541913.localdomain python3[46359]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Dec 02 07:57:50 np0005541913.localdomain sudo[46354]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:50 np0005541913.localdomain sudo[46296]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:50 np0005541913.localdomain sudo[46389]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhoebosekooigjgvxeoaibrqvtqorweq ; /usr/bin/python3
Dec 02 07:57:50 np0005541913.localdomain sudo[46389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:50 np0005541913.localdomain python3[46391]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 07:57:50 np0005541913.localdomain sudo[46389]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:50 np0005541913.localdomain sudo[46405]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vimclpvewswctrarbvlturmpkrzkujrn ; /usr/bin/python3
Dec 02 07:57:50 np0005541913.localdomain sudo[46405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:50 np0005541913.localdomain python3[46407]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:50 np0005541913.localdomain sudo[46405]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:51 np0005541913.localdomain sudo[46421]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swhtukixsihwpaddjzwocibfjzdrgmqx ; /usr/bin/python3
Dec 02 07:57:51 np0005541913.localdomain sudo[46421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:51 np0005541913.localdomain python3[46423]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:57:51 np0005541913.localdomain sudo[46421]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:51 np0005541913.localdomain sudo[46437]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itefrnhwqowoizekrjkallrjjhzkbonj ; /usr/bin/python3
Dec 02 07:57:51 np0005541913.localdomain sudo[46437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:51 np0005541913.localdomain python3[46439]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 02 07:57:51 np0005541913.localdomain sudo[46440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:57:51 np0005541913.localdomain sudo[46440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:57:51 np0005541913.localdomain sudo[46440]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:52 np0005541913.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Dec 02 07:57:52 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 07:57:52 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 07:57:52 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 07:57:52 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 07:57:52 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 07:57:52 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 07:57:52 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 07:57:52 np0005541913.localdomain sudo[46437]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:52 np0005541913.localdomain sudo[46473]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpnfjxicuvoygrkzjheyeiqxlkkioykz ; /usr/bin/python3
Dec 02 07:57:52 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 02 07:57:52 np0005541913.localdomain sudo[46473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:52 np0005541913.localdomain python3[46475]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 07:57:52 np0005541913.localdomain sudo[46473]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:53 np0005541913.localdomain sudo[46489]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzloenrqmbresahfdkkaxkzfuxbioawj ; /usr/bin/python3
Dec 02 07:57:53 np0005541913.localdomain sudo[46489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:53 np0005541913.localdomain sudo[46489]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:53 np0005541913.localdomain sudo[46537]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lachmfjyxnvappxlmwsvxqxhgltaikzh ; /usr/bin/python3
Dec 02 07:57:53 np0005541913.localdomain sudo[46537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:53 np0005541913.localdomain sudo[46537]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:53 np0005541913.localdomain sudo[46580]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewwlopcrcogoaknlfjplthaqtkwwzzvd ; /usr/bin/python3
Dec 02 07:57:53 np0005541913.localdomain sudo[46580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:54 np0005541913.localdomain sudo[46580]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:54 np0005541913.localdomain sudo[46610]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjbnjdnkbdhifkbmrcmsfezwcoqbluxy ; /usr/bin/python3
Dec 02 07:57:54 np0005541913.localdomain sudo[46610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:54 np0005541913.localdomain python3[46612]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Dec 02 07:57:54 np0005541913.localdomain sudo[46610]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:54 np0005541913.localdomain rsyslogd[754]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Dec 02 07:57:55 np0005541913.localdomain sudo[46626]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izeadartlktvunxkewtlocbxjfogoqid ; /usr/bin/python3
Dec 02 07:57:55 np0005541913.localdomain sudo[46626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:55 np0005541913.localdomain python3[46628]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 07:57:55 np0005541913.localdomain sudo[46626]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:55 np0005541913.localdomain sudo[46642]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioxdbokuabthswvvihvmskklipkmkjyr ; /usr/bin/python3
Dec 02 07:57:55 np0005541913.localdomain sudo[46642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:55 np0005541913.localdomain python3[46644]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 07:57:55 np0005541913.localdomain sudo[46642]: pam_unix(sudo:session): session closed for user root
Dec 02 07:57:55 np0005541913.localdomain sudo[46658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egypyptivltjhwrptmwoeefvzmiuusoo ; /usr/bin/python3
Dec 02 07:57:55 np0005541913.localdomain sudo[46658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:57:56 np0005541913.localdomain python3[46660]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Dec 02 07:57:56 np0005541913.localdomain sudo[46658]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:00 np0005541913.localdomain sudo[46706]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyaiclyblurlmfkfwqueasgcaqjmjofp ; /usr/bin/python3
Dec 02 07:58:00 np0005541913.localdomain sudo[46706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:00 np0005541913.localdomain python3[46708]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:58:00 np0005541913.localdomain sudo[46706]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:00 np0005541913.localdomain sudo[46749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgyasjcedpaocrgptlmfvsynocsrowel ; /usr/bin/python3
Dec 02 07:58:00 np0005541913.localdomain sudo[46749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:01 np0005541913.localdomain python3[46751]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662280.4645784-79786-116757097494743/source _original_basename=tmpgkd2kwug follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:58:01 np0005541913.localdomain sudo[46749]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:01 np0005541913.localdomain sudo[46779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apsojxathhkzwfrfzmorczesjgtrdleq ; /usr/bin/python3
Dec 02 07:58:01 np0005541913.localdomain sudo[46779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:01 np0005541913.localdomain python3[46781]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:58:01 np0005541913.localdomain sudo[46779]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:02 np0005541913.localdomain sudo[46829]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgzztxbldhmggstjjyjnvczmnxqaqsuk ; /usr/bin/python3
Dec 02 07:58:02 np0005541913.localdomain sudo[46829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:02 np0005541913.localdomain sudo[46829]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:02 np0005541913.localdomain sudo[46872]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrdjqvjnqxqneqvrtoyyjqhfuheyepws ; /usr/bin/python3
Dec 02 07:58:02 np0005541913.localdomain sudo[46872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:02 np0005541913.localdomain sudo[46872]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:03 np0005541913.localdomain sudo[46902]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtqejepvarfdurddgzdnfppitqgrkbsn ; /usr/bin/python3
Dec 02 07:58:03 np0005541913.localdomain sudo[46902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:03 np0005541913.localdomain python3[46904]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 07:58:03 np0005541913.localdomain sudo[46902]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:03 np0005541913.localdomain sudo[46950]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjjijyquhuaqsopzkryjtmzridnjeuhn ; /usr/bin/python3
Dec 02 07:58:03 np0005541913.localdomain sudo[46950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:03 np0005541913.localdomain sudo[46950]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:04 np0005541913.localdomain sudo[46993]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjupuxzzocqaqvtwhchnodocrymhcire ; /usr/bin/python3
Dec 02 07:58:04 np0005541913.localdomain sudo[46993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:04 np0005541913.localdomain sudo[46993]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:04 np0005541913.localdomain sudo[47023]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quedbglsnlpagxmxzxtlkokhunxgdgvr ; /usr/bin/python3
Dec 02 07:58:04 np0005541913.localdomain sudo[47023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:04 np0005541913.localdomain python3[47025]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 02 07:58:04 np0005541913.localdomain sudo[47023]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:07 np0005541913.localdomain sudo[47039]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nllhjlwqwvsxvmaxtsqhtueavaydkbkj ; /usr/bin/python3
Dec 02 07:58:07 np0005541913.localdomain sudo[47039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:07 np0005541913.localdomain python3[47041]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:58:07 np0005541913.localdomain sudo[47039]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:08 np0005541913.localdomain sudo[47056]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czicdhqdplkjxjezfpujquwxfwflwqug ; /usr/bin/python3
Dec 02 07:58:08 np0005541913.localdomain sudo[47056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:08 np0005541913.localdomain python3[47058]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 07:58:12 np0005541913.localdomain dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Dec 02 07:58:12 np0005541913.localdomain dbus-broker-launch[18431]: Noticed file-system modification, trigger reload.
Dec 02 07:58:12 np0005541913.localdomain dbus-broker-launch[18431]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 02 07:58:12 np0005541913.localdomain dbus-broker-launch[18431]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 02 07:58:12 np0005541913.localdomain dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Dec 02 07:58:12 np0005541913.localdomain systemd[1]: Reexecuting.
Dec 02 07:58:12 np0005541913.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 02 07:58:12 np0005541913.localdomain systemd[1]: Detected virtualization kvm.
Dec 02 07:58:12 np0005541913.localdomain systemd[1]: Detected architecture x86-64.
Dec 02 07:58:12 np0005541913.localdomain systemd-rc-local-generator[47108]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:58:12 np0005541913.localdomain systemd-sysv-generator[47114]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:58:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:58:21 np0005541913.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Dec 02 07:58:21 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 07:58:21 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 07:58:21 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 07:58:21 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 07:58:21 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 07:58:21 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 07:58:21 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 07:58:21 np0005541913.localdomain dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Dec 02 07:58:21 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 02 07:58:21 np0005541913.localdomain dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:58:22 np0005541913.localdomain systemd-rc-local-generator[47202]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:58:22 np0005541913.localdomain systemd-sysv-generator[47206]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Stopping Journal Service...
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 02 07:58:22 np0005541913.localdomain systemd-journald[619]: Received SIGTERM from PID 1 (systemd).
Dec 02 07:58:22 np0005541913.localdomain systemd-journald[619]: Journal stopped
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Stopped Journal Service.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: systemd-journald.service: Consumed 2.394s CPU time.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Starting Journal Service...
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: systemd-udevd.service: Consumed 2.796s CPU time.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 02 07:58:22 np0005541913.localdomain systemd-journald[47611]: Journal started
Dec 02 07:58:22 np0005541913.localdomain systemd-journald[47611]: Runtime Journal (/run/log/journal/510530184876bdc0ebb29e7199f63471) is 12.1M, max 314.7M, 302.6M free.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Started Journal Service.
Dec 02 07:58:22 np0005541913.localdomain systemd-journald[47611]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Dec 02 07:58:22 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 07:58:22 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 07:58:22 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 07:58:22 np0005541913.localdomain systemd-udevd[47615]: Using default interface naming scheme 'rhel-9.0'.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 02 07:58:22 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 07:58:23 np0005541913.localdomain systemd-rc-local-generator[48125]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 07:58:23 np0005541913.localdomain systemd-sysv-generator[48129]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 07:58:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 07:58:23 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 07:58:23 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 07:58:23 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 07:58:23 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.311s CPU time.
Dec 02 07:58:23 np0005541913.localdomain systemd[1]: run-r6161ad4a52914a2aa570c55418ab4a33.service: Deactivated successfully.
Dec 02 07:58:23 np0005541913.localdomain systemd[1]: run-r2b8b533960ea431da1e01c2576fcd26a.service: Deactivated successfully.
Dec 02 07:58:24 np0005541913.localdomain sudo[47056]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:25 np0005541913.localdomain sudo[48551]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eptiaptwqmxdquqemaogywwrktwpeiod ; /usr/bin/python3
Dec 02 07:58:25 np0005541913.localdomain sudo[48551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:25 np0005541913.localdomain python3[48553]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Dec 02 07:58:25 np0005541913.localdomain sudo[48551]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:25 np0005541913.localdomain sudo[48570]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxfowqvkeynjqprzziegstkahhpyetey ; /usr/bin/python3
Dec 02 07:58:25 np0005541913.localdomain sudo[48570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:25 np0005541913.localdomain python3[48572]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 07:58:25 np0005541913.localdomain sudo[48570]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:26 np0005541913.localdomain sudo[48588]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdibjhpwfjwdhemdtvwlwjhthmgbarof ; /usr/bin/python3
Dec 02 07:58:26 np0005541913.localdomain sudo[48588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:26 np0005541913.localdomain python3[48590]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:58:26 np0005541913.localdomain python3[48590]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Dec 02 07:58:26 np0005541913.localdomain python3[48590]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Dec 02 07:58:34 np0005541913.localdomain podman[48602]: 2025-12-02 07:58:26.65340634 +0000 UTC m=+0.050326178 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 02 07:58:34 np0005541913.localdomain python3[48590]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Dec 02 07:58:34 np0005541913.localdomain sudo[48588]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:34 np0005541913.localdomain sudo[48701]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihakwybtotsvbwzjxhhdxenvosnwfzxk ; /usr/bin/python3
Dec 02 07:58:34 np0005541913.localdomain sudo[48701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:34 np0005541913.localdomain python3[48703]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:58:34 np0005541913.localdomain python3[48703]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Dec 02 07:58:34 np0005541913.localdomain python3[48703]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Dec 02 07:58:42 np0005541913.localdomain podman[48715]: 2025-12-02 07:58:35.006137911 +0000 UTC m=+0.044771792 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 02 07:58:42 np0005541913.localdomain python3[48703]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Dec 02 07:58:42 np0005541913.localdomain sudo[48701]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:42 np0005541913.localdomain sudo[48815]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcapbabooysrbelpabgygwbhdotrimli ; /usr/bin/python3
Dec 02 07:58:42 np0005541913.localdomain sudo[48815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:58:43 np0005541913.localdomain python3[48817]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:58:43 np0005541913.localdomain python3[48817]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Dec 02 07:58:43 np0005541913.localdomain python3[48817]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Dec 02 07:58:51 np0005541913.localdomain sudo[48949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:58:51 np0005541913.localdomain sudo[48949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:58:51 np0005541913.localdomain sudo[48949]: pam_unix(sudo:session): session closed for user root
Dec 02 07:58:52 np0005541913.localdomain sudo[49167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 07:58:52 np0005541913.localdomain sudo[49167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:59:01 np0005541913.localdomain podman[48831]: 2025-12-02 07:58:43.204949959 +0000 UTC m=+0.047198135 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 07:59:01 np0005541913.localdomain python3[48817]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Dec 02 07:59:01 np0005541913.localdomain sudo[48815]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:01 np0005541913.localdomain sudo[49695]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kktvztqeqzxmhvpnskbkwxdbnbuvaymb ; /usr/bin/python3
Dec 02 07:59:01 np0005541913.localdomain sudo[49695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:01 np0005541913.localdomain systemd[1]: tmp-crun.xYTdwa.mount: Deactivated successfully.
Dec 02 07:59:01 np0005541913.localdomain podman[49693]: 2025-12-02 07:59:01.570018219 +0000 UTC m=+0.115813341 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, RELEASE=main, io.openshift.tags=rhceph ceph, release=1763362218, version=7, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.41.4, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 07:59:01 np0005541913.localdomain python3[49702]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:59:01 np0005541913.localdomain python3[49702]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Dec 02 07:59:01 np0005541913.localdomain python3[49702]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Dec 02 07:59:01 np0005541913.localdomain podman[49693]: 2025-12-02 07:59:01.718142287 +0000 UTC m=+0.263937319 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, name=rhceph, io.buildah.version=1.41.4, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=)
Dec 02 07:59:01 np0005541913.localdomain sudo[49167]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:02 np0005541913.localdomain sudo[49787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 07:59:02 np0005541913.localdomain sudo[49787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:59:02 np0005541913.localdomain sudo[49787]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:02 np0005541913.localdomain sudo[49802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 07:59:02 np0005541913.localdomain sudo[49802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:59:02 np0005541913.localdomain sudo[49802]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:03 np0005541913.localdomain sudo[49874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 07:59:03 np0005541913.localdomain sudo[49874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 07:59:03 np0005541913.localdomain sudo[49874]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:17 np0005541913.localdomain podman[49729]: 2025-12-02 07:59:01.717270395 +0000 UTC m=+0.046088347 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 07:59:17 np0005541913.localdomain python3[49702]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Dec 02 07:59:17 np0005541913.localdomain sudo[49695]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:17 np0005541913.localdomain sudo[49929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjasbwwpmegyzbzjzfpzkiutncwluxla ; /usr/bin/python3
Dec 02 07:59:17 np0005541913.localdomain sudo[49929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:17 np0005541913.localdomain python3[49931]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:59:17 np0005541913.localdomain python3[49931]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Dec 02 07:59:17 np0005541913.localdomain python3[49931]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Dec 02 07:59:24 np0005541913.localdomain podman[49945]: 2025-12-02 07:59:17.636254064 +0000 UTC m=+0.041411249 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 02 07:59:24 np0005541913.localdomain python3[49931]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Dec 02 07:59:24 np0005541913.localdomain sudo[49929]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:24 np0005541913.localdomain sudo[50285]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxhxkdknyrmbdinhxxnmgdywhbkaotsd ; /usr/bin/python3
Dec 02 07:59:24 np0005541913.localdomain sudo[50285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:24 np0005541913.localdomain python3[50287]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:59:24 np0005541913.localdomain python3[50287]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Dec 02 07:59:24 np0005541913.localdomain python3[50287]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Dec 02 07:59:29 np0005541913.localdomain podman[50299]: 2025-12-02 07:59:24.531149034 +0000 UTC m=+0.041659215 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 02 07:59:29 np0005541913.localdomain python3[50287]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Dec 02 07:59:29 np0005541913.localdomain sudo[50285]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:29 np0005541913.localdomain sudo[50376]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xovreddcpsjyfvsqnrfahqomlgytqxvf ; /usr/bin/python3
Dec 02 07:59:29 np0005541913.localdomain sudo[50376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:29 np0005541913.localdomain python3[50378]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:59:29 np0005541913.localdomain python3[50378]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Dec 02 07:59:29 np0005541913.localdomain python3[50378]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Dec 02 07:59:31 np0005541913.localdomain podman[50390]: 2025-12-02 07:59:29.831405019 +0000 UTC m=+0.038341620 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 02 07:59:31 np0005541913.localdomain python3[50378]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Dec 02 07:59:31 np0005541913.localdomain sudo[50376]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:32 np0005541913.localdomain sudo[50466]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zddltzcrodhuvhfinmuffiqqswqegejl ; /usr/bin/python3
Dec 02 07:59:32 np0005541913.localdomain sudo[50466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:32 np0005541913.localdomain python3[50468]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:59:32 np0005541913.localdomain python3[50468]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Dec 02 07:59:32 np0005541913.localdomain python3[50468]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Dec 02 07:59:34 np0005541913.localdomain podman[50481]: 2025-12-02 07:59:32.280853304 +0000 UTC m=+0.050851388 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 02 07:59:34 np0005541913.localdomain python3[50468]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json
Dec 02 07:59:34 np0005541913.localdomain sudo[50466]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:35 np0005541913.localdomain sudo[50557]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwehjeflrgsgiavxfmdtbbhzaxxeusxd ; /usr/bin/python3
Dec 02 07:59:35 np0005541913.localdomain sudo[50557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:35 np0005541913.localdomain python3[50559]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:59:35 np0005541913.localdomain python3[50559]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Dec 02 07:59:35 np0005541913.localdomain python3[50559]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Dec 02 07:59:37 np0005541913.localdomain podman[50571]: 2025-12-02 07:59:35.293712433 +0000 UTC m=+0.045543423 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 02 07:59:37 np0005541913.localdomain python3[50559]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json
Dec 02 07:59:37 np0005541913.localdomain sudo[50557]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:37 np0005541913.localdomain sudo[50647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdphxvzanimbstrfggwvcrtzuqrpcrct ; /usr/bin/python3
Dec 02 07:59:37 np0005541913.localdomain sudo[50647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:37 np0005541913.localdomain python3[50649]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:59:37 np0005541913.localdomain python3[50649]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Dec 02 07:59:38 np0005541913.localdomain python3[50649]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Dec 02 07:59:41 np0005541913.localdomain podman[50661]: 2025-12-02 07:59:38.058394633 +0000 UTC m=+0.032357729 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 02 07:59:41 np0005541913.localdomain python3[50649]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json
Dec 02 07:59:41 np0005541913.localdomain sudo[50647]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:42 np0005541913.localdomain sudo[50748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypocengpoeevltcgzzudkdjdqloidrbi ; /usr/bin/python3
Dec 02 07:59:42 np0005541913.localdomain sudo[50748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:42 np0005541913.localdomain python3[50750]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 02 07:59:42 np0005541913.localdomain python3[50750]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Dec 02 07:59:42 np0005541913.localdomain python3[50750]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Dec 02 07:59:44 np0005541913.localdomain podman[50764]: 2025-12-02 07:59:42.318556666 +0000 UTC m=+0.046503699 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 02 07:59:44 np0005541913.localdomain python3[50750]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json
Dec 02 07:59:45 np0005541913.localdomain sudo[50748]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:45 np0005541913.localdomain sudo[50838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bauejtkdfpsbwknidzxriqghphdnggra ; /usr/bin/python3
Dec 02 07:59:45 np0005541913.localdomain sudo[50838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:45 np0005541913.localdomain python3[50840]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:59:45 np0005541913.localdomain sudo[50838]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:46 np0005541913.localdomain sudo[50888]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkihbdjemtefeowdsyiuozrnfcsspbko ; /usr/bin/python3
Dec 02 07:59:46 np0005541913.localdomain sudo[50888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:46 np0005541913.localdomain sudo[50888]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:46 np0005541913.localdomain sudo[50906]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idnzqodbrhjqqmfinssjekuxczevgnil ; /usr/bin/python3
Dec 02 07:59:46 np0005541913.localdomain sudo[50906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:46 np0005541913.localdomain sudo[50906]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:47 np0005541913.localdomain sudo[51010]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcoquhzuprmkbuhsdafubzgnbstfvyjd ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662386.7326825-82665-83397263209155/async_wrapper.py 241885978677 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662386.7326825-82665-83397263209155/AnsiballZ_command.py _
Dec 02 07:59:47 np0005541913.localdomain sudo[51010]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 07:59:47 np0005541913.localdomain ansible-async_wrapper.py[51012]: Invoked with 241885978677 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662386.7326825-82665-83397263209155/AnsiballZ_command.py _
Dec 02 07:59:47 np0005541913.localdomain ansible-async_wrapper.py[51015]: Starting module and watcher
Dec 02 07:59:47 np0005541913.localdomain ansible-async_wrapper.py[51015]: Start watching 51016 (3600)
Dec 02 07:59:47 np0005541913.localdomain ansible-async_wrapper.py[51016]: Start module (51016)
Dec 02 07:59:47 np0005541913.localdomain ansible-async_wrapper.py[51012]: Return async_wrapper task started.
Dec 02 07:59:47 np0005541913.localdomain sudo[51010]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:47 np0005541913.localdomain sudo[51031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpldflfoompguijbivaicbfhcfvluxoc ; /usr/bin/python3
Dec 02 07:59:47 np0005541913.localdomain sudo[51031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:47 np0005541913.localdomain python3[51036]: ansible-ansible.legacy.async_status Invoked with jid=241885978677.51012 mode=status _async_dir=/tmp/.ansible_async
Dec 02 07:59:47 np0005541913.localdomain sudo[51031]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:    (file: /etc/puppet/hiera.yaml)
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Warning: Undefined variable '::deploy_config_name';
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:    (file & line not available)
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:    (file & line not available)
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.12 seconds
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Notice: Applied catalog in 0.05 seconds
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Application:
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:    Initial environment: production
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:    Converged environment: production
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:          Run mode: user
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Changes:
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:             Total: 3
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Events:
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:           Success: 3
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:             Total: 3
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Resources:
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:           Changed: 3
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:       Out of sync: 3
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:             Total: 10
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Time:
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:          Schedule: 0.00
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:              File: 0.00
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:              Exec: 0.01
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:            Augeas: 0.02
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:    Transaction evaluation: 0.05
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:    Catalog application: 0.05
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:    Config retrieval: 0.15
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:          Last run: 1764662391
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:        Filebucket: 0.00
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:             Total: 0.05
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]: Version:
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:            Config: 1764662391
Dec 02 07:59:51 np0005541913.localdomain puppet-user[51035]:            Puppet: 7.10.0
Dec 02 07:59:51 np0005541913.localdomain ansible-async_wrapper.py[51016]: Module complete (51016)
Dec 02 07:59:52 np0005541913.localdomain ansible-async_wrapper.py[51015]: Done in kid B.
Dec 02 07:59:57 np0005541913.localdomain sudo[51264]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tezkxljbfqbwgxtvjhxmmuldjtzjpxzv ; /usr/bin/python3
Dec 02 07:59:57 np0005541913.localdomain sudo[51264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:57 np0005541913.localdomain python3[51266]: ansible-ansible.legacy.async_status Invoked with jid=241885978677.51012 mode=status _async_dir=/tmp/.ansible_async
Dec 02 07:59:57 np0005541913.localdomain sudo[51264]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:58 np0005541913.localdomain sudo[51280]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyvqvyxtnwbtmknqcosjmsootoaldnoy ; /usr/bin/python3
Dec 02 07:59:58 np0005541913.localdomain sudo[51280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:58 np0005541913.localdomain python3[51282]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 07:59:58 np0005541913.localdomain sudo[51280]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:58 np0005541913.localdomain sudo[51296]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiouckrlzwahqjutqqqrohxlhfgllklu ; /usr/bin/python3
Dec 02 07:59:58 np0005541913.localdomain sudo[51296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:58 np0005541913.localdomain python3[51298]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 07:59:58 np0005541913.localdomain sudo[51296]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:59 np0005541913.localdomain sudo[51344]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muudfdtpghavivhjbtershcrsysysnsq ; /usr/bin/python3
Dec 02 07:59:59 np0005541913.localdomain sudo[51344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:59 np0005541913.localdomain python3[51346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 07:59:59 np0005541913.localdomain sudo[51344]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:59 np0005541913.localdomain sudo[51387]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dogdwhqaywtuvpcscotgkoyyshjytkwo ; /usr/bin/python3
Dec 02 07:59:59 np0005541913.localdomain sudo[51387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:59 np0005541913.localdomain python3[51389]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662398.962913-82874-175783869190601/source _original_basename=tmpnlkqsi8w follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 07:59:59 np0005541913.localdomain sudo[51387]: pam_unix(sudo:session): session closed for user root
Dec 02 07:59:59 np0005541913.localdomain sudo[51417]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shwbammagayocciqyeeyrhyvuxqhbdyh ; /usr/bin/python3
Dec 02 07:59:59 np0005541913.localdomain sudo[51417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 07:59:59 np0005541913.localdomain python3[51419]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 07:59:59 np0005541913.localdomain sudo[51417]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:00 np0005541913.localdomain sudo[51433]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooumwgwblkavpsqszmikuupucccntycu ; /usr/bin/python3
Dec 02 08:00:00 np0005541913.localdomain sudo[51433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:00 np0005541913.localdomain sudo[51433]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:01 np0005541913.localdomain sudo[51520]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cteckajkltcuuddvnwnyfoczggtqhcjd ; /usr/bin/python3
Dec 02 08:00:01 np0005541913.localdomain sudo[51520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:01 np0005541913.localdomain python3[51522]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 02 08:00:01 np0005541913.localdomain sudo[51520]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:01 np0005541913.localdomain sudo[51539]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyjfyjgdidejrnaygonesjtucelugtgr ; /usr/bin/python3
Dec 02 08:00:01 np0005541913.localdomain sudo[51539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:01 np0005541913.localdomain python3[51541]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 08:00:01 np0005541913.localdomain sudo[51539]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:01 np0005541913.localdomain sudo[51555]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgvrfbhepztgejahzawrepkbnqivrjtj ; /usr/bin/python3
Dec 02 08:00:01 np0005541913.localdomain sudo[51555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:01 np0005541913.localdomain python3[51557]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005541913 step=1 update_config_hash_only=False
Dec 02 08:00:01 np0005541913.localdomain sudo[51555]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:02 np0005541913.localdomain sudo[51571]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gynjibykuijanljiqbaorddsvroqtkar ; /usr/bin/python3
Dec 02 08:00:02 np0005541913.localdomain sudo[51571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:02 np0005541913.localdomain python3[51573]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:02 np0005541913.localdomain sudo[51571]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:02 np0005541913.localdomain sudo[51587]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snyotptdsfporbmcxgajlakdjvesbgvf ; /usr/bin/python3
Dec 02 08:00:02 np0005541913.localdomain sudo[51587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:03 np0005541913.localdomain python3[51589]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 02 08:00:03 np0005541913.localdomain sudo[51587]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:03 np0005541913.localdomain sudo[51590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:00:03 np0005541913.localdomain sudo[51590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:00:03 np0005541913.localdomain sudo[51590]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:03 np0005541913.localdomain sudo[51605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:00:03 np0005541913.localdomain sudo[51605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:00:03 np0005541913.localdomain sudo[51633]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiqntriahhzexfhdrkisqcylmavdngqy ; /usr/bin/python3
Dec 02 08:00:03 np0005541913.localdomain sudo[51633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:03 np0005541913.localdomain python3[51635]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 08:00:03 np0005541913.localdomain sudo[51633]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:04 np0005541913.localdomain sudo[51605]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:04 np0005541913.localdomain sudo[51705]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adhxlvevrqfxqflnvqypbnuqpzjskokk ; /usr/bin/python3
Dec 02 08:00:04 np0005541913.localdomain sudo[51705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:04 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Dec 02 08:00:05 np0005541913.localdomain podman[51896]: 2025-12-02 08:00:05.078080846 +0000 UTC m=+0.102450652 container create 79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, container_name=container-puppet-iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public)
Dec 02 08:00:05 np0005541913.localdomain podman[51884]: 2025-12-02 08:00:04.991572711 +0000 UTC m=+0.026811472 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 02 08:00:05 np0005541913.localdomain podman[51896]: 2025-12-02 08:00:05.007430247 +0000 UTC m=+0.031800023 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 02 08:00:05 np0005541913.localdomain podman[51914]: 2025-12-02 08:00:05.112350694 +0000 UTC m=+0.118765171 container create 79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, distribution-scope=public, config_id=tripleo_puppet_step1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-nova_libvirt, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16.scope.
Dec 02 08:00:05 np0005541913.localdomain podman[51914]: 2025-12-02 08:00:05.020829178 +0000 UTC m=+0.027243675 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:00:05 np0005541913.localdomain podman[51905]: 2025-12-02 08:00:05.131386272 +0000 UTC m=+0.146871276 container create 028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, vcs-type=git, container_name=container-puppet-crond, architecture=x86_64, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 02 08:00:05 np0005541913.localdomain podman[51906]: 2025-12-02 08:00:05.14049244 +0000 UTC m=+0.150110780 container create 8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=container-puppet-metrics_qdr, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
com.redhat.component=openstack-qdrouterd-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d.scope.
Dec 02 08:00:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0052f13d91303294194500e25d2f8e0888afaf1ca7e6de5d98fbefe304631472/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0052f13d91303294194500e25d2f8e0888afaf1ca7e6de5d98fbefe304631472/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf.scope.
Dec 02 08:00:05 np0005541913.localdomain podman[51905]: 2025-12-02 08:00:05.061505982 +0000 UTC m=+0.076991016 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60.scope.
Dec 02 08:00:05 np0005541913.localdomain podman[51896]: 2025-12-02 08:00:05.164284964 +0000 UTC m=+0.188654770 container init 79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=container-puppet-iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-iscsid-container)
Dec 02 08:00:05 np0005541913.localdomain podman[51906]: 2025-12-02 08:00:05.063089424 +0000 UTC m=+0.072707764 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b388412fca905b307e07ab1555f64621018b9abe733ff2c7e7266decb6c12c8d/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:05 np0005541913.localdomain podman[51896]: 2025-12-02 08:00:05.177598733 +0000 UTC m=+0.201968529 container start 79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, config_id=tripleo_puppet_step1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.expose-services=, container_name=container-puppet-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:00:05 np0005541913.localdomain podman[51896]: 2025-12-02 08:00:05.178439574 +0000 UTC m=+0.202809380 container attach 79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:00:05 np0005541913.localdomain podman[51884]: 2025-12-02 08:00:05.182195752 +0000 UTC m=+0.217434493 container create 486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=container-puppet-collectd, build-date=2025-11-18T22:51:28Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:00:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1605e3642cbc6f4a340468563ba343adf6d0f8a3115728727d8e4543418cb20/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/104925f4f3140d86c4d76991cbbe20b0ea2114e629deebdf08f0de90504ded5f/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1.scope.
Dec 02 08:00:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b063472ae149eb518ac7d99c3a97d11dcdfc09eaeb34ff91e9c6e02d02ccc47e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:06 np0005541913.localdomain podman[51905]: 2025-12-02 08:00:06.458215921 +0000 UTC m=+1.473700945 container init 028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Dec 02 08:00:06 np0005541913.localdomain podman[51905]: 2025-12-02 08:00:06.469845645 +0000 UTC m=+1.485330719 container start 028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=container-puppet-crond, config_id=tripleo_puppet_step1, com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team)
Dec 02 08:00:06 np0005541913.localdomain podman[51905]: 2025-12-02 08:00:06.470211944 +0000 UTC m=+1.485697018 container attach 028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, container_name=container-puppet-crond, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1)
Dec 02 08:00:06 np0005541913.localdomain podman[51884]: 2025-12-02 08:00:06.502589132 +0000 UTC m=+1.537827893 container init 486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=container-puppet-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_puppet_step1, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:00:06 np0005541913.localdomain systemd[1]: tmp-crun.Mw5HZ6.mount: Deactivated successfully.
Dec 02 08:00:06 np0005541913.localdomain podman[51884]: 2025-12-02 08:00:06.521736894 +0000 UTC m=+1.556975655 container start 486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:00:06 np0005541913.localdomain podman[51884]: 2025-12-02 08:00:06.522064332 +0000 UTC m=+1.557303093 container attach 486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-type=git, config_id=tripleo_puppet_step1, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, release=1761123044, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1)
Dec 02 08:00:06 np0005541913.localdomain sudo[51988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:00:06 np0005541913.localdomain sudo[51988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:00:06 np0005541913.localdomain sudo[51988]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:06 np0005541913.localdomain podman[51906]: 2025-12-02 08:00:06.571025124 +0000 UTC m=+1.580643454 container init 8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, url=https://www.redhat.com, config_id=tripleo_puppet_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:00:06 np0005541913.localdomain podman[51906]: 2025-12-02 08:00:06.641115159 +0000 UTC m=+1.650733529 container start 8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red 
Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, distribution-scope=public, io.openshift.expose-services=)
Dec 02 08:00:06 np0005541913.localdomain podman[51906]: 2025-12-02 08:00:06.641469918 +0000 UTC m=+1.651088278 container attach 8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, container_name=container-puppet-metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, config_id=tripleo_puppet_step1, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 02 08:00:06 np0005541913.localdomain podman[51914]: 2025-12-02 08:00:06.654804137 +0000 UTC m=+1.661218634 container init 79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude 
tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=container-puppet-nova_libvirt, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:00:06 np0005541913.localdomain podman[51914]: 2025-12-02 08:00:06.661498262 +0000 UTC m=+1.667912749 container start 79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=container-puppet-nova_libvirt, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it 
in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:00:06 np0005541913.localdomain podman[51914]: 2025-12-02 08:00:06.661728249 +0000 UTC m=+1.668142756 container attach 79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, tcib_managed=true, container_name=container-puppet-nova_libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain ovs-vsctl[52352]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.08 seconds
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Dec 02 08:00:08 np0005541913.localdomain crontab[52359]: (root) LIST (root)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.10 seconds
Dec 02 08:00:08 np0005541913.localdomain crontab[52370]: (root) REPLACE (root)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: Accepting previously invalid value for target type 'Integer'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Notice: Applied catalog in 0.04 seconds
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Application:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:    Initial environment: production
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:    Converged environment: production
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:          Run mode: user
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Changes:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:             Total: 2
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Events:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:           Success: 2
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:             Total: 2
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Resources:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:           Changed: 2
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:       Out of sync: 2
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:           Skipped: 7
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:             Total: 9
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Time:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:              File: 0.01
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:              Cron: 0.01
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:    Transaction evaluation: 0.04
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:    Catalog application: 0.04
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:    Config retrieval: 0.11
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:          Last run: 1764662408
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:             Total: 0.05
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]: Version:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:            Config: 1764662408
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52021]:            Puppet: 7.10.0
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.13 seconds
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}ce90bebf2484546c06edc1852bcd172057e5aa8cc85a9be28cc54d45adc16782'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Notice: Applied catalog in 0.03 seconds
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Application:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:    Initial environment: production
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:    Converged environment: production
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:          Run mode: user
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Changes:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:             Total: 7
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Events:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:           Success: 7
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:             Total: 7
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Resources:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:           Skipped: 13
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:           Changed: 5
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:       Out of sync: 5
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:             Total: 20
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Time:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:              File: 0.01
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:    Transaction evaluation: 0.03
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:    Catalog application: 0.03
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:    Config retrieval: 0.16
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:          Last run: 1764662408
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:             Total: 0.03
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]: Version:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:            Config: 1764662408
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52049]:            Puppet: 7.10.0
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]:    (file & line not available)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.35 seconds
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: libpod-028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf.scope: Deactivated successfully.
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: libpod-028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf.scope: Consumed 2.149s CPU time.
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: in a future release. Use nova::cinder::os_region_name instead
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: in a future release. Use nova::cinder::catalog_info instead
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: libpod-8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60.scope: Deactivated successfully.
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: libpod-8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60.scope: Consumed 2.108s CPU time.
Dec 02 08:00:08 np0005541913.localdomain podman[51906]: 2025-12-02 08:00:08.827666615 +0000 UTC m=+3.837284975 container died 8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=container-puppet-metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_puppet_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:00:08 np0005541913.localdomain podman[52477]: 2025-12-02 08:00:08.84692644 +0000 UTC m=+0.077683755 container died 028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, container_name=container-puppet-crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: tmp-crun.PvtXeP.mount: Deactivated successfully.
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b388412fca905b307e07ab1555f64621018b9abe733ff2c7e7266decb6c12c8d-merged.mount: Deactivated successfully.
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Notice: Applied catalog in 0.49 seconds
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Application:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:    Initial environment: production
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:    Converged environment: production
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:          Run mode: user
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Changes:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:             Total: 4
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Events:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:           Success: 4
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:             Total: 4
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Resources:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:           Changed: 4
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:       Out of sync: 4
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:           Skipped: 8
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:             Total: 13
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Time:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:              File: 0.00
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:              Exec: 0.05
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:    Config retrieval: 0.13
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:            Augeas: 0.41
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:    Transaction evaluation: 0.47
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:    Catalog application: 0.49
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:          Last run: 1764662408
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:             Total: 0.49
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]: Version:
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:            Config: 1764662408
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52002]:            Puppet: 7.10.0
Dec 02 08:00:08 np0005541913.localdomain podman[52477]: 2025-12-02 08:00:08.92182448 +0000 UTC m=+0.152581755 container cleanup 028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, container_name=container-puppet-crond, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}0d4e701b7b2398bbf396579a0713d46d3c496c79edc52f2e260456f359c9a46c'
Dec 02 08:00:08 np0005541913.localdomain podman[52500]: 2025-12-02 08:00:08.939791851 +0000 UTC m=+0.099522607 container cleanup 8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, vcs-type=git, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=container-puppet-metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: libpod-conmon-028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf.scope: Deactivated successfully.
Dec 02 08:00:08 np0005541913.localdomain systemd[1]: libpod-conmon-8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60.scope: Deactivated successfully.
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Dec 02 08:00:08 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 02 08:00:08 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52063]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Dec 02 08:00:08 np0005541913.localdomain puppet-user[52042]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]: Notice: Applied catalog in 0.29 seconds
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]: Application:
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:    Initial environment: production
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:    Converged environment: production
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:          Run mode: user
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]: Changes:
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:             Total: 43
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]: Events:
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:           Success: 43
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:             Total: 43
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]: Resources:
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:           Skipped: 14
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:           Changed: 38
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:       Out of sync: 38
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:             Total: 82
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]: Time:
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:    Concat fragment: 0.00
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:       Concat file: 0.00
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:              File: 0.13
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:    Transaction evaluation: 0.28
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:    Catalog application: 0.29
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:    Config retrieval: 0.41
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:          Last run: 1764662409
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:             Total: 0.29
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]: Version:
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:            Config: 1764662408
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52042]:            Puppet: 7.10.0
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52063]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52063]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52063]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: libpod-79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16.scope: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: libpod-79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16.scope: Consumed 2.590s CPU time.
Dec 02 08:00:09 np0005541913.localdomain podman[51896]: 2025-12-02 08:00:09.237343241 +0000 UTC m=+4.261713037 container died 79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.openshift.expose-services=, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_puppet_step1, 
container_name=container-puppet-iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52063]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: libpod-486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1.scope: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: libpod-486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1.scope: Consumed 2.589s CPU time.
Dec 02 08:00:09 np0005541913.localdomain podman[51884]: 2025-12-02 08:00:09.41950106 +0000 UTC m=+4.454739811 container died 486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-collectd, 
io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, container_name=container-puppet-collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Dec 02 08:00:09 np0005541913.localdomain podman[52651]: 2025-12-02 08:00:09.429887262 +0000 UTC m=+0.197899583 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 02 08:00:09 np0005541913.localdomain podman[52673]: 2025-12-02 08:00:09.707587452 +0000 UTC m=+0.459112851 container cleanup 79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_puppet_step1, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, container_name=container-puppet-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 
'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible)
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: libpod-conmon-79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16.scope: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 02 08:00:09 np0005541913.localdomain podman[51781]: 2025-12-02 08:00:04.912725687 +0000 UTC m=+0.042479753 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 02 08:00:09 np0005541913.localdomain podman[52713]: 2025-12-02 08:00:09.758229968 +0000 UTC m=+0.328564483 container cleanup 486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=container-puppet-collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: libpod-conmon-486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1.scope: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain podman[52651]: 2025-12-02 08:00:09.767724746 +0000 UTC m=+0.535737037 container create f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: Started libpod-conmon-f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a.scope.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5908dabcdc4beecd14375872c1a5b4a4e28c3db557b9e42f64a01ed422f93ce2/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:09 np0005541913.localdomain podman[52972]: 2025-12-02 08:00:09.807656703 +0000 UTC m=+0.320649427 container create 09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:34:05Z, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Dec 02 08:00:09 np0005541913.localdomain podman[52972]: 2025-12-02 08:00:09.721295721 +0000 UTC m=+0.234288445 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: Started libpod-conmon-09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d.scope.
Dec 02 08:00:09 np0005541913.localdomain podman[52651]: 2025-12-02 08:00:09.867204151 +0000 UTC m=+0.635216472 container init f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., architecture=x86_64, container_name=container-puppet-rsyslog, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:49Z, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1)
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:09 np0005541913.localdomain podman[52651]: 2025-12-02 08:00:09.877007998 +0000 UTC m=+0.645020319 container start f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=container-puppet-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, version=17.1.12, com.redhat.component=openstack-rsyslog-container)
Dec 02 08:00:09 np0005541913.localdomain puppet-user[52063]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 1.38 seconds
Dec 02 08:00:09 np0005541913.localdomain podman[52651]: 2025-12-02 08:00:09.882632195 +0000 UTC m=+0.650644516 container attach f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-rsyslog, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc.)
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b063472ae149eb518ac7d99c3a97d11dcdfc09eaeb34ff91e9c6e02d02ccc47e-merged.mount: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-486d48aed113ca9f72cd2937199dcf86d7354e3bd3109dee0d41df3f96bdd7d1-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d1605e3642cbc6f4a340468563ba343adf6d0f8a3115728727d8e4543418cb20-merged.mount: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0052f13d91303294194500e25d2f8e0888afaf1ca7e6de5d98fbefe304631472-merged.mount: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f9ede822be11b60c0a1703a4ec9607dd292d56847ed8465c37bae8fb9e0d08/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f9ede822be11b60c0a1703a4ec9607dd292d56847ed8465c37bae8fb9e0d08/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:09 np0005541913.localdomain podman[52972]: 2025-12-02 08:00:09.896945389 +0000 UTC m=+0.409938113 container init 09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1)
Dec 02 08:00:09 np0005541913.localdomain podman[52972]: 2025-12-02 08:00:09.906737536 +0000 UTC m=+0.419730230 container start 09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc.)
Dec 02 08:00:09 np0005541913.localdomain podman[52972]: 2025-12-02 08:00:09.906952512 +0000 UTC m=+0.419945286 container attach 09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=container-puppet-ovn_controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:00:09 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 02 08:00:09 np0005541913.localdomain podman[53031]: 2025-12-02 08:00:09.962544287 +0000 UTC m=+0.087111271 container create d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.4, container_name=container-puppet-ceilometer, name=rhosp17/openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-19T00:11:59Z, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-central-container, distribution-scope=public)
Dec 02 08:00:10 np0005541913.localdomain systemd[1]: Started libpod-conmon-d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257.scope.
Dec 02 08:00:10 np0005541913.localdomain podman[53031]: 2025-12-02 08:00:09.920468946 +0000 UTC m=+0.045035990 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 02 08:00:10 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e7aea19432089756ed62f0f30cfa5a3f11dba2345bf487cdfbd5c2a4914be89/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:10 np0005541913.localdomain podman[53031]: 2025-12-02 08:00:10.04362406 +0000 UTC m=+0.168191044 container init d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:11:59Z, io.buildah.version=1.41.4, tcib_managed=true, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-central-container, io.openshift.expose-services=)
Dec 02 08:00:10 np0005541913.localdomain podman[53031]: 2025-12-02 08:00:10.053000606 +0000 UTC m=+0.177567610 container start d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-19T00:11:59Z, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-central-container, managed_by=tripleo_ansible, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1)
Dec 02 08:00:10 np0005541913.localdomain podman[53031]: 2025-12-02 08:00:10.053336734 +0000 UTC m=+0.177903718 container attach d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, build-date=2025-11-19T00:11:59Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-central-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}5ba64817af7f9555281205611eb52d45214b5127a0e5ce894ff9b319c0723a16'
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Warning: Empty environment setting 'TLS_PASSWORD'
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}8c1883a65300cc327d1cb9c34702b30b2083e07e3f42b734ab7685f1cc6449ef'
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Dec 02 08:00:10 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53072]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53072]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53072]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53072]:    (file & line not available)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53072]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53072]:    (file & line not available)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53086]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53086]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53086]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53086]:    (file & line not available)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53086]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53086]:    (file & line not available)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53145]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53145]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53145]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53145]:    (file & line not available)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53145]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53145]:    (file & line not available)
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Dec 02 08:00:11 np0005541913.localdomain puppet-user[53072]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.24 seconds
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.25 seconds
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}3f62d179f65be7c16842a28abf994d6a58e30b2328fb95c74da2c0a9b9529a22'
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53370]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53372]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}c3b156c2c9f08abc530e7f7185a0499af26dcb54a74ced688ff254968e5ee0ca'
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Notice: Applied catalog in 0.12 seconds
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Application:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:    Initial environment: production
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:    Converged environment: production
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:          Run mode: user
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Changes:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:             Total: 3
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Events:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:           Success: 3
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:             Total: 3
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Resources:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:           Skipped: 11
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:           Changed: 3
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:       Out of sync: 3
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:             Total: 25
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Time:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:       Concat file: 0.00
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:    Concat fragment: 0.00
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:              File: 0.02
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:    Transaction evaluation: 0.11
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:    Catalog application: 0.12
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:    Config retrieval: 0.28
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:          Last run: 1764662412
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:             Total: 0.12
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]: Version:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:            Config: 1764662411
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53072]:            Puppet: 7.10.0
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53374]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.107
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53379]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005541913.localdomain
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005541913.novalocal' to 'np0005541913.localdomain'
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53385]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53390]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53392]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.37 seconds
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53394]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53403]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53412]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53414]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:9a:ba:cf
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53421]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain systemd[1]: libpod-f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a.scope: Deactivated successfully.
Dec 02 08:00:12 np0005541913.localdomain systemd[1]: libpod-f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a.scope: Consumed 2.393s CPU time.
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53429]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain ovs-vsctl[53442]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain podman[53431]: 2025-12-02 08:00:12.50721844 +0000 UTC m=+0.043563851 container died f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-rsyslog, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain systemd[1]: tmp-crun.IakQEG.mount: Deactivated successfully.
Dec 02 08:00:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5908dabcdc4beecd14375872c1a5b4a4e28c3db557b9e42f64a01ed422f93ce2-merged.mount: Deactivated successfully.
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Notice: Applied catalog in 0.51 seconds
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Application:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:    Initial environment: production
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:    Converged environment: production
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:          Run mode: user
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Changes:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:             Total: 14
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Events:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:           Success: 14
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:             Total: 14
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Resources:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:           Skipped: 12
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:           Changed: 14
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:       Out of sync: 14
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:             Total: 29
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Time:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:              Exec: 0.02
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:    Config retrieval: 0.29
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:         Vs config: 0.40
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:    Transaction evaluation: 0.44
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:    Catalog application: 0.51
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:          Last run: 1764662412
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:             Total: 0.51
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]: Version:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:            Config: 1764662411
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53086]:            Puppet: 7.10.0
Dec 02 08:00:12 np0005541913.localdomain podman[53431]: 2025-12-02 08:00:12.597734749 +0000 UTC m=+0.134080150 container cleanup f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=container-puppet-rsyslog, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Notice: Applied catalog in 0.44 seconds
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Application:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:    Initial environment: production
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:    Converged environment: production
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:          Run mode: user
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Changes:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:             Total: 31
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Events:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:           Success: 31
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:             Total: 31
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Resources:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:           Skipped: 22
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:           Changed: 31
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:       Out of sync: 31
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:             Total: 151
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Time:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:           Package: 0.03
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:    Ceilometer config: 0.34
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:    Transaction evaluation: 0.43
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:    Catalog application: 0.44
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:    Config retrieval: 0.44
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:          Last run: 1764662412
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:         Resources: 0.00
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:             Total: 0.44
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]: Version:
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:            Config: 1764662411
Dec 02 08:00:12 np0005541913.localdomain puppet-user[53145]:            Puppet: 7.10.0
Dec 02 08:00:12 np0005541913.localdomain systemd[1]: libpod-09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d.scope: Deactivated successfully.
Dec 02 08:00:12 np0005541913.localdomain systemd[1]: libpod-09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d.scope: Consumed 2.790s CPU time.
Dec 02 08:00:13 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: libpod-d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257.scope: Deactivated successfully.
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: libpod-d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257.scope: Consumed 2.949s CPU time.
Dec 02 08:00:13 np0005541913.localdomain podman[53031]: 2025-12-02 08:00:13.195226193 +0000 UTC m=+3.319793187 container died d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, container_name=container-puppet-ceilometer, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-central-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-central, architecture=x86_64, build-date=2025-11-19T00:11:59Z, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:00:13 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: libpod-conmon-f8196f31e8f2465df3ff647e63a0beabeba6e7bb16edc123f88deea6cfc5636a.scope: Deactivated successfully.
Dec 02 08:00:13 np0005541913.localdomain podman[52972]: 2025-12-02 08:00:13.251166877 +0000 UTC m=+3.764159611 container died 09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, container_name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-86f9ede822be11b60c0a1703a4ec9607dd292d56847ed8465c37bae8fb9e0d08-merged.mount: Deactivated successfully.
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:13 np0005541913.localdomain podman[53501]: 2025-12-02 08:00:13.697064771 +0000 UTC m=+0.737871489 container cleanup 09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, container_name=container-puppet-ovn_controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e7aea19432089756ed62f0f30cfa5a3f11dba2345bf487cdfbd5c2a4914be89-merged.mount: Deactivated successfully.
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: libpod-conmon-09803f8cf8cd21b86529e70478133611d0fc830d7e92da00d23b8653587bd24d.scope: Deactivated successfully.
Dec 02 08:00:13 np0005541913.localdomain podman[53117]: 2025-12-02 08:00:10.100671404 +0000 UTC m=+0.041426046 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 02 08:00:13 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Dec 02 08:00:13 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Dec 02 08:00:13 np0005541913.localdomain podman[53539]: 2025-12-02 08:00:13.762124704 +0000 UTC m=+0.552097065 container cleanup d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-type=git, build-date=2025-11-19T00:11:59Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=container-puppet-ceilometer, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-central, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-ceilometer-central-container)
Dec 02 08:00:13 np0005541913.localdomain systemd[1]: libpod-conmon-d3d5f6f441933c37b0dad78090286e10a59e8be704d0fdf20e3b61324b5b9257.scope: Deactivated successfully.
Dec 02 08:00:13 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 02 08:00:13 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Dec 02 08:00:13 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Dec 02 08:00:13 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Dec 02 08:00:13 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Dec 02 08:00:13 np0005541913.localdomain podman[53615]: 2025-12-02 08:00:13.983010997 +0000 UTC m=+0.103928302 container create 53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-server, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=container-puppet-neutron, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-server-container, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 02 08:00:14 np0005541913.localdomain systemd[1]: Started libpod-conmon-53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db.scope.
Dec 02 08:00:14 np0005541913.localdomain podman[53615]: 2025-12-02 08:00:13.931046746 +0000 UTC m=+0.051964071 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 02 08:00:14 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:14 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73f5af374b13f82b9f4d3d5847d5882ab5c5f129a64a44d0b3384933c5aad231/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:14 np0005541913.localdomain podman[53615]: 2025-12-02 08:00:14.051830119 +0000 UTC m=+0.172747444 container init 53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, container_name=container-puppet-neutron, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:23:27Z, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.component=openstack-neutron-server-container)
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain podman[53615]: 2025-12-02 08:00:14.064417579 +0000 UTC m=+0.185334884 container start 53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:23:27Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, container_name=container-puppet-neutron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:00:14 np0005541913.localdomain podman[53615]: 2025-12-02 08:00:14.064811079 +0000 UTC m=+0.185728444 container attach 53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:23:27Z, tcib_managed=true, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.openshift.expose-services=, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-server-container, summary=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-server, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, vcs-type=git, container_name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Dec 02 08:00:14 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3a12438802493a75725c4f7704f2af6db1ef72af396369e5de28f6f4d6a7ed98'
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Notice: Applied catalog in 4.93 seconds
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Application:
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Initial environment: production
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Converged environment: production
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:          Run mode: user
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Changes:
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:             Total: 183
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Events:
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:           Success: 183
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:             Total: 183
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Resources:
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:           Changed: 183
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:       Out of sync: 183
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:           Skipped: 57
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:             Total: 487
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Time:
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:       Concat file: 0.00
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Concat fragment: 0.00
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:            Anchor: 0.00
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:         File line: 0.00
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Virtlogd config: 0.00
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Virtqemud config: 0.01
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Virtsecretd config: 0.01
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Virtnodedevd config: 0.01
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Virtstoraged config: 0.02
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:              Exec: 0.02
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:           Package: 0.02
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Virtproxyd config: 0.03
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:              File: 0.03
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:            Augeas: 1.22
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Config retrieval: 1.65
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:          Last run: 1764662415
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:       Nova config: 3.33
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Transaction evaluation: 4.92
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:    Catalog application: 4.93
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:         Resources: 0.00
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:             Total: 4.94
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]: Version:
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:            Config: 1764662408
Dec 02 08:00:15 np0005541913.localdomain puppet-user[52063]:            Puppet: 7.10.0
Dec 02 08:00:15 np0005541913.localdomain puppet-user[53660]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Dec 02 08:00:15 np0005541913.localdomain puppet-user[53660]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:00:15 np0005541913.localdomain puppet-user[53660]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:00:15 np0005541913.localdomain puppet-user[53660]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:00:15 np0005541913.localdomain puppet-user[53660]:    (file & line not available)
Dec 02 08:00:15 np0005541913.localdomain puppet-user[53660]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:00:15 np0005541913.localdomain puppet-user[53660]:    (file & line not available)
Dec 02 08:00:15 np0005541913.localdomain systemd[1]: libpod-79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d.scope: Deactivated successfully.
Dec 02 08:00:15 np0005541913.localdomain systemd[1]: libpod-79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d.scope: Consumed 9.035s CPU time.
Dec 02 08:00:15 np0005541913.localdomain podman[51914]: 2025-12-02 08:00:15.984103998 +0000 UTC m=+10.990518515 container died 79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, release=1761123044, container_name=container-puppet-nova_libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:00:15 np0005541913.localdomain puppet-user[53660]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Dec 02 08:00:16 np0005541913.localdomain systemd[1]: tmp-crun.qCzK4e.mount: Deactivated successfully.
Dec 02 08:00:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-104925f4f3140d86c4d76991cbbe20b0ea2114e629deebdf08f0de90504ded5f-merged.mount: Deactivated successfully.
Dec 02 08:00:16 np0005541913.localdomain podman[53796]: 2025-12-02 08:00:16.152757484 +0000 UTC m=+0.158639715 container cleanup 79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, container_name=container-puppet-nova_libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:00:16 np0005541913.localdomain systemd[1]: libpod-conmon-79e7bfa253c8a0aa3056c562963a2fcae1d6b8bb4029380f0fb9891fb44c522d.scope: Deactivated successfully.
Dec 02 08:00:16 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.64 seconds
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 02 08:00:16 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Notice: Applied catalog in 0.45 seconds
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Application:
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:    Initial environment: production
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:    Converged environment: production
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:          Run mode: user
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Changes:
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:             Total: 33
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Events:
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:           Success: 33
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:             Total: 33
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Resources:
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:           Skipped: 21
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:           Changed: 33
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:       Out of sync: 33
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:             Total: 155
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Time:
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:         Resources: 0.00
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:    Ovn metadata agent config: 0.02
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:    Neutron config: 0.37
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:    Transaction evaluation: 0.44
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:    Catalog application: 0.45
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:    Config retrieval: 0.71
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:          Last run: 1764662417
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:             Total: 0.45
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]: Version:
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:            Config: 1764662415
Dec 02 08:00:17 np0005541913.localdomain puppet-user[53660]:            Puppet: 7.10.0
Dec 02 08:00:17 np0005541913.localdomain systemd[1]: libpod-53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db.scope: Deactivated successfully.
Dec 02 08:00:17 np0005541913.localdomain systemd[1]: libpod-53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db.scope: Consumed 3.473s CPU time.
Dec 02 08:00:17 np0005541913.localdomain podman[53615]: 2025-12-02 08:00:17.728738424 +0000 UTC m=+3.849655759 container died 53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-server-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.12, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_puppet_step1, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:23:27Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, container_name=container-puppet-neutron, architecture=x86_64)
Dec 02 08:00:17 np0005541913.localdomain systemd[1]: tmp-crun.zTntEm.mount: Deactivated successfully.
Dec 02 08:00:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-73f5af374b13f82b9f4d3d5847d5882ab5c5f129a64a44d0b3384933c5aad231-merged.mount: Deactivated successfully.
Dec 02 08:00:17 np0005541913.localdomain podman[53868]: 2025-12-02 08:00:17.863527593 +0000 UTC m=+0.126254936 container cleanup 53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:23:27Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-server-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, name=rhosp17/openstack-neutron-server, version=17.1.12, url=https://www.redhat.com)
Dec 02 08:00:17 np0005541913.localdomain systemd[1]: libpod-conmon-53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db.scope: Deactivated successfully.
Dec 02 08:00:17 np0005541913.localdomain python3[51707]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005541913 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005541913', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro 
--volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 02 08:00:17 np0005541913.localdomain sudo[51705]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:18 np0005541913.localdomain sudo[53920]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqugpxlfncicpuimqadwxojhokdtiezp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:18 np0005541913.localdomain sudo[53920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:18 np0005541913.localdomain python3[53922]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:18 np0005541913.localdomain sudo[53920]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:18 np0005541913.localdomain sudo[53936]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtfhffmduahlolyolvcavjtbsksuysmo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:18 np0005541913.localdomain sudo[53936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:19 np0005541913.localdomain sudo[53936]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:19 np0005541913.localdomain sudo[53952]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlupemdvmeuhjsuwvahnqnrmlgsphiza ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:19 np0005541913.localdomain sudo[53952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:19 np0005541913.localdomain python3[53954]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:00:19 np0005541913.localdomain sudo[53952]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:20 np0005541913.localdomain sudo[54002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdhxhpduyfbwkvpiqttwfxlvwcmltotl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:20 np0005541913.localdomain sudo[54002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:20 np0005541913.localdomain python3[54004]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:00:20 np0005541913.localdomain sudo[54002]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:20 np0005541913.localdomain sudo[54045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgjnllvheosviuuarmaszjonjxuduxap ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:20 np0005541913.localdomain sudo[54045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:20 np0005541913.localdomain python3[54047]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662419.8885703-83456-143487100285342/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:20 np0005541913.localdomain sudo[54045]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:20 np0005541913.localdomain sudo[54107]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egwhbqgvlbsjcrrmcexaktijqgggjzth ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:20 np0005541913.localdomain sudo[54107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:20 np0005541913.localdomain python3[54109]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:00:20 np0005541913.localdomain sudo[54107]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:21 np0005541913.localdomain sudo[54150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzrznjfjelmmprsmtfmisnmtogknlrzc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:21 np0005541913.localdomain sudo[54150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:21 np0005541913.localdomain python3[54152]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662420.709919-83456-152773432819666/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:21 np0005541913.localdomain sudo[54150]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:21 np0005541913.localdomain sudo[54212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsogwgoqhdpmmdvjqpwkvjfluzppdozl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:21 np0005541913.localdomain sudo[54212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:21 np0005541913.localdomain python3[54214]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:00:21 np0005541913.localdomain sudo[54212]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:22 np0005541913.localdomain sudo[54255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnuhtvutigrhofcmsuwclxqjzylhjigt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:22 np0005541913.localdomain sudo[54255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:22 np0005541913.localdomain python3[54257]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662421.6082842-83556-63790121439300/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:22 np0005541913.localdomain sudo[54255]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:22 np0005541913.localdomain sudo[54317]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nesfzsmakrwnlzkawyghcolnqdlukslm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:22 np0005541913.localdomain sudo[54317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:22 np0005541913.localdomain python3[54319]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:00:22 np0005541913.localdomain sudo[54317]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:22 np0005541913.localdomain sudo[54360]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkpfonsfexeawbcsxetxrzyeowygjbvy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:22 np0005541913.localdomain sudo[54360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:23 np0005541913.localdomain python3[54362]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662422.4676592-83585-164176831914883/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:23 np0005541913.localdomain sudo[54360]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:23 np0005541913.localdomain sudo[54390]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msdkpejbbcieeihndgehkljlisbcjqgz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:23 np0005541913.localdomain sudo[54390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:23 np0005541913.localdomain python3[54392]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:00:23 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:00:23 np0005541913.localdomain systemd-rc-local-generator[54417]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:00:23 np0005541913.localdomain systemd-sysv-generator[54422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:00:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:00:23 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:00:24 np0005541913.localdomain systemd-sysv-generator[54458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:00:24 np0005541913.localdomain systemd-rc-local-generator[54454]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:00:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:00:24 np0005541913.localdomain systemd[1]: Starting TripleO Container Shutdown...
Dec 02 08:00:24 np0005541913.localdomain systemd[1]: Finished TripleO Container Shutdown.
Dec 02 08:00:24 np0005541913.localdomain sudo[54390]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:24 np0005541913.localdomain sudo[54515]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paljzhotvkxiajqntbcfqhqbydckozur ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:24 np0005541913.localdomain sudo[54515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:24 np0005541913.localdomain python3[54517]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:00:24 np0005541913.localdomain sudo[54515]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:24 np0005541913.localdomain sudo[54558]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpoujhgbevsmjqevqgosxgbqssptbvct ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:24 np0005541913.localdomain sudo[54558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:25 np0005541913.localdomain python3[54560]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662424.4535828-83634-128300938039233/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:25 np0005541913.localdomain sudo[54558]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:25 np0005541913.localdomain sudo[54620]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vldnjtnwnguxykocyexyfpvvtoughuye ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:25 np0005541913.localdomain sudo[54620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:25 np0005541913.localdomain python3[54622]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:00:25 np0005541913.localdomain sudo[54620]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:25 np0005541913.localdomain sudo[54663]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrtpympvjwfienvklkfksglqszgxxjat ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:25 np0005541913.localdomain sudo[54663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:26 np0005541913.localdomain python3[54665]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662425.348438-83704-89717137302983/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:26 np0005541913.localdomain sudo[54663]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:26 np0005541913.localdomain sudo[54693]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbvtfxzydtfgyihkkqlidcdduatusial ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:26 np0005541913.localdomain sudo[54693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:26 np0005541913.localdomain python3[54695]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:00:26 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:00:26 np0005541913.localdomain systemd-rc-local-generator[54722]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:00:26 np0005541913.localdomain systemd-sysv-generator[54726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:00:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:00:26 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:00:26 np0005541913.localdomain systemd-sysv-generator[54764]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:00:26 np0005541913.localdomain systemd-rc-local-generator[54761]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:00:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:00:27 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 08:00:27 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 08:00:27 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 08:00:27 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 08:00:27 np0005541913.localdomain sudo[54693]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:27 np0005541913.localdomain sudo[54786]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcubymoefflwuzcowfgycwnpobobfiqi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:27 np0005541913.localdomain sudo[54786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 36af2f1ef63ece3c88eb676f44e9c36d
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: 4767aaabc3de112d8791c290aa2b669d
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 230f4ebc92ecc6f511b0217abb58f1b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 1c70cec5d3310de4d4589e1a95c8fd3c
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 72848ce4d815e5b4e89ff3e01c5f9f7e
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 72848ce4d815e5b4e89ff3e01c5f9f7e
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: d1544001d5773d0045aaf61439ef5e02
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain python3[54788]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: ff8ff724cb5f0d02131158e2fae849b6
Dec 02 08:00:27 np0005541913.localdomain sudo[54786]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:27 np0005541913.localdomain sudo[54802]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azunuknguhbimbkbgfvbnexmndnzpdxv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:27 np0005541913.localdomain sudo[54802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:28 np0005541913.localdomain sudo[54802]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:28 np0005541913.localdomain sudo[54844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwdohteusanvsiwkahwwdemxsuiwnbzx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:28 np0005541913.localdomain sudo[54844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:29 np0005541913.localdomain python3[54846]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 02 08:00:29 np0005541913.localdomain podman[54884]: 2025-12-02 08:00:29.275916041 +0000 UTC m=+0.057764924 container create 61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:00:29 np0005541913.localdomain systemd[1]: Started libpod-conmon-61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d.scope.
Dec 02 08:00:29 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:29 np0005541913.localdomain podman[54884]: 2025-12-02 08:00:29.24494354 +0000 UTC m=+0.026792403 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 02 08:00:29 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/804deff8dacbfc312114476fef5e5066b58626df118d8072d88e0a05fadba7d2/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:29 np0005541913.localdomain podman[54884]: 2025-12-02 08:00:29.356569013 +0000 UTC m=+0.138417896 container init 61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:00:29 np0005541913.localdomain podman[54884]: 2025-12-02 08:00:29.370244791 +0000 UTC m=+0.152093664 container start 61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:00:29 np0005541913.localdomain podman[54884]: 2025-12-02 08:00:29.370964979 +0000 UTC m=+0.152813842 container attach 61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr_init_logs)
Dec 02 08:00:29 np0005541913.localdomain systemd[1]: libpod-61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d.scope: Deactivated successfully.
Dec 02 08:00:29 np0005541913.localdomain podman[54884]: 2025-12-02 08:00:29.375347685 +0000 UTC m=+0.157196538 container died 61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, tcib_managed=true, container_name=metrics_qdr_init_logs, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:00:29 np0005541913.localdomain podman[54903]: 2025-12-02 08:00:29.449040044 +0000 UTC m=+0.060193198 container cleanup 61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, container_name=metrics_qdr_init_logs, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible)
Dec 02 08:00:29 np0005541913.localdomain systemd[1]: libpod-conmon-61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d.scope: Deactivated successfully.
Dec 02 08:00:29 np0005541913.localdomain python3[54846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Dec 02 08:00:29 np0005541913.localdomain podman[54976]: 2025-12-02 08:00:29.916563764 +0000 UTC m=+0.092548444 container create 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-type=git, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, version=17.1.12)
Dec 02 08:00:29 np0005541913.localdomain systemd[1]: Started libpod-conmon-71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.scope.
Dec 02 08:00:29 np0005541913.localdomain podman[54976]: 2025-12-02 08:00:29.864549262 +0000 UTC m=+0.040533942 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 02 08:00:29 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:00:29 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/083325a356d009687825873f5ef80d42d8ec3a9c9ef25c5a97dbce5b8f99fa32/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:29 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/083325a356d009687825873f5ef80d42d8ec3a9c9ef25c5a97dbce5b8f99fa32/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 02 08:00:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:00:30 np0005541913.localdomain podman[54976]: 2025-12-02 08:00:30.018578924 +0000 UTC m=+0.194563604 container init 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4)
Dec 02 08:00:30 np0005541913.localdomain sudo[54997]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:00:30 np0005541913.localdomain sudo[54997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Dec 02 08:00:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:00:30 np0005541913.localdomain podman[54976]: 2025-12-02 08:00:30.06348667 +0000 UTC m=+0.239471340 container start 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr)
Dec 02 08:00:30 np0005541913.localdomain python3[54846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=36af2f1ef63ece3c88eb676f44e9c36d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 02 08:00:30 np0005541913.localdomain sudo[54997]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:30 np0005541913.localdomain podman[54999]: 2025-12-02 08:00:30.165562983 +0000 UTC m=+0.092806282 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true)
Dec 02 08:00:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-804deff8dacbfc312114476fef5e5066b58626df118d8072d88e0a05fadba7d2-merged.mount: Deactivated successfully.
Dec 02 08:00:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61bfb0f60bf1d340e284e5b4c8d783ad999ed782a1edb4c5af3cfb351fa36a5d-userdata-shm.mount: Deactivated successfully.
Dec 02 08:00:30 np0005541913.localdomain sudo[54844]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:30 np0005541913.localdomain podman[54999]: 2025-12-02 08:00:30.430509119 +0000 UTC m=+0.357752438 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:00:30 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:00:30 np0005541913.localdomain sudo[55072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thbnpgrbfrstoyemmfoiljemhpdzjfmm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:30 np0005541913.localdomain sudo[55072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:30 np0005541913.localdomain python3[55074]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:30 np0005541913.localdomain sudo[55072]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:30 np0005541913.localdomain sudo[55088]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxtgftxmgmmqujzudebiqidvpatltawv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:30 np0005541913.localdomain sudo[55088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:31 np0005541913.localdomain python3[55090]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:00:31 np0005541913.localdomain sudo[55088]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:31 np0005541913.localdomain sudo[55149]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acxmeodkmpyemdpppjfdguvjxxwwixld ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:31 np0005541913.localdomain sudo[55149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:31 np0005541913.localdomain python3[55151]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662431.1264088-83886-30778362098085/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:31 np0005541913.localdomain sudo[55149]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:31 np0005541913.localdomain sudo[55165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eltrrvbmvmspvdcthgjdkvcyuavusknw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:31 np0005541913.localdomain sudo[55165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:32 np0005541913.localdomain python3[55167]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 08:00:32 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:00:32 np0005541913.localdomain systemd-rc-local-generator[55193]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:00:32 np0005541913.localdomain systemd-sysv-generator[55196]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:00:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:00:32 np0005541913.localdomain sudo[55165]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:32 np0005541913.localdomain sudo[55218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmishifaeszpjeghquljupfdkbhvqfrp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:00:32 np0005541913.localdomain sudo[55218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:32 np0005541913.localdomain python3[55220]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:00:33 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:00:33 np0005541913.localdomain systemd-sysv-generator[55253]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:00:33 np0005541913.localdomain systemd-rc-local-generator[55250]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:00:33 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:00:33 np0005541913.localdomain systemd[1]: Starting metrics_qdr container...
Dec 02 08:00:33 np0005541913.localdomain systemd[1]: Started metrics_qdr container.
Dec 02 08:00:33 np0005541913.localdomain sudo[55218]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:33 np0005541913.localdomain sudo[55298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weimiieafubedhnnxtirlclgyvszozjz ; /usr/bin/python3
Dec 02 08:00:33 np0005541913.localdomain sudo[55298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:33 np0005541913.localdomain python3[55300]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:33 np0005541913.localdomain sudo[55298]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:34 np0005541913.localdomain sudo[55346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sawddxhsdkxmkviiypsytifbfshszphb ; /usr/bin/python3
Dec 02 08:00:34 np0005541913.localdomain sudo[55346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:34 np0005541913.localdomain sudo[55346]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:34 np0005541913.localdomain sudo[55389]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfjmqjyuafvvlarfipedkxcsrydedehm ; /usr/bin/python3
Dec 02 08:00:34 np0005541913.localdomain sudo[55389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:34 np0005541913.localdomain sudo[55389]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:35 np0005541913.localdomain sudo[55419]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weucjvwwdrmaamphhpmkggvnotvjapaj ; /usr/bin/python3
Dec 02 08:00:35 np0005541913.localdomain sudo[55419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:35 np0005541913.localdomain python3[55421]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005541913 step=1 update_config_hash_only=False
Dec 02 08:00:35 np0005541913.localdomain sudo[55419]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:35 np0005541913.localdomain sudo[55435]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtoeefatetbhfwaacbtriedkpqgitqgr ; /usr/bin/python3
Dec 02 08:00:35 np0005541913.localdomain sudo[55435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:35 np0005541913.localdomain python3[55437]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:00:35 np0005541913.localdomain sudo[55435]: pam_unix(sudo:session): session closed for user root
Dec 02 08:00:35 np0005541913.localdomain sudo[55451]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whicaugjsdrhedebfajeqzstjhvnxalb ; /usr/bin/python3
Dec 02 08:00:35 np0005541913.localdomain sudo[55451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:00:36 np0005541913.localdomain python3[55453]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 02 08:00:36 np0005541913.localdomain sudo[55451]: pam_unix(sudo:session): session closed for user root
Dec 02 08:01:01 np0005541913.localdomain CROND[55455]: (root) CMD (run-parts /etc/cron.hourly)
Dec 02 08:01:01 np0005541913.localdomain run-parts[55458]: (/etc/cron.hourly) starting 0anacron
Dec 02 08:01:01 np0005541913.localdomain run-parts[55464]: (/etc/cron.hourly) finished 0anacron
Dec 02 08:01:01 np0005541913.localdomain CROND[55454]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 02 08:01:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:01:01 np0005541913.localdomain systemd[1]: tmp-crun.P3mFVZ.mount: Deactivated successfully.
Dec 02 08:01:01 np0005541913.localdomain podman[55465]: 2025-12-02 08:01:01.453013061 +0000 UTC m=+0.090758447 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 02 08:01:01 np0005541913.localdomain podman[55465]: 2025-12-02 08:01:01.64854382 +0000 UTC m=+0.286289186 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, 
managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Dec 02 08:01:01 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:01:06 np0005541913.localdomain sudo[55493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:01:06 np0005541913.localdomain sudo[55493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:01:06 np0005541913.localdomain sudo[55493]: pam_unix(sudo:session): session closed for user root
Dec 02 08:01:06 np0005541913.localdomain sudo[55508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:01:06 np0005541913.localdomain sudo[55508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:01:07 np0005541913.localdomain sudo[55508]: pam_unix(sudo:session): session closed for user root
Dec 02 08:01:08 np0005541913.localdomain sudo[55555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:01:08 np0005541913.localdomain sudo[55555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:01:08 np0005541913.localdomain sudo[55555]: pam_unix(sudo:session): session closed for user root
Dec 02 08:01:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:01:32 np0005541913.localdomain systemd[1]: tmp-crun.uRIXiN.mount: Deactivated successfully.
Dec 02 08:01:32 np0005541913.localdomain podman[55570]: 2025-12-02 08:01:32.430404454 +0000 UTC m=+0.074521973 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, architecture=x86_64, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, managed_by=tripleo_ansible)
Dec 02 08:01:32 np0005541913.localdomain podman[55570]: 2025-12-02 08:01:32.657507598 +0000 UTC m=+0.301625147 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=)
Dec 02 08:01:32 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:02:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:02:03 np0005541913.localdomain systemd[1]: tmp-crun.lNcr2b.mount: Deactivated successfully.
Dec 02 08:02:03 np0005541913.localdomain podman[55600]: 2025-12-02 08:02:03.448484336 +0000 UTC m=+0.088822397 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, release=1761123044)
Dec 02 08:02:03 np0005541913.localdomain podman[55600]: 2025-12-02 08:02:03.629221796 +0000 UTC m=+0.269559787 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:02:03 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:02:08 np0005541913.localdomain sudo[55630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:02:08 np0005541913.localdomain sudo[55630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:02:08 np0005541913.localdomain sudo[55630]: pam_unix(sudo:session): session closed for user root
Dec 02 08:02:08 np0005541913.localdomain sudo[55645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:02:08 np0005541913.localdomain sudo[55645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:02:09 np0005541913.localdomain sudo[55645]: pam_unix(sudo:session): session closed for user root
Dec 02 08:02:09 np0005541913.localdomain sudo[55693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:02:09 np0005541913.localdomain sudo[55693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:02:09 np0005541913.localdomain sudo[55693]: pam_unix(sudo:session): session closed for user root
Dec 02 08:02:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:02:34 np0005541913.localdomain systemd[1]: tmp-crun.FRQuhw.mount: Deactivated successfully.
Dec 02 08:02:34 np0005541913.localdomain podman[55708]: 2025-12-02 08:02:34.43349622 +0000 UTC m=+0.077228116 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:02:34 np0005541913.localdomain podman[55708]: 2025-12-02 08:02:34.653193387 +0000 UTC m=+0.296925263 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr)
Dec 02 08:02:34 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:03:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:03:05 np0005541913.localdomain podman[55736]: 2025-12-02 08:03:05.427082421 +0000 UTC m=+0.069575535 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, vcs-type=git)
Dec 02 08:03:05 np0005541913.localdomain podman[55736]: 2025-12-02 08:03:05.58792869 +0000 UTC m=+0.230421764 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z)
Dec 02 08:03:05 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:03:10 np0005541913.localdomain sudo[55765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:03:10 np0005541913.localdomain sudo[55765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:03:10 np0005541913.localdomain sudo[55765]: pam_unix(sudo:session): session closed for user root
Dec 02 08:03:10 np0005541913.localdomain sudo[55780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:03:10 np0005541913.localdomain sudo[55780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:03:10 np0005541913.localdomain sudo[55780]: pam_unix(sudo:session): session closed for user root
Dec 02 08:03:11 np0005541913.localdomain sudo[55827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:03:11 np0005541913.localdomain sudo[55827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:03:11 np0005541913.localdomain sudo[55827]: pam_unix(sudo:session): session closed for user root
Dec 02 08:03:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:03:36 np0005541913.localdomain systemd[1]: tmp-crun.v5C9YS.mount: Deactivated successfully.
Dec 02 08:03:36 np0005541913.localdomain podman[55842]: 2025-12-02 08:03:36.42171774 +0000 UTC m=+0.063716654 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 02 08:03:36 np0005541913.localdomain podman[55842]: 2025-12-02 08:03:36.612663191 +0000 UTC m=+0.254662145 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:46Z, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=)
Dec 02 08:03:36 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:04:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:04:07 np0005541913.localdomain systemd[1]: tmp-crun.fid941.mount: Deactivated successfully.
Dec 02 08:04:07 np0005541913.localdomain podman[55870]: 2025-12-02 08:04:07.454684154 +0000 UTC m=+0.093148411 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_id=tripleo_step1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:04:07 np0005541913.localdomain podman[55870]: 2025-12-02 08:04:07.676795375 +0000 UTC m=+0.315259562 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, 
container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:04:07 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:04:11 np0005541913.localdomain sudo[55900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:04:11 np0005541913.localdomain sudo[55900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:04:11 np0005541913.localdomain sudo[55900]: pam_unix(sudo:session): session closed for user root
Dec 02 08:04:11 np0005541913.localdomain sudo[55915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:04:11 np0005541913.localdomain sudo[55915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:04:12 np0005541913.localdomain sudo[55915]: pam_unix(sudo:session): session closed for user root
Dec 02 08:04:13 np0005541913.localdomain sudo[55962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:04:13 np0005541913.localdomain sudo[55962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:04:13 np0005541913.localdomain sudo[55962]: pam_unix(sudo:session): session closed for user root
Dec 02 08:04:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:04:38 np0005541913.localdomain systemd[1]: tmp-crun.ucPcpL.mount: Deactivated successfully.
Dec 02 08:04:38 np0005541913.localdomain podman[55977]: 2025-12-02 08:04:38.450530849 +0000 UTC m=+0.091501677 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:04:38 np0005541913.localdomain podman[55977]: 2025-12-02 08:04:38.664046879 +0000 UTC m=+0.305017677 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, architecture=x86_64, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd)
Dec 02 08:04:38 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:05:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:05:09 np0005541913.localdomain systemd[1]: tmp-crun.mgmfYw.mount: Deactivated successfully.
Dec 02 08:05:09 np0005541913.localdomain podman[56006]: 2025-12-02 08:05:09.436674314 +0000 UTC m=+0.074865520 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc.)
Dec 02 08:05:09 np0005541913.localdomain podman[56006]: 2025-12-02 08:05:09.637589916 +0000 UTC m=+0.275781102 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Dec 02 08:05:09 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:05:13 np0005541913.localdomain sudo[56036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:05:13 np0005541913.localdomain sudo[56036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:05:13 np0005541913.localdomain sudo[56036]: pam_unix(sudo:session): session closed for user root
Dec 02 08:05:13 np0005541913.localdomain sudo[56051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:05:13 np0005541913.localdomain sudo[56051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:05:13 np0005541913.localdomain sudo[56051]: pam_unix(sudo:session): session closed for user root
Dec 02 08:05:14 np0005541913.localdomain sudo[56097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:05:14 np0005541913.localdomain sudo[56097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:05:14 np0005541913.localdomain sudo[56097]: pam_unix(sudo:session): session closed for user root
Dec 02 08:05:28 np0005541913.localdomain sshd[56112]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 08:05:28 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 21 pg[2.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [4,5,3] r=2 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:30 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 23 pg[3.0( empty local-lis/les=0/0 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [5,4,0] r=2 lpr=23 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:31 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 25 pg[4.0( empty local-lis/les=0/0 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [3,4,5] r=0 lpr=25 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:32 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 26 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [3,4,5] r=0 lpr=25 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:35 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 27 pg[5.0( empty local-lis/les=0/0 n=0 ec=27/27 lis/c=0/0 les/c/f=0/0/0 sis=27) [2,3,4] r=1 lpr=27 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:38 np0005541913.localdomain sshd[56112]: Connection closed by 71.6.199.65 port 51608 [preauth]
Dec 02 08:05:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:05:40 np0005541913.localdomain systemd[1]: tmp-crun.PtyjfM.mount: Deactivated successfully.
Dec 02 08:05:40 np0005541913.localdomain podman[56115]: 2025-12-02 08:05:40.431229503 +0000 UTC m=+0.074759768 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1)
Dec 02 08:05:40 np0005541913.localdomain podman[56115]: 2025-12-02 08:05:40.646552332 +0000 UTC m=+0.290082647 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64)
Dec 02 08:05:40 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:05:41 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 33 pg[2.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=33 pruub=11.261794090s) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active pruub 1117.898559570s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:41 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 33 pg[2.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=33 pruub=11.258399010s) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1117.898559570s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:41 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 33 pg[3.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=33 pruub=13.721211433s) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 active pruub 1124.780761719s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,0], acting [5,4,0] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:41 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 33 pg[3.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=33 pruub=13.718441963s) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1124.780761719s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.18( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.15( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.16( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.13( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.12( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.14( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.11( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.1f( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.1e( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.1c( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.1d( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.1b( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.1a( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.8( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.9( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.5( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.4( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.2( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.1( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.19( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.7( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.6( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.3( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.a( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.b( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.d( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.c( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.e( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.f( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.10( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 34 pg[2.17( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [4,5,3] r=2 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.1b( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.1a( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.18( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.19( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.16( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.13( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.15( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.14( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.17( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.12( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.11( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.f( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.e( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.d( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.1c( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.1( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.c( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.2( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.5( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.4( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.3( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.6( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.7( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.8( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.9( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.a( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.1d( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.b( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.1e( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.1f( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:42 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 34 pg[3.10( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [5,4,0] r=2 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:43 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 35 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=13.097516060s) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active pruub 1121.829223633s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,5], acting [3,4,5] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:43 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 35 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=13.097516060s) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown pruub 1121.829223633s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:43 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 35 pg[5.0( empty local-lis/les=27/28 n=0 ec=27/27 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=15.465136528s) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 active pruub 1124.201416016s@ mbc={}] start_peering_interval up [2,3,4] -> [2,3,4], acting [2,3,4] -> [2,3,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:43 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 35 pg[5.0( empty local-lis/les=27/28 n=0 ec=27/27 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=15.461248398s) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1124.201416016s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.18( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1b( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.18( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1a( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1d( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1c( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.1d( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.e( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.f( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.2( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.3( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.5( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.7( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.6( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.5( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.c( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.b( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.9( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.a( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.8( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.1f( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.1e( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.19( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.1b( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.4( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.1a( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1f( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1e( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.10( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.11( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.2( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.11( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.10( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.12( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.3( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.13( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.12( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.13( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.14( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.1( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.f( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.15( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.e( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.17( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.d( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.19( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.16( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[5.1c( empty local-lis/les=27/28 n=0 ec=35/27 lis/c=27/27 les/c/f=28/28/0 sis=35) [2,3,4] r=1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.15( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.14( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.17( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.9( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.16( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.8( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.d( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.a( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.b( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.c( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.6( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.7( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.4( empty local-lis/les=25/26 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.0( empty local-lis/les=35/36 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.19( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.18( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.e( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1c( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1d( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.f( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1b( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.2( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.3( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.5( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.d( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.4( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.7( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.a( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.c( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.6( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1a( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.b( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.15( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.16( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.12( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.10( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.14( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1e( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.1f( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.17( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.11( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.13( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.8( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:44 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 36 pg[4.9( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=25/25 les/c/f=26/26/0 sis=35) [3,4,5] r=0 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:46 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.0 deep-scrub starts
Dec 02 08:05:46 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.0 deep-scrub ok
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[6.0( empty local-lis/les=0/0 n=0 ec=37/37 lis/c=0/0 les/c/f=0/0/0 sis=37) [0,4,2] r=0 lpr=37 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.654157639s) [0,1,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085693359s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,5], acting [5,4,0] -> [0,1,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.654157639s) [0,1,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.085693359s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.654387474s) [5,4,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085937500s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.654333115s) [5,4,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085937500s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.654165268s) [3,2,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085815430s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,4], acting [5,4,0] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.654087067s) [3,2,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085815430s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653826714s) [3,4,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085571289s@ mbc={}] start_peering_interval up [5,4,0] -> [3,4,5], acting [5,4,0] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.8( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653568268s) [2,0,4] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085571289s@ mbc={}] start_peering_interval up [5,4,0] -> [2,0,4], acting [5,4,0] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.9( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653591156s) [5,1,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085571289s@ mbc={}] start_peering_interval up [5,4,0] -> [5,1,3], acting [5,4,0] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.7( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653493881s) [3,5,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085571289s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,4], acting [5,4,0] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.8( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653387070s) [2,0,4] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085571289s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653527260s) [3,4,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085571289s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.9( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653422356s) [5,1,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085571289s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.7( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653345108s) [3,5,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085571289s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.6( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653051376s) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085327148s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.6( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.653051376s) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.085327148s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.3( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652811050s) [4,0,5] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085205078s@ mbc={}] start_peering_interval up [5,4,0] -> [4,0,5], acting [5,4,0] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.3( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652792931s) [4,0,5] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085205078s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.5( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652827263s) [4,3,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085327148s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.4( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652703285s) [3,2,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085205078s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.2( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652594566s) [3,5,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085083008s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,1], acting [5,4,0] -> [3,5,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.2( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652567863s) [3,5,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085083008s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.5( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652786255s) [4,3,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085327148s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.4( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652657509s) [3,2,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085205078s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652359009s) [4,3,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085083008s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652342796s) [4,3,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085083008s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652385712s) [0,4,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085083008s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,2], acting [5,4,0] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652385712s) [0,4,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.085083008s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651956558s) [1,2,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084960938s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,3], acting [5,4,0] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651874542s) [1,5,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084838867s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,0], acting [5,4,0] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651953697s) [2,4,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084960938s@ mbc={}] start_peering_interval up [5,4,0] -> [2,4,0], acting [5,4,0] -> [2,4,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651887894s) [1,2,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084960938s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651806831s) [1,5,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084838867s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652468681s) [4,3,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085571289s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651900291s) [1,3,2] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084960938s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651865005s) [2,4,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084960938s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.10( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652619362s) [1,5,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.085815430s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,3], acting [5,4,0] -> [1,5,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.10( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652565002s) [1,5,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085815430s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.12( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651395798s) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084716797s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.12( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651395798s) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.084716797s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.13( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651181221s) [1,3,2] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084594727s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.14( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651185036s) [1,2,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084594727s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,0], acting [5,4,0] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651831627s) [1,3,2] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084960938s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.14( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651161194s) [1,2,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084594727s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.13( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651131630s) [1,3,2] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084594727s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.16( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.650981903s) [1,3,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084594727s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,5], acting [5,4,0] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.15( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651112556s) [2,1,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084716797s@ mbc={}] start_peering_interval up [5,4,0] -> [2,1,0], acting [5,4,0] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.16( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.650949478s) [1,3,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084594727s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.15( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651082993s) [2,1,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084716797s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.17( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651031494s) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084716797s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.17( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651031494s) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.084716797s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.19( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.650726318s) [0,1,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084350586s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,2], acting [5,4,0] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651062012s) [5,3,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084838867s@ mbc={}] start_peering_interval up [5,4,0] -> [5,3,4], acting [5,4,0] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.19( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.650726318s) [0,1,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.084350586s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.651031494s) [5,3,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084838867s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.18( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.650574684s) [3,2,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084350586s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.650166512s) [5,4,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1128.084106445s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.18( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.650369644s) [3,2,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084350586s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.1b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.650120735s) [5,4,3] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.084106445s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[3.a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.652303696s) [4,3,5] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.085571289s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.1e( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,1,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.a( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.5( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,4,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.3( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,1,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.19( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.1e( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.19( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.774377823s) [0,1,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.836425781s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,5], acting [2,3,4] -> [0,1,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1b( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.768483162s) [1,0,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.830810547s@ mbc={}] start_peering_interval up [2,3,4] -> [1,0,2], acting [2,3,4] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1b( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.768430710s) [1,0,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.830810547s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1d( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766879082s) [3,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.829223633s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,5], acting [2,3,4] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1d( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766879082s) [3,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.829223633s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.e( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.774645805s) [2,0,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.837158203s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,4], acting [2,3,4] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.e( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.774545670s) [2,0,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.837158203s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.4( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.19( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.774307251s) [0,1,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.836425781s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.3( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772085190s) [0,1,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.835449219s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1a( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772748947s) [2,4,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.834838867s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,3], acting [2,3,4] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.6( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765979767s) [3,1,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.829467773s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,2], acting [2,3,4] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1a( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771361351s) [2,4,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.834838867s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.6( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765979767s) [3,1,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.829467773s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.5( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765991211s) [0,4,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.829589844s@ mbc={}] start_peering_interval up [2,3,4] -> [0,4,5], acting [2,3,4] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.5( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765943527s) [0,4,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.829589844s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.7( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,5,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.3( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772041321s) [0,1,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.835449219s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.2( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,5,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.d( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771343231s) [2,4,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.836425781s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,0], acting [2,3,4] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.d( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771301270s) [2,4,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.836425781s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.b( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.a( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769821167s) [0,2,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.835327148s@ mbc={}] start_peering_interval up [2,3,4] -> [0,2,4], acting [2,3,4] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.c( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765130997s) [3,4,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.830810547s@ mbc={}] start_peering_interval up [2,3,4] -> [3,4,2], acting [2,3,4] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.a( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769781113s) [0,2,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.835327148s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.9( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.768354416s) [1,5,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.833984375s@ mbc={}] start_peering_interval up [2,3,4] -> [1,5,0], acting [2,3,4] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.c( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765130997s) [3,4,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.830810547s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [0,4,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.14( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769432068s) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.836181641s@ mbc={}] start_peering_interval up [2,3,4] -> [3,2,4], acting [2,3,4] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.14( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769432068s) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.836181641s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.9( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.768296242s) [1,5,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.833984375s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.17( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769285202s) [3,5,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.836303711s@ mbc={}] start_peering_interval up [2,3,4] -> [3,5,4], acting [2,3,4] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.17( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769285202s) [3,5,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.836303711s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.8( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766729355s) [2,0,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.833862305s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,1], acting [2,3,4] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.8( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766638756s) [2,0,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.833862305s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1e( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.768626213s) [0,1,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.835815430s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1e( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.768581390s) [0,1,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.835815430s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.16( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769913673s) [1,3,2] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.837280273s@ mbc={}] start_peering_interval up [2,3,4] -> [1,3,2], acting [2,3,4] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.16( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769865036s) [1,3,2] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.837280273s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.11( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.768770218s) [1,2,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.836303711s@ mbc={}] start_peering_interval up [2,3,4] -> [1,2,0], acting [2,3,4] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.11( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.768734932s) [1,2,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.836303711s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.18( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1f( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.779154778s) [2,4,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847656250s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1f( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.779109955s) [2,4,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847656250s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.10( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766963005s) [4,5,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.835815430s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,0], acting [2,3,4] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1f( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766510963s) [4,5,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.835449219s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,3], acting [2,3,4] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1f( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766464233s) [4,5,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.835449219s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1e( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.778610229s) [0,4,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847656250s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.636725426s) [1,5,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705932617s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.636270523s) [5,1,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705688477s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.10( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766930580s) [4,5,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.835815430s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.636182785s) [5,1,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705688477s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1e( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.778536797s) [0,4,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847656250s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.11( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.778059006s) [3,4,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847778320s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,2], acting [3,4,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.11( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.778059006s) [3,4,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.847778320s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.10( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.777619362s) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847656250s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.10( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.777619362s) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.847656250s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.1e( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,4,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.635581970s) [1,2,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705688477s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.635509491s) [1,2,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705688477s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.12( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764682770s) [5,1,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.835083008s@ mbc={}] start_peering_interval up [2,3,4] -> [5,1,3], acting [2,3,4] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.12( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764632225s) [5,1,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.835083008s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.13( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.777222633s) [4,2,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847778320s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.636683464s) [1,5,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705932617s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.13( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.777153015s) [4,2,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847778320s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.634814262s) [3,4,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705566406s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.14( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.634751320s) [4,2,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705566406s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,0], acting [4,5,3] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.14( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.634723663s) [4,2,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705566406s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.12( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.776627541s) [0,5,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847534180s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.12( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.776589394s) [0,5,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847534180s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.635031700s) [5,0,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705444336s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.634213448s) [2,4,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705444336s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,3], acting [4,5,3] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.12( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,5,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.634177208s) [2,4,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705444336s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.15( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775684357s) [5,3,1] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847167969s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,1], acting [3,4,5] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.13( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765464783s) [5,0,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.836303711s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,1], acting [2,3,4] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.634814262s) [3,4,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.705566406s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.13( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764733315s) [5,0,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.836303711s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.15( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764796257s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.836303711s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.15( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764726639s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.836303711s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.12( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.633594513s) [5,3,1] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705322266s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.12( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.633567810s) [5,3,1] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705322266s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.634274483s) [5,0,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705444336s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.14( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775888443s) [5,0,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847656250s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.633378983s) [4,3,2] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705200195s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,2], acting [4,5,3] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.17( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775915146s) [3,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847778320s@ mbc={}] start_peering_interval up [3,4,5] -> [3,1,5], acting [3,4,5] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.633273125s) [4,3,2] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705200195s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.17( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775915146s) [3,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.847778320s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.10( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.633278847s) [2,0,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705322266s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,4], acting [4,5,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.10( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.633233070s) [2,0,4] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705322266s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.15( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775649071s) [5,3,1] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847167969s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.16( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775273323s) [0,4,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847534180s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.16( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775230408s) [0,4,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847534180s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.9( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775558472s) [5,0,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847778320s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.9( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775522232s) [5,0,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847778320s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.632546425s) [3,2,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704956055s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.8( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775401115s) [5,4,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847778320s@ mbc={}] start_peering_interval up [3,4,5] -> [5,4,3], acting [3,4,5] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.14( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775245667s) [5,0,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847656250s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.632546425s) [3,2,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.704956055s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.8( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.775337219s) [5,4,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847778320s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.631945610s) [2,4,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705200195s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.16( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,4,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.631910324s) [2,4,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705200195s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.631632805s) [5,1,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705078125s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.4( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,5,1] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.7( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,5,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.b( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.631601334s) [5,1,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705078125s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.b( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.759953499s) [5,0,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.833496094s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,4], acting [2,3,4] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.b( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.759929657s) [5,0,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.833496094s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.b( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.773355484s) [0,1,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.847045898s@ mbc={}] start_peering_interval up [3,4,5] -> [0,1,5], acting [3,4,5] -> [0,1,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.b( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.773309708s) [0,1,5] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.847045898s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630913734s) [5,1,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704711914s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,0], acting [4,5,3] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630885124s) [5,1,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.704711914s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.d( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772572517s) [4,2,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.846557617s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630796432s) [2,3,1] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704711914s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630754471s) [2,3,1] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.704711914s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.d( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772478104s) [4,2,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.846557617s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.c( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772571564s) [4,3,2] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.846801758s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,2], acting [3,4,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.c( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772540092s) [4,3,2] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.846801758s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.631366730s) [2,0,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705200195s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.4( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.760310173s) [5,3,4] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.834594727s@ mbc={}] start_peering_interval up [2,3,4] -> [5,3,4], acting [2,3,4] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.a( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772434235s) [1,0,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.846801758s@ mbc={}] start_peering_interval up [3,4,5] -> [1,0,2], acting [3,4,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.4( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.760278702s) [5,3,4] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.834594727s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.a( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772386551s) [1,0,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.846801758s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630822182s) [2,0,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705200195s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.5( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766405106s) [1,5,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840820312s@ mbc={}] start_peering_interval up [3,4,5] -> [1,5,0], acting [3,4,5] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.7( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.755390167s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.829833984s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630205154s) [4,3,5] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704589844s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,5], acting [4,5,3] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.7( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.755354881s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.829833984s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.6( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772210121s) [5,3,4] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.846801758s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,4], acting [3,4,5] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630124092s) [4,3,5] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.704589844s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.6( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.772170067s) [5,3,4] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.846801758s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.761236191s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.835815430s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.761205673s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.835815430s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.629482269s) [3,2,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704223633s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630090714s) [4,2,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704956055s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,3], acting [4,5,3] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.5( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.766370773s) [1,5,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840820312s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.629482269s) [3,2,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.704223633s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.630060196s) [4,2,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.704956055s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771953583s) [2,1,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.846923828s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,0], acting [3,4,5] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.629543304s) [3,5,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704589844s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.629543304s) [3,5,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.704589844s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.7( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771477699s) [0,5,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.846557617s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.629158020s) [1,0,2] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704467773s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,2], acting [4,5,3] -> [1,0,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.7( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771430016s) [0,5,4] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.846557617s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.629124641s) [1,0,2] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.704467773s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.4( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771115303s) [0,5,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.846557617s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,1], acting [3,4,5] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.628822327s) [3,2,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704345703s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.4( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771069527s) [0,5,1] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.846557617s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.628822327s) [3,2,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.704345703s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.2( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765193939s) [2,1,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840820312s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.2( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.759393692s) [4,0,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.835083008s@ mbc={}] start_peering_interval up [2,3,4] -> [4,0,2], acting [2,3,4] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.2( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.759354591s) [4,0,2] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.835083008s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.628469467s) [2,0,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704223633s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.2( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.765112877s) [2,1,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840820312s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.628434181s) [2,0,1] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.704223633s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.3( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764847755s) [2,4,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840820312s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.f( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.761003494s) [4,2,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.837036133s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.628332138s) [1,2,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.704345703s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.f( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.760964394s) [4,2,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.837036133s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.628293037s) [1,2,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.704345703s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.622431755s) [3,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.698730469s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,5], acting [4,5,3] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.622431755s) [3,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.698730469s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.f( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764132500s) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840698242s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1c( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764026642s) [2,3,4] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840576172s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,4], acting [3,4,5] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.f( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764132500s) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1125.840698242s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.3( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.764297485s) [2,4,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840820312s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1c( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.763982773s) [2,3,4] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840576172s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.e( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.763723373s) [4,5,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840576172s@ mbc={}] start_peering_interval up [3,4,5] -> [4,5,0], acting [3,4,5] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1c( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.760432243s) [4,2,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.837280273s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,0], acting [2,3,4] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.621959686s) [1,5,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.698730469s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.e( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.763629913s) [4,5,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840576172s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1d( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.763553619s) [2,1,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840576172s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1d( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.763513565s) [2,1,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840576172s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.1c( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.760316849s) [4,2,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.837280273s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.621560097s) [5,4,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.698852539s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,3], acting [4,5,3] -> [5,4,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1a( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769455910s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.846801758s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1a( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.769421577s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.846801758s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1b( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.763136864s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840698242s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.628329277s) [2,1,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705810547s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,0], acting [4,5,3] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1b( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.763062477s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840698242s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.1( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.771667480s) [2,1,0] r=-1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.846923828s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.621520042s) [5,4,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.698852539s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.628097534s) [2,1,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705810547s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.627774239s) [2,4,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705688477s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.627834320s) [3,5,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.705810547s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.627725601s) [2,4,0] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.705688477s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.627834320s) [3,5,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.705810547s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.18( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.762476921s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840576172s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.18( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.762437820s) [4,3,5] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840576172s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.19( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.762310982s) [2,3,1] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.840454102s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,1], acting [3,4,5] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.627837181s) [0,4,2] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active pruub 1123.706054688s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[4.19( empty local-lis/les=35/36 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.762266159s) [2,3,1] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.840454102s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.621759415s) [1,5,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.698730469s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37 pruub=9.627794266s) [0,4,2] r=-1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.706054688s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.18( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.750872612s) [4,2,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active pruub 1125.829101562s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:05:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[5.18( empty local-lis/les=35/36 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37 pruub=11.749888420s) [4,2,3] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.829101562s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.1b( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [1,0,2] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [1,2,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [4,5,0] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.1c( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [1,3,2] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.2( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [4,0,2] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [1,0,2] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.5( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [1,5,0] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [1,0,2] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.5( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [4,3,5] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.9( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [1,5,0] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [1,2,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.11( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [1,2,0] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.a( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [4,3,5] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.d( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [1,2,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.1c( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [4,2,0] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.c( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [4,3,5] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.14( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [4,2,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.10( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [4,5,0] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [2,4,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.10( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [1,5,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [2,4,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.13( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [1,3,2] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.16( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [1,3,5] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.8( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [2,0,1] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.c( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [2,0,1] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [2,1,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [2,0,1] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [2,1,0] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.d( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [2,4,0] r=2 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.e( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [2,0,4] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.10( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [2,0,4] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.1d( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [5,4,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.1a( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [5,3,4] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.1b( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [5,4,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 37 pg[3.9( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [5,1,3] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [5,1,0] r=2 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.b( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [5,0,4] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.9( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [5,0,1] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[4.14( empty local-lis/les=0/0 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [5,0,1] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[5.13( empty local-lis/les=0/0 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [5,0,1] r=1 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 37 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [5,0,4] r=1 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[2.1e( empty local-lis/les=37/38 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,5,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[2.1f( empty local-lis/les=37/38 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [0,4,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[5.19( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[3.1f( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [0,1,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[4.4( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,5,1] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[5.5( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,4,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[4.7( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,5,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[3.6( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[4.b( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[5.3( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,1,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[3.19( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [0,1,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[5.1e( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,1,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[3.1( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [0,4,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[5.a( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[6.0( empty local-lis/les=37/38 n=0 ec=37/37 lis/c=0/0 les/c/f=0/0/0 sis=37) [0,4,2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[3.1e( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[3.4( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[5.6( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,1,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[2.4( empty local-lis/les=37/38 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[2.6( empty local-lis/les=37/38 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[5.c( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,4,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[2.e( empty local-lis/les=37/38 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[5.14( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[4.f( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[3.18( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,2,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[2.19( empty local-lis/les=37/38 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,4,2] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[4.11( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,4,2] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[4.10( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,2,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[5.1d( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[2.9( empty local-lis/les=37/38 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[2.1( empty local-lis/les=37/38 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,5,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[3.2( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,5,1] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[3.7( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,5,4] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[3.b( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [3,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[4.16( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,4,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[4.12( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,5,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[3.12( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[3.17( empty local-lis/les=37/38 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=37) [0,4,5] r=0 lpr=37 pi=[33,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 38 pg[4.1e( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [0,4,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[4.17( empty local-lis/les=37/38 n=0 ec=35/25 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,1,5] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 38 pg[5.17( empty local-lis/les=37/38 n=0 ec=35/27 lis/c=35/35 les/c/f=36/36/0 sis=37) [3,5,4] r=0 lpr=37 pi=[35,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:05:52 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Dec 02 08:05:52 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 39 pg[7.0( empty local-lis/les=0/0 n=0 ec=39/39 lis/c=0/0 les/c/f=0/0/0 sis=39) [1,5,3] r=2 lpr=39 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:05:52 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.1 deep-scrub starts
Dec 02 08:05:54 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.6 deep-scrub starts
Dec 02 08:05:54 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.6 deep-scrub ok
Dec 02 08:05:55 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec 02 08:05:55 np0005541913.localdomain sudo[56145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:05:55 np0005541913.localdomain sudo[56145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:05:55 np0005541913.localdomain sudo[56145]: pam_unix(sudo:session): session closed for user root
Dec 02 08:05:55 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec 02 08:05:57 np0005541913.localdomain sudo[56160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:05:57 np0005541913.localdomain sudo[56160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:05:57 np0005541913.localdomain sudo[56160]: pam_unix(sudo:session): session closed for user root
Dec 02 08:05:58 np0005541913.localdomain sudo[56175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:05:58 np0005541913.localdomain sudo[56175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:05:58 np0005541913.localdomain sudo[56175]: pam_unix(sudo:session): session closed for user root
Dec 02 08:05:58 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.5 deep-scrub starts
Dec 02 08:05:58 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.5 deep-scrub ok
Dec 02 08:06:00 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec 02 08:06:00 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 02 08:06:00 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec 02 08:06:02 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 02 08:06:02 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 02 08:06:03 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.e scrub starts
Dec 02 08:06:03 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.e scrub ok
Dec 02 08:06:06 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Dec 02 08:06:06 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Dec 02 08:06:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:06:11 np0005541913.localdomain podman[56190]: 2025-12-02 08:06:11.440954269 +0000 UTC m=+0.085390097 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Dec 02 08:06:11 np0005541913.localdomain podman[56190]: 2025-12-02 08:06:11.629861836 +0000 UTC m=+0.274297604 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, distribution-scope=public, version=17.1.12)
Dec 02 08:06:11 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:06:12 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec 02 08:06:12 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec 02 08:06:13 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Dec 02 08:06:13 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Dec 02 08:06:14 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.11 deep-scrub starts
Dec 02 08:06:15 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec 02 08:06:15 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 02 08:06:15 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec 02 08:06:18 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 02 08:06:18 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 02 08:06:20 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.17 deep-scrub starts
Dec 02 08:06:20 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec 02 08:06:20 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.17 deep-scrub ok
Dec 02 08:06:20 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec 02 08:06:21 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.a deep-scrub starts
Dec 02 08:06:21 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 02 08:06:21 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.a deep-scrub ok
Dec 02 08:06:25 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec 02 08:06:25 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec 02 08:06:26 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec 02 08:06:26 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec 02 08:06:26 np0005541913.localdomain sudo[56235]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgirvjockvgzrdxoqoirttalowtqxmsq ; /usr/bin/python3
Dec 02 08:06:26 np0005541913.localdomain sudo[56235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:26 np0005541913.localdomain python3[56237]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:06:26 np0005541913.localdomain sudo[56235]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:27 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.1e deep-scrub starts
Dec 02 08:06:27 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.1e deep-scrub ok
Dec 02 08:06:27 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.7 deep-scrub starts
Dec 02 08:06:27 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.7 deep-scrub ok
Dec 02 08:06:28 np0005541913.localdomain sudo[56251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfdtiysutqboqryjfmxtxjovtoqcyopp ; /usr/bin/python3
Dec 02 08:06:28 np0005541913.localdomain sudo[56251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:28 np0005541913.localdomain python3[56253]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:06:28 np0005541913.localdomain sudo[56251]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:29 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Dec 02 08:06:29 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Dec 02 08:06:30 np0005541913.localdomain sudo[56267]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aufxtsdmaayqibsaxpsqvjhtzsflkkmx ; /usr/bin/python3
Dec 02 08:06:30 np0005541913.localdomain sudo[56267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:30 np0005541913.localdomain python3[56269]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:06:30 np0005541913.localdomain sudo[56267]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:31 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec 02 08:06:31 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec 02 08:06:31 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec 02 08:06:31 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec 02 08:06:32 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec 02 08:06:32 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec 02 08:06:33 np0005541913.localdomain sudo[56315]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfswaizkvshjqxlutskryvofejjptxir ; /usr/bin/python3
Dec 02 08:06:33 np0005541913.localdomain sudo[56315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:33 np0005541913.localdomain python3[56317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:06:33 np0005541913.localdomain sudo[56315]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:33 np0005541913.localdomain sudo[56358]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmmfiqajimirmjmbltoyyprluhndgmqj ; /usr/bin/python3
Dec 02 08:06:33 np0005541913.localdomain sudo[56358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:33 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.12 deep-scrub starts
Dec 02 08:06:33 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.12 deep-scrub ok
Dec 02 08:06:33 np0005541913.localdomain python3[56360]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662792.8810189-91322-236075018444877/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=55e6802793866e8195bd7dc6c06395cc4184e741 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:06:33 np0005541913.localdomain sudo[56358]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:38 np0005541913.localdomain sudo[56420]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvjmbyrlwmfxzgtconlpffdssmdxdsks ; /usr/bin/python3
Dec 02 08:06:38 np0005541913.localdomain sudo[56420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:38 np0005541913.localdomain python3[56422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:06:38 np0005541913.localdomain sudo[56420]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:38 np0005541913.localdomain sudo[56463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvxgxpkucvrywdksioxaatxqlsddsxkv ; /usr/bin/python3
Dec 02 08:06:38 np0005541913.localdomain sudo[56463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:38 np0005541913.localdomain python3[56465]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662797.962291-91322-254794161285859/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=32e95cb48a0c881d4099e3645e940da5c77bc88c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:06:38 np0005541913.localdomain sudo[56463]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:40 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.f deep-scrub starts
Dec 02 08:06:40 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.f deep-scrub ok
Dec 02 08:06:41 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 02 08:06:41 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec 02 08:06:41 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 02 08:06:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:06:42 np0005541913.localdomain podman[56480]: 2025-12-02 08:06:42.429527259 +0000 UTC m=+0.074846399 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Dec 02 08:06:42 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 02 08:06:42 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 02 08:06:42 np0005541913.localdomain podman[56480]: 2025-12-02 08:06:42.624357706 +0000 UTC m=+0.269676826 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:06:42 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:06:43 np0005541913.localdomain sudo[56556]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axkgruwnbgtrwjtcfmuwyjobqtbqujdf ; /usr/bin/python3
Dec 02 08:06:43 np0005541913.localdomain sudo[56556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:43 np0005541913.localdomain python3[56558]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:06:43 np0005541913.localdomain sudo[56556]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:43 np0005541913.localdomain sudo[56599]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctjkuqwiegguholmpewtjxlntaqbjqos ; /usr/bin/python3
Dec 02 08:06:43 np0005541913.localdomain sudo[56599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:43 np0005541913.localdomain python3[56601]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662803.0061452-91322-107019927739205/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=ed42d7e7572ec51630a216299b8e7374862502cf backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:06:43 np0005541913.localdomain sudo[56599]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:45 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 45 pg[7.0( v 42'39 (0'0,42'39] local-lis/les=39/40 n=22 ec=39/39 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=11.633689880s) [1,5,3] r=2 lpr=45 pi=[39,45)/1 luod=0'0 lua=42'37 crt=42'39 lcod 42'38 mlcod 0'0 active pruub 1181.808227539s@ mbc={}] start_peering_interval up [1,5,3] -> [1,5,3], acting [1,5,3] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:45 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 45 pg[7.0( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=39/39 lis/c=39/39 les/c/f=40/40/0 sis=45 pruub=11.632221222s) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 lcod 42'38 mlcod 0'0 unknown NOTIFY pruub 1181.808227539s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:45 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 45 pg[6.0( empty local-lis/les=37/38 n=0 ec=37/37 lis/c=37/37 les/c/f=38/38/0 sis=45 pruub=8.942852020s) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active pruub 1183.499755859s@ mbc={}] start_peering_interval up [0,4,2] -> [0,4,2], acting [0,4,2] -> [0,4,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:45 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 45 pg[6.0( empty local-lis/les=37/38 n=0 ec=37/37 lis/c=37/37 les/c/f=38/38/0 sis=45 pruub=8.942852020s) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1183.499755859s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:45 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec 02 08:06:45 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec 02 08:06:45 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec 02 08:06:45 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.18( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1b( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1a( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.3( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.7( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.9( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.19( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.5( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.a( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.16( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.14( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.11( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.12( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.10( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=37/38 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.d( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.c( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.3( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=2 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.5( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=2 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.6( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=2 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.f( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.e( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.2( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=2 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.4( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=2 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.1( v 42'39 (0'0,42'39] local-lis/les=39/40 n=2 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.9( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.b( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.8( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.7( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 46 pg[7.a( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=39/40 n=1 ec=45/39 lis/c=39/39 les/c/f=40/40/0 sis=45) [1,5,3] r=2 lpr=45 pi=[39,45)/1 crt=42'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.5( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.a( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.0( empty local-lis/les=45/46 n=0 ec=37/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.9( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.19( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1b( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.18( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.3( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.10( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.11( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.12( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1a( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.7( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.14( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:46 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 46 pg[6.16( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=37/37 les/c/f=38/38/0 sis=45) [0,4,2] r=0 lpr=45 pi=[37,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:48 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.1e deep-scrub starts
Dec 02 08:06:48 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.1e deep-scrub ok
Dec 02 08:06:48 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.1e deep-scrub starts
Dec 02 08:06:48 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.1e deep-scrub ok
Dec 02 08:06:48 np0005541913.localdomain sudo[56661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdkgrsdssruxyjbrllgposlhwiteuvvr ; /usr/bin/python3
Dec 02 08:06:48 np0005541913.localdomain sudo[56661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:48 np0005541913.localdomain python3[56663]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:06:48 np0005541913.localdomain sudo[56661]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:49 np0005541913.localdomain sudo[56706]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-robqavoyemetrnbxbmebrojdnluxmdfo ; /usr/bin/python3
Dec 02 08:06:49 np0005541913.localdomain sudo[56706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:49 np0005541913.localdomain python3[56708]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662808.4760535-91675-114907344217399/source _original_basename=tmpvt8veae_ follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:06:49 np0005541913.localdomain sudo[56706]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.7( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.776106834s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.223754883s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.7( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.775996208s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1187.223754883s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.b( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.776648521s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.224731445s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.9( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.775749207s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.224121094s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.b( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.776539803s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1187.224731445s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.775081635s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.223388672s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.775035858s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1187.223388672s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.5( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.776295662s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.224853516s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.5( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.776230812s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1187.224853516s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.3( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.775509834s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.224243164s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.3( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.775459290s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1187.224243164s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.1( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.774242401s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.223022461s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.1( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.774214745s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1187.223022461s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.9( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.774991035s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1187.224121094s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.d( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.775052071s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.224243164s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[7.d( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.774992943s) [4,2,3] r=2 lpr=47 pi=[45,47)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1187.224243164s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.1f( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,5,1] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.c( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,1,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.6( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,4,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.4( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,1,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.f( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,5,1] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.14( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,4,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.b( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,1,2] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.13( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,2,1] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.1d( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,5,1] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.11( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,5,4] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.18( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767797470s) [0,1,2] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612915039s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,2], acting [0,4,2] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1a( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.769163132s) [4,2,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614379883s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.18( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767797470s) [0,1,2] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown pruub 1191.612915039s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.f( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766558647s) [3,5,1] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.611816406s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.f( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766497612s) [3,5,1] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.611816406s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1a( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.769031525s) [4,2,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.614379883s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.d( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767814636s) [1,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613281250s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.d( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767749786s) [1,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.613281250s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.3( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767336845s) [4,5,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613037109s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.3( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767288208s) [4,5,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.613037109s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767270088s) [2,1,3] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613037109s@ mbc={}] start_peering_interval up [0,4,2] -> [2,1,3], acting [0,4,2] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767236710s) [2,1,3] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.613037109s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.7( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.768362045s) [4,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614501953s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.6( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766625404s) [3,4,5] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612792969s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.7( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.768322945s) [4,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.614501953s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.6( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766568184s) [3,4,5] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.612792969s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.9( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766295433s) [0,2,4] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612670898s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.9( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766295433s) [0,2,4] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown pruub 1191.612670898s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.e( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.768082619s) [4,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614379883s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.19( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766028404s) [1,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612670898s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.19( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765970230s) [1,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.612670898s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.c( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767256737s) [3,1,5] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613525391s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.4( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767774582s) [3,1,5] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614624023s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.5( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765157700s) [4,2,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.611938477s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.4( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767678261s) [3,1,5] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.614624023s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.5( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765051842s) [4,2,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.611938477s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.c( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766743660s) [3,1,5] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.613525391s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.b( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765673637s) [3,1,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612792969s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,2], acting [0,4,2] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.8( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765687943s) [1,2,3] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612915039s@ mbc={}] start_peering_interval up [0,4,2] -> [1,2,3], acting [0,4,2] -> [1,2,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.8( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765621185s) [1,2,3] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.612915039s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.e( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767791748s) [4,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.614379883s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.b( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765572548s) [3,1,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.612792969s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.a( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764496803s) [4,0,2] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.611816406s@ mbc={}] start_peering_interval up [0,4,2] -> [4,0,2], acting [0,4,2] -> [4,0,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.14( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766647339s) [3,4,5] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614135742s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.14( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766587257s) [3,4,5] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.614135742s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.a( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764464378s) [4,0,2] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.611816406s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.16( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767200470s) [0,1,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614746094s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,5], acting [0,4,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.16( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.767200470s) [0,1,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown pruub 1191.614746094s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.11( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765805244s) [3,5,4] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613525391s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,4], acting [0,4,2] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.13( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765371323s) [3,2,1] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613037109s@ mbc={}] start_peering_interval up [0,4,2] -> [3,2,1], acting [0,4,2] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.11( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765760422s) [3,5,4] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.613525391s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.13( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765332222s) [3,2,1] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.613037109s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1d( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764786720s) [3,5,1] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612670898s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1f( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766614914s) [3,5,1] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614624023s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1d( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764717102s) [3,5,1] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.612670898s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1f( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.766586304s) [3,5,1] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.614624023s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1e( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764067650s) [5,1,3] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612426758s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,3], acting [0,4,2] -> [5,1,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1e( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764037132s) [5,1,3] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.612426758s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.15( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765802383s) [4,5,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614135742s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.15( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765705109s) [4,5,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.614135742s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.10( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764935493s) [0,2,4] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613403320s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.10( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764935493s) [0,2,4] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown pruub 1191.613403320s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.2( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764878273s) [1,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613525391s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1c( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.762330055s) [5,3,4] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.611206055s@ mbc={}] start_peering_interval up [0,4,2] -> [5,3,4], acting [0,4,2] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.12( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764959335s) [5,4,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.613769531s@ mbc={}] start_peering_interval up [0,4,2] -> [5,4,0], acting [0,4,2] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1c( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.762292862s) [5,3,4] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.611206055s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.12( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.764915466s) [5,4,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.613769531s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.17( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765299797s) [5,0,1] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.614624023s@ mbc={}] start_peering_interval up [0,4,2] -> [5,0,1], acting [0,4,2] -> [5,0,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.17( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.765261650s) [5,0,1] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.614624023s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.2( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.763521194s) [1,3,2] r=-1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.613525391s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1b( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.762686729s) [5,1,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active pruub 1191.612792969s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,0], acting [0,4,2] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:49 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 47 pg[6.1b( empty local-lis/les=45/46 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47 pruub=12.762652397s) [5,1,0] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1191.612792969s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain sudo[56768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhvfzvuaecyqxkwufzszdxewbguvwkci ; /usr/bin/python3
Dec 02 08:06:50 np0005541913.localdomain sudo[56768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:50 np0005541913.localdomain python3[56770]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:06:50 np0005541913.localdomain sudo[56768]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.19( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [1,3,2] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.1e( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [5,1,3] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.d( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [1,3,2] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.1( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [2,1,3] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.2( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [1,3,2] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.e( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [4,3,2] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.8( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [1,2,3] r=2 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.1c( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [5,3,4] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 47 pg[6.7( empty local-lis/les=0/0 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [4,3,2] r=1 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 48 pg[6.16( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [0,1,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.c( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,1,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 48 pg[6.10( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [0,2,4] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 48 pg[6.18( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [0,1,2] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 48 pg[6.9( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [0,2,4] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.f( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,5,1] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.4( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,1,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.11( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,5,4] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.6( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,4,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.1d( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,5,1] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.1f( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,5,1] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.14( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,4,5] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.b( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,1,2] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 48 pg[6.13( empty local-lis/les=47/48 n=0 ec=45/37 lis/c=45/45 les/c/f=46/46/0 sis=47) [3,2,1] r=0 lpr=47 pi=[45,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:50 np0005541913.localdomain sudo[56811]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-giljilobsuhqebkijxddsufpxjcujpwh ; /usr/bin/python3
Dec 02 08:06:50 np0005541913.localdomain sudo[56811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:50 np0005541913.localdomain python3[56813]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662810.0898514-91774-79578624410722/source _original_basename=tmpu62kul09 follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:06:50 np0005541913.localdomain sudo[56811]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:51 np0005541913.localdomain sudo[56841]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piruznwsxwqxnuwevrwujzurlgbldgmd ; /usr/bin/python3
Dec 02 08:06:51 np0005541913.localdomain sudo[56841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:51 np0005541913.localdomain python3[56843]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Dec 02 08:06:51 np0005541913.localdomain crontab[56844]: (root) LIST (root)
Dec 02 08:06:51 np0005541913.localdomain crontab[56845]: (root) REPLACE (root)
Dec 02 08:06:51 np0005541913.localdomain sudo[56841]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:51 np0005541913.localdomain sudo[56859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhmtbyrgboyqffyfpnalenjnoxbbbrwu ; /usr/bin/python3
Dec 02 08:06:51 np0005541913.localdomain sudo[56859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:51 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 49 pg[7.2( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=10.703760147s) [3,5,1] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.223388672s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:51 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 49 pg[7.6( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=10.703334808s) [3,5,1] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.223144531s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:51 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 49 pg[7.2( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=10.703760147s) [3,5,1] r=0 lpr=49 pi=[45,49)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1187.223388672s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:51 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 49 pg[7.6( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=10.703334808s) [3,5,1] r=0 lpr=49 pi=[45,49)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1187.223144531s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:51 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 49 pg[7.a( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=10.703518867s) [3,5,1] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.223754883s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:51 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 49 pg[7.a( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=10.703518867s) [3,5,1] r=0 lpr=49 pi=[45,49)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1187.223754883s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:51 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 49 pg[7.e( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=10.703692436s) [3,5,1] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1187.224243164s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:51 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 49 pg[7.e( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=10.703692436s) [3,5,1] r=0 lpr=49 pi=[45,49)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1187.224243164s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:51 np0005541913.localdomain python3[56861]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:06:51 np0005541913.localdomain sudo[56859]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:52 np0005541913.localdomain sudo[56909]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqlzzkoguiczvgedutvayisfjatjujui ; /usr/bin/python3
Dec 02 08:06:52 np0005541913.localdomain sudo[56909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:52 np0005541913.localdomain sudo[56909]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:52 np0005541913.localdomain sudo[56927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybprwhuybnyqcrugeemvtwakgovxulbu ; /usr/bin/python3
Dec 02 08:06:52 np0005541913.localdomain sudo[56927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:52 np0005541913.localdomain sudo[56927]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:52 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 50 pg[7.a( v 42'39 (0'0,42'39] local-lis/les=49/50 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,5,1] r=0 lpr=49 pi=[45,49)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:52 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 50 pg[7.6( v 42'39 (0'0,42'39] local-lis/les=49/50 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,5,1] r=0 lpr=49 pi=[45,49)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:52 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 50 pg[7.2( v 42'39 (0'0,42'39] local-lis/les=49/50 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,5,1] r=0 lpr=49 pi=[45,49)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:52 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 50 pg[7.e( v 42'39 (0'0,42'39] local-lis/les=49/50 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,5,1] r=0 lpr=49 pi=[45,49)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:06:53 np0005541913.localdomain sudo[57031]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lomquirffsnkqirteptwhwibhpwilnan ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662812.6819944-91863-12133943783798/async_wrapper.py 49553956186 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662812.6819944-91863-12133943783798/AnsiballZ_command.py _
Dec 02 08:06:53 np0005541913.localdomain sudo[57031]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 08:06:53 np0005541913.localdomain ansible-async_wrapper.py[57033]: Invoked with 49553956186 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764662812.6819944-91863-12133943783798/AnsiballZ_command.py _
Dec 02 08:06:53 np0005541913.localdomain ansible-async_wrapper.py[57036]: Starting module and watcher
Dec 02 08:06:53 np0005541913.localdomain ansible-async_wrapper.py[57036]: Start watching 57037 (3600)
Dec 02 08:06:53 np0005541913.localdomain ansible-async_wrapper.py[57037]: Start module (57037)
Dec 02 08:06:53 np0005541913.localdomain ansible-async_wrapper.py[57033]: Return async_wrapper task started.
Dec 02 08:06:53 np0005541913.localdomain sudo[57031]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:53 np0005541913.localdomain sudo[57052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehjcgplrmyaqztewqxjchludcfxmpzzr ; /usr/bin/python3
Dec 02 08:06:53 np0005541913.localdomain sudo[57052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:06:53 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec 02 08:06:53 np0005541913.localdomain python3[57057]: ansible-ansible.legacy.async_status Invoked with jid=49553956186.57033 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:06:53 np0005541913.localdomain sudo[57052]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:55 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec 02 08:06:55 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 02 08:06:55 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 02 08:06:56 np0005541913.localdomain puppet-user[57056]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:06:56 np0005541913.localdomain puppet-user[57056]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:06:56 np0005541913.localdomain puppet-user[57056]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:06:56 np0005541913.localdomain puppet-user[57056]:    (file & line not available)
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:    (file & line not available)
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.12 seconds
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Notice: Applied catalog in 0.04 seconds
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Application:
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:    Initial environment: production
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:    Converged environment: production
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:          Run mode: user
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Changes:
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Events:
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Resources:
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:             Total: 10
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Time:
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:          Schedule: 0.00
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:              File: 0.00
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:              Exec: 0.01
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:            Augeas: 0.01
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:    Transaction evaluation: 0.03
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:    Catalog application: 0.04
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:    Config retrieval: 0.15
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:          Last run: 1764662817
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:        Filebucket: 0.00
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:             Total: 0.05
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]: Version:
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:            Config: 1764662816
Dec 02 08:06:57 np0005541913.localdomain puppet-user[57056]:            Puppet: 7.10.0
Dec 02 08:06:57 np0005541913.localdomain ansible-async_wrapper.py[57037]: Module complete (57037)
Dec 02 08:06:57 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec 02 08:06:57 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec 02 08:06:58 np0005541913.localdomain ansible-async_wrapper.py[57036]: Done in kid B.
Dec 02 08:06:58 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Dec 02 08:06:58 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Dec 02 08:06:58 np0005541913.localdomain sudo[57169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:06:58 np0005541913.localdomain sudo[57169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:06:58 np0005541913.localdomain sudo[57169]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:58 np0005541913.localdomain sudo[57184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 08:06:58 np0005541913.localdomain sudo[57184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:06:59 np0005541913.localdomain sudo[57184]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:59 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 51 pg[7.7( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=51 pruub=15.033974648s) [3,4,2] r=0 lpr=51 pi=[47,51)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1199.529296875s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:59 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 51 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/49/0 sis=51 pruub=15.025829315s) [3,4,2] r=0 lpr=51 pi=[47,51)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1199.521118164s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:59 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 51 pg[7.3( v 42'39 (0'0,42'39] local-lis/les=47/48 n=2 ec=45/39 lis/c=47/47 les/c/f=48/49/0 sis=51 pruub=15.033367157s) [3,4,2] r=0 lpr=51 pi=[47,51)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1199.528564453s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:59 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 51 pg[7.b( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/48/0 sis=51 pruub=15.025732040s) [3,4,2] r=0 lpr=51 pi=[47,51)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1199.521118164s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:06:59 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 51 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/49/0 sis=51 pruub=15.025829315s) [3,4,2] r=0 lpr=51 pi=[47,51)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1199.521118164s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:59 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 51 pg[7.3( v 42'39 (0'0,42'39] local-lis/les=47/48 n=2 ec=45/39 lis/c=47/47 les/c/f=48/49/0 sis=51 pruub=15.033367157s) [3,4,2] r=0 lpr=51 pi=[47,51)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1199.528564453s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:59 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 51 pg[7.b( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/48/0 sis=51 pruub=15.025732040s) [3,4,2] r=0 lpr=51 pi=[47,51)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1199.521118164s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:59 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 51 pg[7.7( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=51 pruub=15.033974648s) [3,4,2] r=0 lpr=51 pi=[47,51)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1199.529296875s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:06:59 np0005541913.localdomain sudo[57220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:06:59 np0005541913.localdomain sudo[57220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:06:59 np0005541913.localdomain sudo[57220]: pam_unix(sudo:session): session closed for user root
Dec 02 08:06:59 np0005541913.localdomain sudo[57235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:06:59 np0005541913.localdomain sudo[57235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:07:00 np0005541913.localdomain sudo[57235]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:00 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 52 pg[7.b( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=47/47 les/c/f=48/48/0 sis=51) [3,4,2] r=0 lpr=51 pi=[47,51)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:00 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 52 pg[7.7( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=51) [3,4,2] r=0 lpr=51 pi=[47,51)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:00 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 52 pg[7.3( v 42'39 (0'0,42'39] local-lis/les=51/52 n=2 ec=45/39 lis/c=47/47 les/c/f=48/49/0 sis=51) [3,4,2] r=0 lpr=51 pi=[47,51)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:00 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 52 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=47/47 les/c/f=48/49/0 sis=51) [3,4,2] r=0 lpr=51 pi=[47,51)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:00 np0005541913.localdomain sudo[57282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:07:00 np0005541913.localdomain sudo[57282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:07:00 np0005541913.localdomain sudo[57282]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:01 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 53 pg[7.c( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=53) [0,1,2] r=0 lpr=53 pi=[45,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:01 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 53 pg[7.4( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=8.593282700s) [0,1,2] r=-1 lpr=53 pi=[45,53)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1195.224121094s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:01 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 53 pg[7.c( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=8.592213631s) [0,1,2] r=-1 lpr=53 pi=[45,53)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1195.223144531s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:01 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 53 pg[7.4( v 42'39 (0'0,42'39] local-lis/les=45/46 n=2 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=8.593200684s) [0,1,2] r=-1 lpr=53 pi=[45,53)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1195.224121094s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:01 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 53 pg[7.c( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=8.592172623s) [0,1,2] r=-1 lpr=53 pi=[45,53)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1195.223144531s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:01 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 53 pg[7.4( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=53) [0,1,2] r=0 lpr=53 pi=[45,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:02 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 54 pg[7.4( v 42'39 lc 42'15 (0'0,42'39] local-lis/les=53/54 n=4 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=53) [0,1,2] r=0 lpr=53 pi=[45,53)/1 crt=42'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(1+2)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:02 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 54 pg[7.c( v 42'39 lc 42'17 (0'0,42'39] local-lis/les=53/54 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=53) [0,1,2] r=0 lpr=53 pi=[45,53)/1 crt=42'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:03 np0005541913.localdomain sudo[57310]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xudndjfffcnmeswsrvetkwpddhbbpmfp ; /usr/bin/python3
Dec 02 08:07:03 np0005541913.localdomain sudo[57310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:03 np0005541913.localdomain python3[57312]: ansible-ansible.legacy.async_status Invoked with jid=49553956186.57033 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:07:03 np0005541913.localdomain sudo[57310]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:04 np0005541913.localdomain sudo[57326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuophjvgberjopfztnpgyjlhodfsecoo ; /usr/bin/python3
Dec 02 08:07:04 np0005541913.localdomain sudo[57326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:04 np0005541913.localdomain python3[57328]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 08:07:04 np0005541913.localdomain sudo[57326]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:04 np0005541913.localdomain sudo[57342]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvvxcrmvgufpgvrsufbqpazwtnkzghek ; /usr/bin/python3
Dec 02 08:07:04 np0005541913.localdomain sudo[57342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:04 np0005541913.localdomain python3[57344]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:07:04 np0005541913.localdomain sudo[57342]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:05 np0005541913.localdomain sudo[57392]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icnnvgmrgztwxfndzrcjtynwstxjsivc ; /usr/bin/python3
Dec 02 08:07:05 np0005541913.localdomain sudo[57392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:05 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Dec 02 08:07:05 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.1f scrub ok
Dec 02 08:07:05 np0005541913.localdomain python3[57394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:07:05 np0005541913.localdomain sudo[57392]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:05 np0005541913.localdomain sudo[57410]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgqcijzgvhvgvjlkimnjlhgmmkkzvltj ; /usr/bin/python3
Dec 02 08:07:05 np0005541913.localdomain sudo[57410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:05 np0005541913.localdomain python3[57412]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpqxu7g064 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 08:07:05 np0005541913.localdomain sudo[57410]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:05 np0005541913.localdomain sudo[57440]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgksoxhbdlnbmsypjuujdjaitxasgpku ; /usr/bin/python3
Dec 02 08:07:05 np0005541913.localdomain sudo[57440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:06 np0005541913.localdomain python3[57442]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:06 np0005541913.localdomain sudo[57440]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:06 np0005541913.localdomain sudo[57456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biypxbxxqgzdtnivtqtqvhazcfwdlxiz ; /usr/bin/python3
Dec 02 08:07:06 np0005541913.localdomain sudo[57456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:06 np0005541913.localdomain sudo[57456]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:07 np0005541913.localdomain sudo[57544]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgirrbkibhgskrqlmgpwfvgpdnurxvtj ; /usr/bin/python3
Dec 02 08:07:07 np0005541913.localdomain sudo[57544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:07 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 02 08:07:07 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 02 08:07:07 np0005541913.localdomain python3[57546]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 02 08:07:07 np0005541913.localdomain sudo[57544]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:08 np0005541913.localdomain sudo[57563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkpysykocebyqcoebkzbvwcmsiifogcl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:08 np0005541913.localdomain sudo[57563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:08 np0005541913.localdomain python3[57565]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:08 np0005541913.localdomain sudo[57563]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:08 np0005541913.localdomain sudo[57579]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klwuckabhramujosgnkvqxplseipyroi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:08 np0005541913.localdomain sudo[57579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:08 np0005541913.localdomain sudo[57579]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:08 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.0 deep-scrub starts
Dec 02 08:07:08 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.0 deep-scrub ok
Dec 02 08:07:09 np0005541913.localdomain sudo[57595]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdvkxxhomsflxskojydlrlgfddnqfmob ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:09 np0005541913.localdomain sudo[57595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:09 np0005541913.localdomain python3[57597]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:07:09 np0005541913.localdomain sudo[57595]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:09 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 02 08:07:09 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 02 08:07:09 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 55 pg[7.5( v 42'39 (0'0,42'39] local-lis/les=47/48 n=2 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=55 pruub=12.927244186s) [2,0,4] r=-1 lpr=55 pi=[47,55)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1207.528686523s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:09 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 55 pg[7.d( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=55 pruub=12.933298111s) [2,0,4] r=-1 lpr=55 pi=[47,55)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1207.534790039s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:09 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 55 pg[7.5( v 42'39 (0'0,42'39] local-lis/les=47/48 n=2 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=55 pruub=12.927161217s) [2,0,4] r=-1 lpr=55 pi=[47,55)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.528686523s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:09 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 55 pg[7.d( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=55 pruub=12.933200836s) [2,0,4] r=-1 lpr=55 pi=[47,55)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.534790039s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:09 np0005541913.localdomain sudo[57645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyfesjojuvopewaygizkbujksurhawjt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:09 np0005541913.localdomain sudo[57645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:09 np0005541913.localdomain python3[57647]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:07:09 np0005541913.localdomain sudo[57645]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:09 np0005541913.localdomain sudo[57663]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzrbhqvbisndtjttdnrmujgtsittfjhq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:09 np0005541913.localdomain sudo[57663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:10 np0005541913.localdomain python3[57665]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:10 np0005541913.localdomain sudo[57663]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:10 np0005541913.localdomain sudo[57725]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwjdlybjuohppiglihxgrdyzpykwwszn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:10 np0005541913.localdomain sudo[57725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:10 np0005541913.localdomain python3[57727]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:07:10 np0005541913.localdomain sudo[57725]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:10 np0005541913.localdomain sudo[57743]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpqrjfynvqdonkbddewugeyqgcyqkaxk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:10 np0005541913.localdomain sudo[57743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:07:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4224 writes, 19K keys, 4224 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4224 writes, 344 syncs, 12.28 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 965 writes, 3586 keys, 965 commit groups, 1.0 writes per commit group, ingest: 1.55 MB, 0.00 MB/s
                                                          Interval WAL: 965 writes, 199 syncs, 4.85 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 02 08:07:10 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 55 pg[7.d( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=55) [2,0,4] r=1 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:10 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 55 pg[7.5( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=47/47 les/c/f=48/50/0 sis=55) [2,0,4] r=1 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:10 np0005541913.localdomain python3[57745]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:10 np0005541913.localdomain sudo[57743]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:11 np0005541913.localdomain sudo[57805]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsmpsgikhgomeueolfirunfkfpnvzhix ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:11 np0005541913.localdomain sudo[57805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:11 np0005541913.localdomain python3[57807]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:07:11 np0005541913.localdomain sudo[57805]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:11 np0005541913.localdomain sudo[57823]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igrkuowaqqtedtbfqawuyculihjiuzvo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:11 np0005541913.localdomain sudo[57823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:11 np0005541913.localdomain python3[57825]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:11 np0005541913.localdomain sudo[57823]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:11 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 57 pg[7.e( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=57) [0,4,5] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:11 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 57 pg[7.6( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=57) [0,4,5] r=0 lpr=57 pi=[49,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:11 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 57 pg[7.6( v 42'39 (0'0,42'39] local-lis/les=49/50 n=2 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.969664574s) [0,4,5] r=-1 lpr=57 pi=[49,57)/1 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1209.587890625s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:11 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 57 pg[7.6( v 42'39 (0'0,42'39] local-lis/les=49/50 n=2 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.969498634s) [0,4,5] r=-1 lpr=57 pi=[49,57)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1209.587890625s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:11 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 57 pg[7.e( v 42'39 (0'0,42'39] local-lis/les=49/50 n=1 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.969781876s) [0,4,5] r=-1 lpr=57 pi=[49,57)/1 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1209.589599609s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:11 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 57 pg[7.e( v 42'39 (0'0,42'39] local-lis/les=49/50 n=1 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=12.969389915s) [0,4,5] r=-1 lpr=57 pi=[49,57)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1209.589599609s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:12 np0005541913.localdomain sudo[57885]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blfpuglfkeqoomqskbpjxveyptscqdvk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:12 np0005541913.localdomain sudo[57885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:12 np0005541913.localdomain python3[57887]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:07:12 np0005541913.localdomain sudo[57885]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:12 np0005541913.localdomain sudo[57903]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvfggtcbcnkyikhepmgrtymwxezsangy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:12 np0005541913.localdomain sudo[57903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:12 np0005541913.localdomain python3[57905]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:12 np0005541913.localdomain sudo[57903]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:12 np0005541913.localdomain sudo[57933]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycdsvcyuqvtbwuhleszogihkwppbcohr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:12 np0005541913.localdomain sudo[57933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:07:12 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 58 pg[7.e( v 42'39 lc 42'19 (0'0,42'39] local-lis/les=57/58 n=1 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=57) [0,4,5] r=0 lpr=57 pi=[49,57)/1 crt=42'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:12 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 58 pg[7.6( v 42'39 lc 0'0 (0'0,42'39] local-lis/les=57/58 n=2 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=57) [0,4,5] r=0 lpr=57 pi=[49,57)/1 crt=42'39 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:12 np0005541913.localdomain systemd[1]: tmp-crun.Sb6vs7.mount: Deactivated successfully.
Dec 02 08:07:12 np0005541913.localdomain podman[57936]: 2025-12-02 08:07:12.807536816 +0000 UTC m=+0.100076687 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=metrics_qdr, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:07:12 np0005541913.localdomain python3[57935]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:07:12 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:07:13 np0005541913.localdomain podman[57936]: 2025-12-02 08:07:13.002936109 +0000 UTC m=+0.295475940 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:07:13 np0005541913.localdomain systemd-rc-local-generator[57993]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:07:13 np0005541913.localdomain systemd-sysv-generator[57997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:07:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:07:13 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:07:13 np0005541913.localdomain sudo[57933]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:13 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 02 08:07:13 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 02 08:07:13 np0005541913.localdomain sudo[58049]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmtpagvuclpetvddhpgpvebfdltofeof ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:13 np0005541913.localdomain sudo[58049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:13 np0005541913.localdomain python3[58051]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:07:13 np0005541913.localdomain sudo[58049]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:13 np0005541913.localdomain sudo[58067]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adzevgedebcswxprogyfnitdkgzfgsyt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:13 np0005541913.localdomain sudo[58067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:14 np0005541913.localdomain python3[58069]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:14 np0005541913.localdomain sudo[58067]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:14 np0005541913.localdomain sudo[58129]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqorbitamicqstgbytlwsjqgtqvkdxum ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:14 np0005541913.localdomain sudo[58129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:14 np0005541913.localdomain python3[58131]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:07:14 np0005541913.localdomain sudo[58129]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:14 np0005541913.localdomain sudo[58147]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqmkofvbmqsautcgordxrfhggtnhnzbj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:14 np0005541913.localdomain sudo[58147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:14 np0005541913.localdomain python3[58149]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:14 np0005541913.localdomain sudo[58147]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:07:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Cumulative writes: 4992 writes, 22K keys, 4992 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4992 writes, 517 syncs, 9.66 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1604 writes, 5649 keys, 1604 commit groups, 1.0 writes per commit group, ingest: 2.18 MB, 0.00 MB/s
                                                          Interval WAL: 1604 writes, 319 syncs, 5.03 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 02 08:07:15 np0005541913.localdomain sudo[58177]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jafxalwqyllidmwtvftcephxmltqzzvn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:15 np0005541913.localdomain sudo[58177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:15 np0005541913.localdomain python3[58179]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:07:15 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:07:15 np0005541913.localdomain systemd-sysv-generator[58208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:07:15 np0005541913.localdomain systemd-rc-local-generator[58204]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:07:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:07:15 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 08:07:15 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 08:07:15 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 08:07:15 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 08:07:15 np0005541913.localdomain sudo[58177]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:16 np0005541913.localdomain sudo[58235]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvepzrzhhvwtjpoiijgoxjxtcxbnmgvu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:16 np0005541913.localdomain sudo[58235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:16 np0005541913.localdomain python3[58237]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 02 08:07:16 np0005541913.localdomain sudo[58235]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:16 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Dec 02 08:07:16 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Dec 02 08:07:16 np0005541913.localdomain sudo[58251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtsaemoaxsvsgawdxnrbtdwsoyxalujn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:16 np0005541913.localdomain sudo[58251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:17 np0005541913.localdomain sudo[58251]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:17 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec 02 08:07:17 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec 02 08:07:17 np0005541913.localdomain sudo[58293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biuxqnccmmapawtymazaqcwpahxbxqlp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:07:17 np0005541913.localdomain sudo[58293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:17 np0005541913.localdomain python3[58295]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 02 08:07:18 np0005541913.localdomain podman[58373]: 2025-12-02 08:07:18.30208335 +0000 UTC m=+0.102201665 container create b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step2, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z)
Dec 02 08:07:18 np0005541913.localdomain podman[58380]: 2025-12-02 08:07:18.319773782 +0000 UTC m=+0.100952531 container create c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, config_id=tripleo_step2, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, container_name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 02 08:07:18 np0005541913.localdomain podman[58373]: 2025-12-02 08:07:18.247229116 +0000 UTC m=+0.047347461 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 08:07:18 np0005541913.localdomain systemd[1]: Started libpod-conmon-b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8.scope.
Dec 02 08:07:18 np0005541913.localdomain systemd[1]: Started libpod-conmon-c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1.scope.
Dec 02 08:07:18 np0005541913.localdomain podman[58380]: 2025-12-02 08:07:18.256365584 +0000 UTC m=+0.037544343 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:07:18 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:07:18 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:07:18 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae00fa9a5dde399a6e7e6528b55d78146560e04771ac7245c19ea55518318121/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Dec 02 08:07:18 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/977574eb305bf7cb5c1e2fdd973144cbfe7638acd584e0968bf15c31ee49846c/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:07:18 np0005541913.localdomain podman[58373]: 2025-12-02 08:07:18.39387645 +0000 UTC m=+0.193994735 container init b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, container_name=nova_compute_init_log, config_id=tripleo_step2, build-date=2025-11-19T00:36:58Z, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:07:18 np0005541913.localdomain podman[58373]: 2025-12-02 08:07:18.405766525 +0000 UTC m=+0.205884810 container start b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_compute_init_log, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:07:18 np0005541913.localdomain python3[58295]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764661676 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Dec 02 08:07:18 np0005541913.localdomain systemd[1]: libpod-b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8.scope: Deactivated successfully.
Dec 02 08:07:18 np0005541913.localdomain podman[58380]: 2025-12-02 08:07:18.443536903 +0000 UTC m=+0.224715622 container init c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud_init_logs, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:07:18 np0005541913.localdomain podman[58380]: 2025-12-02 08:07:18.455305704 +0000 UTC m=+0.236484423 container start c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, config_id=tripleo_step2, build-date=2025-11-19T00:35:22Z, release=1761123044)
Dec 02 08:07:18 np0005541913.localdomain python3[58295]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764661676 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Dec 02 08:07:18 np0005541913.localdomain systemd[1]: libpod-c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1.scope: Deactivated successfully.
Dec 02 08:07:18 np0005541913.localdomain podman[58411]: 2025-12-02 08:07:18.474362473 +0000 UTC m=+0.052377368 container died b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-nova-compute, container_name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 02 08:07:18 np0005541913.localdomain podman[58435]: 2025-12-02 08:07:18.519441581 +0000 UTC m=+0.044010340 container died c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step2, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_virtqemud_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git)
Dec 02 08:07:18 np0005541913.localdomain podman[58435]: 2025-12-02 08:07:18.54987128 +0000 UTC m=+0.074440019 container cleanup c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, config_id=tripleo_step2, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_virtqemud_init_logs, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:07:18 np0005541913.localdomain systemd[1]: libpod-conmon-c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1.scope: Deactivated successfully.
Dec 02 08:07:18 np0005541913.localdomain podman[58412]: 2025-12-02 08:07:18.568172659 +0000 UTC m=+0.138629608 container cleanup b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_compute_init_log, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Dec 02 08:07:18 np0005541913.localdomain systemd[1]: libpod-conmon-b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8.scope: Deactivated successfully.
Dec 02 08:07:18 np0005541913.localdomain podman[58561]: 2025-12-02 08:07:18.977793447 +0000 UTC m=+0.083502516 container create 6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, config_id=tripleo_step2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=create_virtlogd_wrapper, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:07:19 np0005541913.localdomain podman[58562]: 2025-12-02 08:07:19.009288685 +0000 UTC m=+0.105834825 container create db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=create_haproxy_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step2)
Dec 02 08:07:19 np0005541913.localdomain systemd[1]: Started libpod-conmon-6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f.scope.
Dec 02 08:07:19 np0005541913.localdomain podman[58561]: 2025-12-02 08:07:18.929418239 +0000 UTC m=+0.035127338 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:07:19 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:07:19 np0005541913.localdomain systemd[1]: Started libpod-conmon-db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d.scope.
Dec 02 08:07:19 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a55b3f6a1664d44c5b07239ef1b7cbc40e1e222dc48149b5fd43766f0362bb85/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 02 08:07:19 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:07:19 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/368ab18e82291a20f6c3548bd942730bbcde8a3da90171ac2014bcabec91a7fe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 08:07:19 np0005541913.localdomain podman[58561]: 2025-12-02 08:07:19.046492438 +0000 UTC m=+0.152201507 container init 6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, distribution-scope=public, release=1761123044)
Dec 02 08:07:19 np0005541913.localdomain podman[58562]: 2025-12-02 08:07:18.94965207 +0000 UTC m=+0.046198260 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 02 08:07:19 np0005541913.localdomain podman[58562]: 2025-12-02 08:07:19.051571736 +0000 UTC m=+0.148117866 container init db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, container_name=create_haproxy_wrapper, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step2, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:07:19 np0005541913.localdomain podman[58561]: 2025-12-02 08:07:19.058845455 +0000 UTC m=+0.164554524 container start 6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, batch=17.1_20251118.1)
Dec 02 08:07:19 np0005541913.localdomain podman[58561]: 2025-12-02 08:07:19.059387489 +0000 UTC m=+0.165096568 container attach 6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=create_virtlogd_wrapper, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true)
Dec 02 08:07:19 np0005541913.localdomain podman[58562]: 2025-12-02 08:07:19.06053027 +0000 UTC m=+0.157076370 container start db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step2, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']})
Dec 02 08:07:19 np0005541913.localdomain podman[58562]: 2025-12-02 08:07:19.060789717 +0000 UTC m=+0.157335897 container attach db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=create_haproxy_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step2)
Dec 02 08:07:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ae00fa9a5dde399a6e7e6528b55d78146560e04771ac7245c19ea55518318121-merged.mount: Deactivated successfully.
Dec 02 08:07:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4991e2f61b871b06b61f9e6a52ca78b603ef5fc80d3f5d4703835ea958583b1-userdata-shm.mount: Deactivated successfully.
Dec 02 08:07:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-977574eb305bf7cb5c1e2fdd973144cbfe7638acd584e0968bf15c31ee49846c-merged.mount: Deactivated successfully.
Dec 02 08:07:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8fad67557c944893e6ba5b70593fb712d6466a579fce129429ec2894b1f6ad8-userdata-shm.mount: Deactivated successfully.
Dec 02 08:07:19 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 59 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=12.804286003s) [1,5,3] r=2 lpr=59 pi=[51,59)/1 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1217.578247070s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:19 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 59 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=12.804210663s) [1,5,3] r=2 lpr=59 pi=[51,59)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1217.578247070s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:19 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 59 pg[7.7( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=12.800751686s) [1,5,3] r=2 lpr=59 pi=[51,59)/1 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1217.574951172s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:19 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 59 pg[7.7( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=12.800600052s) [1,5,3] r=2 lpr=59 pi=[51,59)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1217.574951172s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:20 np0005541913.localdomain ovs-vsctl[58678]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Dec 02 08:07:21 np0005541913.localdomain systemd[1]: libpod-6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f.scope: Deactivated successfully.
Dec 02 08:07:21 np0005541913.localdomain systemd[1]: libpod-6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f.scope: Consumed 2.096s CPU time.
Dec 02 08:07:21 np0005541913.localdomain podman[58561]: 2025-12-02 08:07:21.152087285 +0000 UTC m=+2.257796334 container died 6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, architecture=x86_64, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044)
Dec 02 08:07:21 np0005541913.localdomain systemd[1]: tmp-crun.n28ABt.mount: Deactivated successfully.
Dec 02 08:07:21 np0005541913.localdomain podman[58815]: 2025-12-02 08:07:21.248135691 +0000 UTC m=+0.083271079 container cleanup 6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, container_name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:07:21 np0005541913.localdomain systemd[1]: libpod-conmon-6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f.scope: Deactivated successfully.
Dec 02 08:07:21 np0005541913.localdomain python3[58295]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764661676 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Dec 02 08:07:21 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 61 pg[7.8( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=12.415328026s) [3,4,5] r=0 lpr=61 pi=[45,61)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1219.225830078s@ mbc={}] start_peering_interval up [1,5,3] -> [3,4,5], acting [1,5,3] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:21 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 61 pg[7.8( v 42'39 (0'0,42'39] local-lis/les=45/46 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=12.415328026s) [3,4,5] r=0 lpr=61 pi=[45,61)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1219.225830078s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:22 np0005541913.localdomain systemd[1]: libpod-db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d.scope: Deactivated successfully.
Dec 02 08:07:22 np0005541913.localdomain systemd[1]: libpod-db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d.scope: Consumed 2.067s CPU time.
Dec 02 08:07:22 np0005541913.localdomain podman[58562]: 2025-12-02 08:07:22.016707658 +0000 UTC m=+3.113253768 container died db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=create_haproxy_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, config_id=tripleo_step2, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:07:22 np0005541913.localdomain podman[58853]: 2025-12-02 08:07:22.082186991 +0000 UTC m=+0.055389000 container cleanup db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, url=https://www.redhat.com, container_name=create_haproxy_wrapper, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step2, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:07:22 np0005541913.localdomain systemd[1]: libpod-conmon-db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d.scope: Deactivated successfully.
Dec 02 08:07:22 np0005541913.localdomain python3[58295]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Dec 02 08:07:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-368ab18e82291a20f6c3548bd942730bbcde8a3da90171ac2014bcabec91a7fe-merged.mount: Deactivated successfully.
Dec 02 08:07:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db624006aefb080ca6821323538acdc984433e40a65b9a8202590ed74cf3036d-userdata-shm.mount: Deactivated successfully.
Dec 02 08:07:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a55b3f6a1664d44c5b07239ef1b7cbc40e1e222dc48149b5fd43766f0362bb85-merged.mount: Deactivated successfully.
Dec 02 08:07:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f-userdata-shm.mount: Deactivated successfully.
Dec 02 08:07:22 np0005541913.localdomain sudo[58293]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:22 np0005541913.localdomain sudo[58906]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyotzugysjjckepfsyxbqglpxagkmops ; /usr/bin/python3
Dec 02 08:07:22 np0005541913.localdomain sudo[58906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:22 np0005541913.localdomain python3[58908]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:22 np0005541913.localdomain sudo[58906]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:22 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Dec 02 08:07:22 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Dec 02 08:07:22 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 62 pg[7.8( v 42'39 (0'0,42'39] local-lis/les=61/62 n=1 ec=45/39 lis/c=45/45 les/c/f=46/46/0 sis=61) [3,4,5] r=0 lpr=61 pi=[45,61)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:23 np0005541913.localdomain sudo[58954]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfachvxpkbpgpprmndjmvyqcawmwtsfz ; /usr/bin/python3
Dec 02 08:07:23 np0005541913.localdomain sudo[58954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:23 np0005541913.localdomain sudo[58954]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:23 np0005541913.localdomain sudo[58997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aumqntbodlkkehhekeyvuvkcygjscvfh ; /usr/bin/python3
Dec 02 08:07:23 np0005541913.localdomain sudo[58997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:23 np0005541913.localdomain sudo[58997]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:23 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec 02 08:07:23 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec 02 08:07:23 np0005541913.localdomain sudo[59027]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-issduwffroyyfluafeboevihpkuwepok ; /usr/bin/python3
Dec 02 08:07:23 np0005541913.localdomain sudo[59027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:24 np0005541913.localdomain python3[59029]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005541913 step=2 update_config_hash_only=False
Dec 02 08:07:24 np0005541913.localdomain sudo[59027]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:24 np0005541913.localdomain sudo[59043]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktlbtusixyofmpfdzaitwvcgxsiqybqc ; /usr/bin/python3
Dec 02 08:07:24 np0005541913.localdomain sudo[59043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:24 np0005541913.localdomain python3[59045]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:07:24 np0005541913.localdomain sudo[59043]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:24 np0005541913.localdomain sudo[59059]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjuxgvjszieywatqfbdckxkpcbfpkjly ; /usr/bin/python3
Dec 02 08:07:24 np0005541913.localdomain sudo[59059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:07:25 np0005541913.localdomain python3[59061]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 02 08:07:25 np0005541913.localdomain sudo[59059]: pam_unix(sudo:session): session closed for user root
Dec 02 08:07:26 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Dec 02 08:07:26 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Dec 02 08:07:29 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.1d deep-scrub starts
Dec 02 08:07:29 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.1d deep-scrub ok
Dec 02 08:07:29 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 63 pg[7.9( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=47/47 les/c/f=48/48/0 sis=63) [0,2,4] r=0 lpr=63 pi=[47,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:29 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 63 pg[7.9( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/48/0 sis=63 pruub=8.924983978s) [0,2,4] r=-1 lpr=63 pi=[47,63)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1223.529296875s@ mbc={}] start_peering_interval up [4,2,3] -> [0,2,4], acting [4,2,3] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:29 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 63 pg[7.9( v 42'39 (0'0,42'39] local-lis/les=47/48 n=1 ec=45/39 lis/c=47/47 les/c/f=48/48/0 sis=63 pruub=8.924929619s) [0,2,4] r=-1 lpr=63 pi=[47,63)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1223.529296875s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:30 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 02 08:07:31 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 64 pg[7.9( v 42'39 (0'0,42'39] local-lis/les=63/64 n=1 ec=45/39 lis/c=47/47 les/c/f=48/48/0 sis=63) [0,2,4] r=0 lpr=63 pi=[47,63)/1 crt=42'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:31 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 02 08:07:31 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 65 pg[7.a( v 42'39 (0'0,42'39] local-lis/les=49/50 n=1 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=65 pruub=8.800328255s) [2,0,4] r=-1 lpr=65 pi=[49,65)/1 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1225.584350586s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:31 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 65 pg[7.a( v 42'39 (0'0,42'39] local-lis/les=49/50 n=1 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=65 pruub=8.799696922s) [2,0,4] r=-1 lpr=65 pi=[49,65)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1225.584350586s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:32 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 65 pg[7.a( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=49/49 les/c/f=50/50/0 sis=65) [2,0,4] r=1 lpr=65 pi=[49,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:34 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 67 pg[7.b( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=51/51 les/c/f=52/52/0 sis=67 pruub=14.453248978s) [3,1,2] r=0 lpr=67 pi=[51,67)/1 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1233.576293945s@ mbc={}] start_peering_interval up [3,4,2] -> [3,1,2], acting [3,4,2] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:34 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 67 pg[7.b( v 42'39 (0'0,42'39] local-lis/les=51/52 n=1 ec=45/39 lis/c=51/51 les/c/f=52/52/0 sis=67 pruub=14.453248978s) [3,1,2] r=0 lpr=67 pi=[51,67)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown pruub 1233.576293945s@ mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:34 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec 02 08:07:34 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec 02 08:07:35 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 68 pg[7.c( v 42'39 (0'0,42'39] local-lis/les=53/54 n=1 ec=45/39 lis/c=53/53 les/c/f=54/54/0 sis=68 pruub=15.064795494s) [1,3,2] r=-1 lpr=68 pi=[53,68)/1 crt=42'39 mlcod 0'0 active pruub 1240.224365234s@ mbc={255={}}] start_peering_interval up [0,1,2] -> [1,3,2], acting [0,1,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:35 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 68 pg[7.c( v 42'39 (0'0,42'39] local-lis/les=53/54 n=1 ec=45/39 lis/c=53/53 les/c/f=54/54/0 sis=68 pruub=15.064718246s) [1,3,2] r=-1 lpr=68 pi=[53,68)/1 crt=42'39 mlcod 0'0 unknown NOTIFY pruub 1240.224365234s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:35 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 68 pg[7.b( v 42'39 (0'0,42'39] local-lis/les=67/68 n=1 ec=45/39 lis/c=51/51 les/c/f=52/52/0 sis=67) [3,1,2] r=0 lpr=67 pi=[51,67)/1 crt=42'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:35 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 02 08:07:35 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 02 08:07:37 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec 02 08:07:37 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 68 pg[7.c( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=53/53 les/c/f=54/54/0 sis=68) [1,3,2] r=1 lpr=68 pi=[53,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:37 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec 02 08:07:38 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 70 pg[7.d( v 42'39 (0'0,42'39] local-lis/les=55/56 n=1 ec=45/39 lis/c=55/55 les/c/f=56/56/0 sis=70 pruub=12.514230728s) [1,3,5] r=-1 lpr=70 pi=[55,70)/1 luod=0'0 crt=42'39 mlcod 0'0 active pruub 1240.024169922s@ mbc={}] start_peering_interval up [2,0,4] -> [1,3,5], acting [2,0,4] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:38 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 70 pg[7.d( v 42'39 (0'0,42'39] local-lis/les=55/56 n=1 ec=45/39 lis/c=55/55 les/c/f=56/56/0 sis=70 pruub=12.514141083s) [1,3,5] r=-1 lpr=70 pi=[55,70)/1 crt=42'39 mlcod 0'0 unknown NOTIFY pruub 1240.024169922s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:38 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Dec 02 08:07:38 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Dec 02 08:07:39 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 7.3 deep-scrub starts
Dec 02 08:07:39 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 7.3 deep-scrub ok
Dec 02 08:07:39 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 70 pg[7.d( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=55/55 les/c/f=56/56/0 sis=70) [1,3,5] r=1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:39 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 02 08:07:40 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 02 08:07:40 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Dec 02 08:07:40 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Dec 02 08:07:41 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.1 deep-scrub starts
Dec 02 08:07:41 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 3.1 deep-scrub ok
Dec 02 08:07:42 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec 02 08:07:42 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec 02 08:07:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:07:43 np0005541913.localdomain systemd[1]: tmp-crun.kK7BIJ.mount: Deactivated successfully.
Dec 02 08:07:43 np0005541913.localdomain podman[59062]: 2025-12-02 08:07:43.445771032 +0000 UTC m=+0.086055246 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, container_name=metrics_qdr, vcs-type=git, tcib_managed=true, release=1761123044)
Dec 02 08:07:43 np0005541913.localdomain podman[59062]: 2025-12-02 08:07:43.654478087 +0000 UTC m=+0.294762341 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1)
Dec 02 08:07:43 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:07:43 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec 02 08:07:43 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec 02 08:07:44 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 02 08:07:44 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec 02 08:07:45 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 72 pg[7.e( v 42'39 (0'0,42'39] local-lis/les=57/58 n=1 ec=45/39 lis/c=57/57 les/c/f=58/58/0 sis=72 pruub=15.045168877s) [3,5,1] r=-1 lpr=72 pi=[57,72)/1 crt=42'39 mlcod 0'0 active pruub 1250.033569336s@ mbc={255={}}] start_peering_interval up [0,4,5] -> [3,5,1], acting [0,4,5] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:45 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 72 pg[7.e( v 42'39 (0'0,42'39] local-lis/les=57/58 n=1 ec=45/39 lis/c=57/57 les/c/f=58/58/0 sis=72 pruub=15.045111656s) [3,5,1] r=-1 lpr=72 pi=[57,72)/1 crt=42'39 mlcod 0'0 unknown NOTIFY pruub 1250.033569336s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:45 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 72 pg[7.e( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=57/57 les/c/f=58/58/0 sis=72) [3,5,1] r=0 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:45 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 02 08:07:45 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 02 08:07:46 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 73 pg[7.e( v 42'39 lc 42'19 (0'0,42'39] local-lis/les=72/73 n=1 ec=45/39 lis/c=57/57 les/c/f=58/58/0 sis=72) [3,5,1] r=0 lpr=72 pi=[57,72)/1 crt=42'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:47 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 74 pg[7.f( empty local-lis/les=0/0 n=0 ec=45/39 lis/c=59/59 les/c/f=60/60/0 sis=74) [0,5,1] r=0 lpr=74 pi=[59,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 02 08:07:47 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 74 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=59/60 n=1 ec=45/39 lis/c=59/59 les/c/f=60/60/0 sis=74 pruub=13.090670586s) [0,5,1] r=-1 lpr=74 pi=[59,74)/1 luod=0'0 crt=42'39 lcod 0'0 mlcod 0'0 active pruub 1245.798461914s@ mbc={}] start_peering_interval up [1,5,3] -> [0,5,1], acting [1,5,3] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 02 08:07:47 np0005541913.localdomain ceph-osd[32582]: osd.3 pg_epoch: 74 pg[7.f( v 42'39 (0'0,42'39] local-lis/les=59/60 n=1 ec=45/39 lis/c=59/59 les/c/f=60/60/0 sis=74 pruub=13.090611458s) [0,5,1] r=-1 lpr=74 pi=[59,74)/1 crt=42'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1245.798461914s@ mbc={}] state<Start>: transitioning to Stray
Dec 02 08:07:48 np0005541913.localdomain ceph-osd[31622]: osd.0 pg_epoch: 75 pg[7.f( v 42'39 lc 42'1 (0'0,42'39] local-lis/les=74/75 n=3 ec=45/39 lis/c=59/59 les/c/f=60/60/0 sis=74) [0,5,1] r=0 lpr=74 pi=[59,74)/1 crt=42'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(2+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 02 08:07:48 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec 02 08:07:49 np0005541913.localdomain ceph-osd[31622]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec 02 08:07:51 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 02 08:07:51 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 02 08:07:53 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 02 08:07:53 np0005541913.localdomain ceph-osd[32582]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec 02 08:08:01 np0005541913.localdomain sudo[59090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:08:01 np0005541913.localdomain sudo[59090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:08:01 np0005541913.localdomain sudo[59090]: pam_unix(sudo:session): session closed for user root
Dec 02 08:08:01 np0005541913.localdomain sudo[59105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:08:01 np0005541913.localdomain sudo[59105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:08:01 np0005541913.localdomain sudo[59105]: pam_unix(sudo:session): session closed for user root
Dec 02 08:08:02 np0005541913.localdomain sudo[59152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:08:02 np0005541913.localdomain sudo[59152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:08:02 np0005541913.localdomain sudo[59152]: pam_unix(sudo:session): session closed for user root
Dec 02 08:08:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:08:14 np0005541913.localdomain podman[59167]: 2025-12-02 08:08:14.439423449 +0000 UTC m=+0.079042249 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Dec 02 08:08:14 np0005541913.localdomain podman[59167]: 2025-12-02 08:08:14.632030326 +0000 UTC m=+0.271649126 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team)
Dec 02 08:08:14 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:08:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:08:45 np0005541913.localdomain podman[59196]: 2025-12-02 08:08:45.430707299 +0000 UTC m=+0.071483709 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public)
Dec 02 08:08:45 np0005541913.localdomain podman[59196]: 2025-12-02 08:08:45.623268275 +0000 UTC m=+0.264044615 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, release=1761123044, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:08:45 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:09:02 np0005541913.localdomain anacron[18350]: Job `cron.weekly' started
Dec 02 08:09:02 np0005541913.localdomain anacron[18350]: Job `cron.weekly' terminated
Dec 02 08:09:02 np0005541913.localdomain sudo[59227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:09:02 np0005541913.localdomain sudo[59227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:09:02 np0005541913.localdomain sudo[59227]: pam_unix(sudo:session): session closed for user root
Dec 02 08:09:02 np0005541913.localdomain sudo[59242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 08:09:02 np0005541913.localdomain sudo[59242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:09:03 np0005541913.localdomain podman[59327]: 2025-12-02 08:09:03.492783997 +0000 UTC m=+0.067441813 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1763362218, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 08:09:03 np0005541913.localdomain podman[59327]: 2025-12-02 08:09:03.617326657 +0000 UTC m=+0.191984493 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7)
Dec 02 08:09:03 np0005541913.localdomain sudo[59242]: pam_unix(sudo:session): session closed for user root
Dec 02 08:09:03 np0005541913.localdomain sudo[59394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:09:03 np0005541913.localdomain sudo[59394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:09:03 np0005541913.localdomain sudo[59394]: pam_unix(sudo:session): session closed for user root
Dec 02 08:09:04 np0005541913.localdomain sudo[59409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:09:04 np0005541913.localdomain sudo[59409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:09:04 np0005541913.localdomain sudo[59409]: pam_unix(sudo:session): session closed for user root
Dec 02 08:09:05 np0005541913.localdomain sudo[59455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:09:05 np0005541913.localdomain sudo[59455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:09:05 np0005541913.localdomain sudo[59455]: pam_unix(sudo:session): session closed for user root
Dec 02 08:09:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:09:16 np0005541913.localdomain podman[59470]: 2025-12-02 08:09:16.452390553 +0000 UTC m=+0.092147295 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1)
Dec 02 08:09:16 np0005541913.localdomain podman[59470]: 2025-12-02 08:09:16.643070909 +0000 UTC m=+0.282827691 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:09:16 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:09:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:09:47 np0005541913.localdomain podman[59500]: 2025-12-02 08:09:47.503103093 +0000 UTC m=+0.143048590 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1)
Dec 02 08:09:47 np0005541913.localdomain podman[59500]: 2025-12-02 08:09:47.732160483 +0000 UTC m=+0.372106070 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Dec 02 08:09:47 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:10:05 np0005541913.localdomain sudo[59529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:10:05 np0005541913.localdomain sudo[59529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:10:05 np0005541913.localdomain sudo[59529]: pam_unix(sudo:session): session closed for user root
Dec 02 08:10:05 np0005541913.localdomain sudo[59544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:10:05 np0005541913.localdomain sudo[59544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:10:06 np0005541913.localdomain sudo[59544]: pam_unix(sudo:session): session closed for user root
Dec 02 08:10:06 np0005541913.localdomain sudo[59590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:10:06 np0005541913.localdomain sudo[59590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:10:06 np0005541913.localdomain sudo[59590]: pam_unix(sudo:session): session closed for user root
Dec 02 08:10:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:10:18 np0005541913.localdomain systemd[1]: tmp-crun.a3MtZ2.mount: Deactivated successfully.
Dec 02 08:10:18 np0005541913.localdomain podman[59605]: 2025-12-02 08:10:18.470552416 +0000 UTC m=+0.113750166 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Dec 02 08:10:18 np0005541913.localdomain podman[59605]: 2025-12-02 08:10:18.645803764 +0000 UTC m=+0.289001504 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Dec 02 08:10:18 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:10:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:10:49 np0005541913.localdomain systemd[1]: tmp-crun.afRSZs.mount: Deactivated successfully.
Dec 02 08:10:49 np0005541913.localdomain podman[59634]: 2025-12-02 08:10:49.445908332 +0000 UTC m=+0.090137135 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:10:49 np0005541913.localdomain podman[59634]: 2025-12-02 08:10:49.63887684 +0000 UTC m=+0.283105573 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:10:49 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:11:06 np0005541913.localdomain sudo[59664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:11:06 np0005541913.localdomain sudo[59664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:11:06 np0005541913.localdomain sudo[59664]: pam_unix(sudo:session): session closed for user root
Dec 02 08:11:07 np0005541913.localdomain sudo[59679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:11:07 np0005541913.localdomain sudo[59679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:11:07 np0005541913.localdomain sudo[59679]: pam_unix(sudo:session): session closed for user root
Dec 02 08:11:08 np0005541913.localdomain sudo[59725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:11:08 np0005541913.localdomain sudo[59725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:11:08 np0005541913.localdomain sudo[59725]: pam_unix(sudo:session): session closed for user root
Dec 02 08:11:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:11:20 np0005541913.localdomain podman[59740]: 2025-12-02 08:11:20.419313815 +0000 UTC m=+0.057869595 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:11:20 np0005541913.localdomain podman[59740]: 2025-12-02 08:11:20.619973395 +0000 UTC m=+0.258529195 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:11:20 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:11:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:11:51 np0005541913.localdomain podman[59769]: 2025-12-02 08:11:51.434642306 +0000 UTC m=+0.075482761 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, version=17.1.12)
Dec 02 08:11:51 np0005541913.localdomain podman[59769]: 2025-12-02 08:11:51.629001852 +0000 UTC m=+0.269842307 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:11:51 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:12:08 np0005541913.localdomain sudo[59798]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:12:08 np0005541913.localdomain sudo[59798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:12:08 np0005541913.localdomain sudo[59798]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:08 np0005541913.localdomain sudo[59813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:12:08 np0005541913.localdomain sudo[59813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:12:09 np0005541913.localdomain sudo[59813]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:10 np0005541913.localdomain sudo[59860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:12:10 np0005541913.localdomain sudo[59860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:12:10 np0005541913.localdomain sudo[59860]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:10 np0005541913.localdomain sudo[59920]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpvksfhcsnqtgsbekjtveckpkkrhlggc ; /usr/bin/python3
Dec 02 08:12:10 np0005541913.localdomain sudo[59920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:10 np0005541913.localdomain python3[59922]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:12:10 np0005541913.localdomain sudo[59920]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:11 np0005541913.localdomain sudo[59965]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjzyuseyrvpgcgxfhjsgjurjrirwksqm ; /usr/bin/python3
Dec 02 08:12:11 np0005541913.localdomain sudo[59965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:11 np0005541913.localdomain python3[59967]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663130.4798465-98260-93706956727259/source _original_basename=tmpfp834ra9 follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:11 np0005541913.localdomain sudo[59965]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:12 np0005541913.localdomain sudo[59995]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjkbkmcataspcfdvhlwsqzllbekrgxjn ; /usr/bin/python3
Dec 02 08:12:12 np0005541913.localdomain sudo[59995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:12 np0005541913.localdomain python3[59997]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:12 np0005541913.localdomain sudo[59995]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:12 np0005541913.localdomain sudo[60045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvhwjrovdhqvrdpkinvwpydafmcnyygs ; /usr/bin/python3
Dec 02 08:12:12 np0005541913.localdomain sudo[60045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:13 np0005541913.localdomain sudo[60045]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:13 np0005541913.localdomain sudo[60063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzmxsmvtukrindtabaylqhkqejvyfpej ; /usr/bin/python3
Dec 02 08:12:13 np0005541913.localdomain sudo[60063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:13 np0005541913.localdomain sudo[60063]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:14 np0005541913.localdomain sudo[60167]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpvjvrsytwuombwpoxvyeyddpwohuyng ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663133.609962-98468-74268929960886/async_wrapper.py 537411591793 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663133.609962-98468-74268929960886/AnsiballZ_command.py _
Dec 02 08:12:14 np0005541913.localdomain sudo[60167]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 08:12:14 np0005541913.localdomain ansible-async_wrapper.py[60169]: Invoked with 537411591793 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663133.609962-98468-74268929960886/AnsiballZ_command.py _
Dec 02 08:12:14 np0005541913.localdomain ansible-async_wrapper.py[60172]: Starting module and watcher
Dec 02 08:12:14 np0005541913.localdomain ansible-async_wrapper.py[60172]: Start watching 60173 (3600)
Dec 02 08:12:14 np0005541913.localdomain ansible-async_wrapper.py[60173]: Start module (60173)
Dec 02 08:12:14 np0005541913.localdomain ansible-async_wrapper.py[60169]: Return async_wrapper task started.
Dec 02 08:12:14 np0005541913.localdomain sudo[60167]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:14 np0005541913.localdomain sudo[60191]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daahqtcyyeryoytzscrwscykdwmrlaqh ; /usr/bin/python3
Dec 02 08:12:14 np0005541913.localdomain sudo[60191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:14 np0005541913.localdomain python3[60193]: ansible-ansible.legacy.async_status Invoked with jid=537411591793.60169 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:12:14 np0005541913.localdomain sudo[60191]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:17 np0005541913.localdomain puppet-user[60182]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:12:17 np0005541913.localdomain puppet-user[60182]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:12:17 np0005541913.localdomain puppet-user[60182]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:12:17 np0005541913.localdomain puppet-user[60182]:    (file & line not available)
Dec 02 08:12:17 np0005541913.localdomain puppet-user[60182]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:12:17 np0005541913.localdomain puppet-user[60182]:    (file & line not available)
Dec 02 08:12:17 np0005541913.localdomain puppet-user[60182]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.11 seconds
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Notice: Applied catalog in 0.04 seconds
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Application:
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:    Initial environment: production
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:    Converged environment: production
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:          Run mode: user
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Changes:
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Events:
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Resources:
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:             Total: 10
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Time:
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:          Schedule: 0.00
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:              File: 0.00
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:              Exec: 0.01
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:            Augeas: 0.01
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:    Transaction evaluation: 0.03
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:    Catalog application: 0.04
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:    Config retrieval: 0.15
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:          Last run: 1764663138
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:        Filebucket: 0.00
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:             Total: 0.05
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]: Version:
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:            Config: 1764663137
Dec 02 08:12:18 np0005541913.localdomain puppet-user[60182]:            Puppet: 7.10.0
Dec 02 08:12:18 np0005541913.localdomain ansible-async_wrapper.py[60173]: Module complete (60173)
Dec 02 08:12:19 np0005541913.localdomain ansible-async_wrapper.py[60172]: Done in kid B.
Dec 02 08:12:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:12:22 np0005541913.localdomain systemd[1]: tmp-crun.3ZaEcr.mount: Deactivated successfully.
Dec 02 08:12:22 np0005541913.localdomain podman[60305]: 2025-12-02 08:12:22.445312778 +0000 UTC m=+0.086831224 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, version=17.1.12, release=1761123044, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:12:22 np0005541913.localdomain podman[60305]: 2025-12-02 08:12:22.635830709 +0000 UTC m=+0.277349185 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:12:22 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:12:24 np0005541913.localdomain sudo[60348]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxsuwmtwjhpxkufivgrnqenpfxtrnibw ; /usr/bin/python3
Dec 02 08:12:24 np0005541913.localdomain sudo[60348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:24 np0005541913.localdomain python3[60350]: ansible-ansible.legacy.async_status Invoked with jid=537411591793.60169 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:12:24 np0005541913.localdomain sudo[60348]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:25 np0005541913.localdomain sudo[60364]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpljmvvivunfqcdcgnmgecjlxmzyuwui ; /usr/bin/python3
Dec 02 08:12:25 np0005541913.localdomain sudo[60364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:25 np0005541913.localdomain python3[60366]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 08:12:25 np0005541913.localdomain sudo[60364]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:25 np0005541913.localdomain sudo[60380]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjebbntzkwkkuhxcafulotijoruvtcsg ; /usr/bin/python3
Dec 02 08:12:25 np0005541913.localdomain sudo[60380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:26 np0005541913.localdomain python3[60382]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:26 np0005541913.localdomain sudo[60380]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:26 np0005541913.localdomain sudo[60430]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpahknwbpxfnvbenapivgmkrumzjpryl ; /usr/bin/python3
Dec 02 08:12:26 np0005541913.localdomain sudo[60430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:26 np0005541913.localdomain python3[60432]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:12:26 np0005541913.localdomain sudo[60430]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:26 np0005541913.localdomain sudo[60448]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibyonkhfvnbpfvgycqklsjeykzvpjpxk ; /usr/bin/python3
Dec 02 08:12:26 np0005541913.localdomain sudo[60448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:26 np0005541913.localdomain python3[60450]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpijj274xa recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 08:12:26 np0005541913.localdomain sudo[60448]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:27 np0005541913.localdomain sudo[60478]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efaotipgromgnvdewmpnzbmpextqihmb ; /usr/bin/python3
Dec 02 08:12:27 np0005541913.localdomain sudo[60478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:27 np0005541913.localdomain python3[60480]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:27 np0005541913.localdomain sudo[60478]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:27 np0005541913.localdomain sudo[60494]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shjivmnfrrodzycfvxqfvaunzmutnbtf ; /usr/bin/python3
Dec 02 08:12:27 np0005541913.localdomain sudo[60494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:27 np0005541913.localdomain sudo[60494]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:28 np0005541913.localdomain sudo[60581]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abovrigeukddfpgosnwklczubcdyuhuv ; /usr/bin/python3
Dec 02 08:12:28 np0005541913.localdomain sudo[60581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:28 np0005541913.localdomain python3[60583]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 02 08:12:28 np0005541913.localdomain sudo[60581]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:29 np0005541913.localdomain sudo[60600]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzxnjjfhintoqzknenyoqshdsruiyvpa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:29 np0005541913.localdomain sudo[60600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:29 np0005541913.localdomain python3[60602]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:29 np0005541913.localdomain sudo[60600]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:29 np0005541913.localdomain sudo[60616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqszwadteddyplcqntdwtblxtdqltkra ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:29 np0005541913.localdomain sudo[60616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:29 np0005541913.localdomain sudo[60616]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:30 np0005541913.localdomain sudo[60632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bexgxknwijqqbittopcnlnugmjpsvtgh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:30 np0005541913.localdomain sudo[60632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:30 np0005541913.localdomain python3[60634]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:30 np0005541913.localdomain sudo[60632]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:31 np0005541913.localdomain sudo[60682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxoykugucmwpkgazgqgogmbrtnzowpjt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:31 np0005541913.localdomain sudo[60682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:31 np0005541913.localdomain python3[60684]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:12:31 np0005541913.localdomain sudo[60682]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:31 np0005541913.localdomain sudo[60700]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwaswcnhfkxxbploxdalosyudpepwqvu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:31 np0005541913.localdomain sudo[60700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:31 np0005541913.localdomain python3[60702]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:31 np0005541913.localdomain sudo[60700]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:31 np0005541913.localdomain sudo[60762]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agthaipukyxuqhqytloggxmhwvegnvrz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:31 np0005541913.localdomain sudo[60762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:32 np0005541913.localdomain python3[60764]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:12:32 np0005541913.localdomain sudo[60762]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:32 np0005541913.localdomain sudo[60780]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thkanulzhqlgttzmtvejjiybciltpfry ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:32 np0005541913.localdomain sudo[60780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:32 np0005541913.localdomain python3[60782]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:32 np0005541913.localdomain sudo[60780]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:33 np0005541913.localdomain sudo[60842]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxxlikotbbpszhtzuyrxsoklwuvmraze ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:33 np0005541913.localdomain sudo[60842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:33 np0005541913.localdomain python3[60844]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:12:33 np0005541913.localdomain sudo[60842]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:33 np0005541913.localdomain sudo[60860]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwqtxlohfdixytmdooifzohehclftgla ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:33 np0005541913.localdomain sudo[60860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:33 np0005541913.localdomain python3[60862]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:33 np0005541913.localdomain sudo[60860]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:33 np0005541913.localdomain sudo[60922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ainwwyghfrdtsgrckiltrduxrsreferg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:33 np0005541913.localdomain sudo[60922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:34 np0005541913.localdomain python3[60924]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:12:34 np0005541913.localdomain sudo[60922]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:34 np0005541913.localdomain sudo[60940]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbunuvwsepuzncabngxlktglagepldth ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:34 np0005541913.localdomain sudo[60940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:34 np0005541913.localdomain python3[60942]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:34 np0005541913.localdomain sudo[60940]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:34 np0005541913.localdomain sudo[60970]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-poyxvknwxmmgqmynzwgmaadbyrjwukod ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:34 np0005541913.localdomain sudo[60970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:34 np0005541913.localdomain python3[60972]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:12:34 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:12:34 np0005541913.localdomain systemd-sysv-generator[61003]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:12:34 np0005541913.localdomain systemd-rc-local-generator[61000]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:12:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:12:35 np0005541913.localdomain sudo[60970]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:35 np0005541913.localdomain sudo[61056]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yepixtzqmiuvlbwyiqwuuhqyezsidlgy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:35 np0005541913.localdomain sudo[61056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:35 np0005541913.localdomain python3[61058]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:12:35 np0005541913.localdomain sudo[61056]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:35 np0005541913.localdomain sudo[61074]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwptlrwexznxokmnyntkgzxjgeokclpp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:35 np0005541913.localdomain sudo[61074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:36 np0005541913.localdomain python3[61076]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:36 np0005541913.localdomain sudo[61074]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:36 np0005541913.localdomain sudo[61136]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teoijehseyjzqycncaqtnyhfvgflopbb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:36 np0005541913.localdomain sudo[61136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:36 np0005541913.localdomain python3[61138]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:12:36 np0005541913.localdomain sudo[61136]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:36 np0005541913.localdomain sudo[61154]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvcrgiesjaqtdmfjvstypzgdwvnixgcd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:36 np0005541913.localdomain sudo[61154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:36 np0005541913.localdomain python3[61156]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:36 np0005541913.localdomain sudo[61154]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:37 np0005541913.localdomain sudo[61184]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whdkdzvohhasxdwiggvezgmofvulcpab ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:37 np0005541913.localdomain sudo[61184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:37 np0005541913.localdomain python3[61186]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:12:37 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:12:37 np0005541913.localdomain systemd-sysv-generator[61214]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:12:37 np0005541913.localdomain systemd-rc-local-generator[61209]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:12:37 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:12:37 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 08:12:37 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 08:12:37 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 08:12:37 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 08:12:37 np0005541913.localdomain sudo[61184]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:38 np0005541913.localdomain sudo[61241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htkvdhduhbopslwfseenvhnzalghkucn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:38 np0005541913.localdomain sudo[61241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:38 np0005541913.localdomain python3[61243]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 02 08:12:38 np0005541913.localdomain sudo[61241]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:38 np0005541913.localdomain sudo[61257]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gigdwuwpldrjjkrqmvblhjrvqnbwycub ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:38 np0005541913.localdomain sudo[61257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:38 np0005541913.localdomain sudo[61257]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:39 np0005541913.localdomain sudo[61297]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwfplyxttutzwxhwsazvwoclolphrukz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:39 np0005541913.localdomain sudo[61297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:39 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 02 08:12:40 np0005541913.localdomain podman[61461]: 2025-12-02 08:12:40.123170632 +0000 UTC m=+0.058326069 container create eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, io.buildah.version=1.41.4)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libpod-conmon-eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17.scope.
Dec 02 08:12:40 np0005541913.localdomain podman[61474]: 2025-12-02 08:12:40.159871203 +0000 UTC m=+0.085553079 container create 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:12:40 np0005541913.localdomain podman[61468]: 2025-12-02 08:12:40.176305156 +0000 UTC m=+0.104514371 container create 6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']})
Dec 02 08:12:40 np0005541913.localdomain podman[61462]: 2025-12-02 08:12:40.187938587 +0000 UTC m=+0.121420427 container create a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libpod-conmon-237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.scope.
Dec 02 08:12:40 np0005541913.localdomain podman[61462]: 2025-12-02 08:12:40.09187162 +0000 UTC m=+0.025353490 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 02 08:12:40 np0005541913.localdomain podman[61461]: 2025-12-02 08:12:40.093935466 +0000 UTC m=+0.029090913 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46603caa88f65e015c74097f596e48b006fc6fd2b23d7cf444ca3fcae1abca86/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libpod-conmon-6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56.scope.
Dec 02 08:12:40 np0005541913.localdomain podman[61461]: 2025-12-02 08:12:40.202090197 +0000 UTC m=+0.137245634 container init eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082042a751b48593af3e4b42b09156dbc115dd133d7891319f3ff1ad0b672b0b/merged/scripts supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082042a751b48593af3e4b42b09156dbc115dd133d7891319f3ff1ad0b672b0b/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain podman[61461]: 2025-12-02 08:12:40.210320204 +0000 UTC m=+0.145475651 container start eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: libpod-eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17.scope: Deactivated successfully.
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain podman[61474]: 2025-12-02 08:12:40.116436536 +0000 UTC m=+0.042118442 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 02 08:12:40 np0005541913.localdomain podman[61468]: 2025-12-02 08:12:40.116896919 +0000 UTC m=+0.045106124 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libpod-conmon-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11.scope.
Dec 02 08:12:40 np0005541913.localdomain podman[61545]: 2025-12-02 08:12:40.247982501 +0000 UTC m=+0.029861303 container died eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step3, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:12:40 np0005541913.localdomain podman[61474]: 2025-12-02 08:12:40.256927438 +0000 UTC m=+0.182609324 container init 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 02 08:12:40 np0005541913.localdomain podman[61468]: 2025-12-02 08:12:40.269230007 +0000 UTC m=+0.197439212 container init 6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, build-date=2025-11-19T00:35:22Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=nova_virtlogd_wrapper, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 02 08:12:40 np0005541913.localdomain sudo[61578]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:40 np0005541913.localdomain podman[61468]: 2025-12-02 08:12:40.278113792 +0000 UTC m=+0.206322997 container start 6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, version=17.1.12, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:12:40 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:12:40 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:12:40 np0005541913.localdomain podman[61474]: 2025-12-02 08:12:40.292253811 +0000 UTC m=+0.217935697 container start 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack 
Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, container_name=collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:12:40 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4767aaabc3de112d8791c290aa2b669d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 02 08:12:40 np0005541913.localdomain sudo[61581]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:40 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 02 08:12:40 np0005541913.localdomain podman[61462]: 2025-12-02 08:12:40.316728756 +0000 UTC m=+0.250210596 container init a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, 
managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:12:40 np0005541913.localdomain podman[61545]: 2025-12-02 08:12:40.320858729 +0000 UTC m=+0.102737511 container cleanup eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, architecture=x86_64, container_name=ceilometer_init_log, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:12:45Z, distribution-scope=public, version=17.1.12, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: libpod-conmon-eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17.scope: Deactivated successfully.
Dec 02 08:12:40 np0005541913.localdomain podman[61462]: 2025-12-02 08:12:40.332225243 +0000 UTC m=+0.265707083 container start a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1761123044, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 02 08:12:40 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1c70cec5d3310de4d4589e1a95c8fd3c --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 02 08:12:40 np0005541913.localdomain sudo[61617]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:40 np0005541913.localdomain sudo[61617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:40 np0005541913.localdomain podman[61513]: 2025-12-02 08:12:40.356689467 +0000 UTC m=+0.231721857 container create 99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libpod-conmon-99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97.scope.
Dec 02 08:12:40 np0005541913.localdomain sudo[61617]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:40 np0005541913.localdomain podman[61513]: 2025-12-02 08:12:40.316667714 +0000 UTC m=+0.191700104 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: libpod-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11.scope: Deactivated successfully.
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/422d62b3b9907c649268e279099615c7aa0520fd45eabb2e450a911bab63aaa2/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/422d62b3b9907c649268e279099615c7aa0520fd45eabb2e450a911bab63aaa2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/422d62b3b9907c649268e279099615c7aa0520fd45eabb2e450a911bab63aaa2/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain podman[61513]: 2025-12-02 08:12:40.425485713 +0000 UTC m=+0.300518083 container init 99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_statedir_owner, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
build-date=2025-11-19T00:36:58Z, version=17.1.12, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute)
Dec 02 08:12:40 np0005541913.localdomain podman[61513]: 2025-12-02 08:12:40.434549243 +0000 UTC m=+0.309581613 container start 99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:12:40 np0005541913.localdomain podman[61513]: 2025-12-02 08:12:40.434905623 +0000 UTC m=+0.309938023 container attach 99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:12:40 np0005541913.localdomain podman[61583]: 2025-12-02 08:12:40.439592842 +0000 UTC m=+0.137047718 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Dec 02 08:12:40 np0005541913.localdomain podman[61670]: 2025-12-02 08:12:40.459261504 +0000 UTC m=+0.037875345 container died a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Queued start job for default target Main User Target.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Created slice User Application Slice.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Reached target Paths.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Reached target Timers.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Starting D-Bus User Message Bus Socket...
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Starting Create User's Volatile Files and Directories...
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Finished Create User's Volatile Files and Directories.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: libpod-99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97.scope: Deactivated successfully.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Listening on D-Bus User Message Bus Socket.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Reached target Sockets.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Reached target Basic System.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Reached target Main User Target.
Dec 02 08:12:40 np0005541913.localdomain systemd[61605]: Startup finished in 123ms.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started User Manager for UID 0.
Dec 02 08:12:40 np0005541913.localdomain podman[61670]: 2025-12-02 08:12:40.48633758 +0000 UTC m=+0.064951401 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started Session c1 of User root.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started Session c2 of User root.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: libpod-conmon-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11.scope: Deactivated successfully.
Dec 02 08:12:40 np0005541913.localdomain sudo[61578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:40 np0005541913.localdomain sudo[61581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:40 np0005541913.localdomain podman[61513]: 2025-12-02 08:12:40.535277099 +0000 UTC m=+0.410309499 container died 99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_statedir_owner, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git)
Dec 02 08:12:40 np0005541913.localdomain sudo[61578]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Dec 02 08:12:40 np0005541913.localdomain podman[61708]: 2025-12-02 08:12:40.556286487 +0000 UTC m=+0.056440356 container cleanup 99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_statedir_owner, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: libpod-conmon-99492b9d730d6f61fe3a1a9619bd46efae9946c0fd0973cddb67893b4ee48d97.scope: Deactivated successfully.
Dec 02 08:12:40 np0005541913.localdomain sudo[61581]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Dec 02 08:12:40 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1764661676 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Dec 02 08:12:40 np0005541913.localdomain podman[61583]: 2025-12-02 08:12:40.62532468 +0000 UTC m=+0.322779586 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Dec 02 08:12:40 np0005541913.localdomain podman[61583]: unhealthy
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Failed with result 'exit-code'.
Dec 02 08:12:40 np0005541913.localdomain podman[61843]: 2025-12-02 08:12:40.899710262 +0000 UTC m=+0.056032355 container create 9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libpod-conmon-9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1.scope.
Dec 02 08:12:40 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93ad9083e7cc3e7616303b5d13e7a101d6cbdaa325d96e32c757f24ef781f581/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93ad9083e7cc3e7616303b5d13e7a101d6cbdaa325d96e32c757f24ef781f581/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93ad9083e7cc3e7616303b5d13e7a101d6cbdaa325d96e32c757f24ef781f581/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93ad9083e7cc3e7616303b5d13e7a101d6cbdaa325d96e32c757f24ef781f581/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:40 np0005541913.localdomain podman[61843]: 2025-12-02 08:12:40.969018993 +0000 UTC m=+0.125341176 container init 9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:12:40 np0005541913.localdomain podman[61843]: 2025-12-02 08:12:40.873313175 +0000 UTC m=+0.029635268 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:40 np0005541913.localdomain podman[61843]: 2025-12-02 08:12:40.975747858 +0000 UTC m=+0.132069941 container start 9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:35:22Z, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container)
Dec 02 08:12:40 np0005541913.localdomain podman[61865]: 2025-12-02 08:12:40.983161582 +0000 UTC m=+0.073783435 container create d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtsecretd, release=1761123044, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc.)
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f.scope.
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain podman[61865]: 2025-12-02 08:12:41.041419328 +0000 UTC m=+0.132041201 container init d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, architecture=x86_64, container_name=nova_virtsecretd)
Dec 02 08:12:41 np0005541913.localdomain podman[61865]: 2025-12-02 08:12:40.944681412 +0000 UTC m=+0.035303305 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:41 np0005541913.localdomain podman[61865]: 2025-12-02 08:12:41.049361716 +0000 UTC m=+0.139983599 container start d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, container_name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:12:41 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:41 np0005541913.localdomain sudo[61892]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:41 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started Session c3 of User root.
Dec 02 08:12:41 np0005541913.localdomain sudo[61892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-46603caa88f65e015c74097f596e48b006fc6fd2b23d7cf444ca3fcae1abca86-merged.mount: Deactivated successfully.
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eeb97483e6e3a84a709a47bb762665bd54b701b239f0abc4e5f02f2760c5dd17-userdata-shm.mount: Deactivated successfully.
Dec 02 08:12:41 np0005541913.localdomain sudo[61892]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Dec 02 08:12:41 np0005541913.localdomain podman[62014]: 2025-12-02 08:12:41.480676723 +0000 UTC m=+0.065557198 container create 21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:12:41 np0005541913.localdomain podman[62015]: 2025-12-02 08:12:41.513154558 +0000 UTC m=+0.093796105 container create c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8.scope.
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.scope.
Dec 02 08:12:41 np0005541913.localdomain podman[62014]: 2025-12-02 08:12:41.541379086 +0000 UTC m=+0.126259551 container init 21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container)
Dec 02 08:12:41 np0005541913.localdomain podman[62014]: 2025-12-02 08:12:41.548644407 +0000 UTC m=+0.133524872 container start 21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtnodedevd, architecture=x86_64)
Dec 02 08:12:41 np0005541913.localdomain podman[62014]: 2025-12-02 08:12:41.448775134 +0000 UTC m=+0.033655619 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:41 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eee6dae47ff617871c47add2aa57f33c2f7e68905855055afb3a7b04648ecacd/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eee6dae47ff617871c47add2aa57f33c2f7e68905855055afb3a7b04648ecacd/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:41 np0005541913.localdomain podman[62015]: 2025-12-02 08:12:41.465943127 +0000 UTC m=+0.046584744 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:12:41 np0005541913.localdomain podman[62015]: 2025-12-02 08:12:41.582373446 +0000 UTC m=+0.196456365 container init c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z)
Dec 02 08:12:41 np0005541913.localdomain sudo[62050]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:41 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started Session c4 of User root.
Dec 02 08:12:41 np0005541913.localdomain sudo[62065]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:12:41 np0005541913.localdomain podman[62015]: 2025-12-02 08:12:41.615814828 +0000 UTC m=+0.196456365 container start c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:12:41 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=230f4ebc92ecc6f511b0217abb58f1b6 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 02 08:12:41 np0005541913.localdomain sudo[62050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:41 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: Started Session c5 of User root.
Dec 02 08:12:41 np0005541913.localdomain sudo[62065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:41 np0005541913.localdomain podman[62069]: 2025-12-02 08:12:41.693246141 +0000 UTC m=+0.067826500 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, container_name=iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Dec 02 08:12:41 np0005541913.localdomain sudo[62065]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:41 np0005541913.localdomain sudo[62050]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Dec 02 08:12:41 np0005541913.localdomain kernel: Loading iSCSI transport class v2.0-870.
Dec 02 08:12:41 np0005541913.localdomain podman[62069]: 2025-12-02 08:12:41.784720912 +0000 UTC m=+0.159301291 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, release=1761123044, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 02 08:12:41 np0005541913.localdomain podman[62069]: unhealthy
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:12:41 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Failed with result 'exit-code'.
Dec 02 08:12:42 np0005541913.localdomain podman[62191]: 2025-12-02 08:12:42.109885703 +0000 UTC m=+0.089461566 container create 4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 
'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']})
Dec 02 08:12:42 np0005541913.localdomain systemd[1]: Started libpod-conmon-4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56.scope.
Dec 02 08:12:42 np0005541913.localdomain podman[62191]: 2025-12-02 08:12:42.064000469 +0000 UTC m=+0.043576392 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:42 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain podman[62191]: 2025-12-02 08:12:42.188945312 +0000 UTC m=+0.168521185 container init 4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Dec 02 08:12:42 np0005541913.localdomain podman[62191]: 2025-12-02 08:12:42.199152654 +0000 UTC m=+0.178728527 container start 4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:12:42 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:42 np0005541913.localdomain sudo[62210]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:42 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:12:42 np0005541913.localdomain systemd[1]: Started Session c6 of User root.
Dec 02 08:12:42 np0005541913.localdomain sudo[62210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:42 np0005541913.localdomain sudo[62210]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:42 np0005541913.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Dec 02 08:12:42 np0005541913.localdomain podman[62293]: 2025-12-02 08:12:42.612245198 +0000 UTC m=+0.064774396 container create df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:12:42 np0005541913.localdomain systemd[1]: Started libpod-conmon-df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca.scope.
Dec 02 08:12:42 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain podman[62293]: 2025-12-02 08:12:42.572925964 +0000 UTC m=+0.025455222 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:42 np0005541913.localdomain podman[62293]: 2025-12-02 08:12:42.678582906 +0000 UTC m=+0.131112094 container init df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, container_name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:12:42 np0005541913.localdomain podman[62293]: 2025-12-02 08:12:42.684073357 +0000 UTC m=+0.136602555 container start df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, container_name=nova_virtqemud, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 08:12:42 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:42 np0005541913.localdomain sudo[62313]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:42 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:12:42 np0005541913.localdomain systemd[1]: Started Session c7 of User root.
Dec 02 08:12:42 np0005541913.localdomain sudo[62313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:42 np0005541913.localdomain sudo[62313]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:42 np0005541913.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Dec 02 08:12:43 np0005541913.localdomain podman[62401]: 2025-12-02 08:12:43.05718845 +0000 UTC m=+0.087418460 container create 16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, distribution-scope=public, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, container_name=nova_virtproxyd)
Dec 02 08:12:43 np0005541913.localdomain systemd[1]: Started libpod-conmon-16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3.scope.
Dec 02 08:12:43 np0005541913.localdomain podman[62401]: 2025-12-02 08:12:43.006550054 +0000 UTC m=+0.036780094 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:43 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:12:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:12:43 np0005541913.localdomain podman[62401]: 2025-12-02 08:12:43.14465254 +0000 UTC m=+0.174882540 container init 16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=nova_virtproxyd, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, build-date=2025-11-19T00:35:22Z, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Dec 02 08:12:43 np0005541913.localdomain systemd[1]: tmp-crun.z6QSUe.mount: Deactivated successfully.
Dec 02 08:12:43 np0005541913.localdomain podman[62401]: 2025-12-02 08:12:43.157920506 +0000 UTC m=+0.188150506 container start 16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, container_name=nova_virtproxyd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Dec 02 08:12:43 np0005541913.localdomain python3[61299]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:12:43 np0005541913.localdomain sudo[62421]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:12:43 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:12:43 np0005541913.localdomain systemd[1]: Started Session c8 of User root.
Dec 02 08:12:43 np0005541913.localdomain sudo[62421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:12:43 np0005541913.localdomain sudo[62421]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:43 np0005541913.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Dec 02 08:12:43 np0005541913.localdomain sudo[61297]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:43 np0005541913.localdomain sudo[62482]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-womoemobusdwdknsjypsvzfqltbbrsnt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:43 np0005541913.localdomain sudo[62482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:43 np0005541913.localdomain python3[62484]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:43 np0005541913.localdomain sudo[62482]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:43 np0005541913.localdomain sudo[62498]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-racrjkjmfguketsjrdbhcplgrurtrrdx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:43 np0005541913.localdomain sudo[62498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:44 np0005541913.localdomain python3[62500]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:44 np0005541913.localdomain sudo[62498]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:44 np0005541913.localdomain sudo[62514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjycjxkehuwwatbahrxqsldbmqwyprlw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:44 np0005541913.localdomain sudo[62514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:44 np0005541913.localdomain python3[62516]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:44 np0005541913.localdomain sudo[62514]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:44 np0005541913.localdomain sudo[62530]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yakkavbggjurtzzrrqbsttvctraucuoe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:44 np0005541913.localdomain sudo[62530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:44 np0005541913.localdomain python3[62532]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:44 np0005541913.localdomain sudo[62530]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:44 np0005541913.localdomain sudo[62546]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mguiizlhsrcyqgoftrnayvksvmnclaiq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:44 np0005541913.localdomain sudo[62546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:44 np0005541913.localdomain python3[62548]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:44 np0005541913.localdomain sudo[62546]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:44 np0005541913.localdomain sudo[62562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drzrdcgvsinbeizaoyprdupusxvyawwj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:44 np0005541913.localdomain sudo[62562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:45 np0005541913.localdomain python3[62564]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:45 np0005541913.localdomain sudo[62562]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:45 np0005541913.localdomain sudo[62578]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umdlewssdzsltmgjkjkzqqeriussprvm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:45 np0005541913.localdomain sudo[62578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:45 np0005541913.localdomain python3[62580]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:45 np0005541913.localdomain sudo[62578]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:45 np0005541913.localdomain sudo[62594]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsusugmidzlcsmbyecdrydnbwmpdapzk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:45 np0005541913.localdomain sudo[62594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:45 np0005541913.localdomain python3[62596]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:45 np0005541913.localdomain sudo[62594]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:45 np0005541913.localdomain sudo[62610]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owouexemruqjnhykilibykftkbthuntp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:45 np0005541913.localdomain sudo[62610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:45 np0005541913.localdomain python3[62612]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:45 np0005541913.localdomain sudo[62610]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:45 np0005541913.localdomain sudo[62626]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vadrbbxcszpxdxzdxlnmtgypkdiapaso ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:45 np0005541913.localdomain sudo[62626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:46 np0005541913.localdomain python3[62628]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:46 np0005541913.localdomain sudo[62626]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:46 np0005541913.localdomain sudo[62642]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdabosnqulknrgsahvhwijmcgpkdcnps ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:46 np0005541913.localdomain sudo[62642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:46 np0005541913.localdomain python3[62644]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:46 np0005541913.localdomain sudo[62642]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:46 np0005541913.localdomain sudo[62658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfeesvohzqzrxkqltznjkuwjyxddxuih ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:46 np0005541913.localdomain sudo[62658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:46 np0005541913.localdomain python3[62660]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:46 np0005541913.localdomain sudo[62658]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:46 np0005541913.localdomain sudo[62674]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwxcdbimckwetjugmnegdxvwyoqbbcwi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:46 np0005541913.localdomain sudo[62674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:46 np0005541913.localdomain python3[62676]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:46 np0005541913.localdomain sudo[62674]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:47 np0005541913.localdomain sudo[62690]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzifstwzjawxlhiyzkuysoyudtjiuujv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:47 np0005541913.localdomain sudo[62690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:47 np0005541913.localdomain python3[62692]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:47 np0005541913.localdomain sudo[62690]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:47 np0005541913.localdomain sudo[62706]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onxvzwmabeecemwnvyvbklcqfrifmohz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:47 np0005541913.localdomain sudo[62706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:47 np0005541913.localdomain python3[62708]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:47 np0005541913.localdomain sudo[62706]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:47 np0005541913.localdomain sudo[62722]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrwluavjlzzfpohjhjnkesnxadbqtuzc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:47 np0005541913.localdomain sudo[62722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:47 np0005541913.localdomain python3[62724]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:47 np0005541913.localdomain sudo[62722]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:47 np0005541913.localdomain sudo[62738]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfltyykpmwnbvyeasxdfomweykugefgg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:47 np0005541913.localdomain sudo[62738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:47 np0005541913.localdomain python3[62740]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:47 np0005541913.localdomain sudo[62738]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:48 np0005541913.localdomain sudo[62754]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufuokjbabrafbthidgmsuksimpmfsdzt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:48 np0005541913.localdomain sudo[62754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:48 np0005541913.localdomain python3[62756]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:12:48 np0005541913.localdomain sudo[62754]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:48 np0005541913.localdomain sudo[62815]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olvithedlsldwjqgbscmexvwhbrzumpa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:48 np0005541913.localdomain sudo[62815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:48 np0005541913.localdomain python3[62817]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:48 np0005541913.localdomain sudo[62815]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:49 np0005541913.localdomain sudo[62844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vchynjqizokffzzohhnotrqtpxyxzokq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:49 np0005541913.localdomain sudo[62844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:49 np0005541913.localdomain python3[62846]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:49 np0005541913.localdomain sudo[62844]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:49 np0005541913.localdomain sudo[62873]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwdpffnsodbktpdzdtdmunqnvcmzrjpj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:49 np0005541913.localdomain sudo[62873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:49 np0005541913.localdomain python3[62875]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:49 np0005541913.localdomain sudo[62873]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:50 np0005541913.localdomain sudo[62902]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxqqfesdpsxccgfvozopbebuumpkvdwv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:50 np0005541913.localdomain sudo[62902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:50 np0005541913.localdomain python3[62904]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:50 np0005541913.localdomain sudo[62902]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:50 np0005541913.localdomain sudo[62931]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znponbndkvdebgsjqunhnnktmvvnmfrd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:50 np0005541913.localdomain sudo[62931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:50 np0005541913.localdomain python3[62933]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:50 np0005541913.localdomain sudo[62931]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:51 np0005541913.localdomain sudo[62960]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwasbmkcjwfxksgwtfxpnafgdxjclbgp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:51 np0005541913.localdomain sudo[62960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:51 np0005541913.localdomain python3[62962]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:51 np0005541913.localdomain sudo[62960]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:51 np0005541913.localdomain sudo[62989]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhttomkejflramfgeczpaiufhdfpimcw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:51 np0005541913.localdomain sudo[62989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:51 np0005541913.localdomain python3[62991]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:51 np0005541913.localdomain sudo[62989]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:52 np0005541913.localdomain sudo[63018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-soyrdlelrxpzefekfjnlhribxmmexfmc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:52 np0005541913.localdomain sudo[63018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:52 np0005541913.localdomain python3[63020]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:52 np0005541913.localdomain sudo[63018]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:52 np0005541913.localdomain sudo[63047]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msinswfnrsqmasszbdtlasjkmtyxjndu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:52 np0005541913.localdomain sudo[63047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:12:52 np0005541913.localdomain podman[63050]: 2025-12-02 08:12:52.962420628 +0000 UTC m=+0.093383874 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 02 08:12:53 np0005541913.localdomain python3[63049]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663168.2486794-99745-202156748293961/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:12:53 np0005541913.localdomain sudo[63047]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:53 np0005541913.localdomain sudo[63093]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzsoxtcdxaxjzdcotllpuzdqrqynyzdf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:53 np0005541913.localdomain sudo[63093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:53 np0005541913.localdomain podman[63050]: 2025-12-02 08:12:53.170001258 +0000 UTC m=+0.300964504 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Activating special unit Exit the Session...
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Stopped target Main User Target.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Stopped target Basic System.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Stopped target Paths.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Stopped target Sockets.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Stopped target Timers.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Closed D-Bus User Message Bus Socket.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Stopped Create User's Volatile Files and Directories.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Removed slice User Application Slice.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Reached target Shutdown.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Finished Exit the Session.
Dec 02 08:12:53 np0005541913.localdomain systemd[61605]: Reached target Exit the Session.
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 02 08:12:53 np0005541913.localdomain python3[63095]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:12:53 np0005541913.localdomain systemd-sysv-generator[63127]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:12:53 np0005541913.localdomain systemd-rc-local-generator[63122]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:12:53 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:12:53 np0005541913.localdomain sudo[63093]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:54 np0005541913.localdomain sudo[63147]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrzeiszklqjtkzrwjujefwhukjobfyux ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:54 np0005541913.localdomain sudo[63147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:54 np0005541913.localdomain python3[63149]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:12:54 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:12:54 np0005541913.localdomain systemd-sysv-generator[63178]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:12:54 np0005541913.localdomain systemd-rc-local-generator[63173]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:12:54 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:12:54 np0005541913.localdomain systemd[1]: Starting collectd container...
Dec 02 08:12:54 np0005541913.localdomain systemd[1]: Started collectd container.
Dec 02 08:12:54 np0005541913.localdomain sudo[63147]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:55 np0005541913.localdomain sudo[63214]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqpcasdxtronrbxlstkloxztsjsltusd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:55 np0005541913.localdomain sudo[63214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:55 np0005541913.localdomain python3[63216]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:12:55 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:12:55 np0005541913.localdomain systemd-rc-local-generator[63247]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:12:55 np0005541913.localdomain systemd-sysv-generator[63250]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:12:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:12:55 np0005541913.localdomain systemd[1]: Starting iscsid container...
Dec 02 08:12:55 np0005541913.localdomain systemd[1]: Started iscsid container.
Dec 02 08:12:56 np0005541913.localdomain sudo[63214]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:56 np0005541913.localdomain sudo[63281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hutvwgmlwxfzwwkzgfczcqhrmvwfermu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:56 np0005541913.localdomain sudo[63281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:56 np0005541913.localdomain python3[63283]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:12:56 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:12:56 np0005541913.localdomain systemd-rc-local-generator[63313]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:12:56 np0005541913.localdomain systemd-sysv-generator[63316]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:12:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:12:57 np0005541913.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Dec 02 08:12:57 np0005541913.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Dec 02 08:12:57 np0005541913.localdomain sudo[63281]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:57 np0005541913.localdomain sudo[63348]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdhmvbcurmyyylnoadgneynivnepmydx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:57 np0005541913.localdomain sudo[63348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:12:57 np0005541913.localdomain python3[63350]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:12:58 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:12:58 np0005541913.localdomain systemd-rc-local-generator[63376]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:12:58 np0005541913.localdomain systemd-sysv-generator[63380]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:12:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:12:59 np0005541913.localdomain systemd[1]: Starting nova_virtnodedevd container...
Dec 02 08:12:59 np0005541913.localdomain tripleo-start-podman-container[63390]: Creating additional drop-in dependency for "nova_virtnodedevd" (21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8)
Dec 02 08:12:59 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:12:59 np0005541913.localdomain systemd-sysv-generator[63447]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:12:59 np0005541913.localdomain systemd-rc-local-generator[63444]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:12:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:12:59 np0005541913.localdomain systemd[1]: Started nova_virtnodedevd container.
Dec 02 08:12:59 np0005541913.localdomain sudo[63348]: pam_unix(sudo:session): session closed for user root
Dec 02 08:12:59 np0005541913.localdomain sudo[63470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fclpxnjhvnipvwobzlcbbrkbsjpyodrm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:12:59 np0005541913.localdomain sudo[63470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:00 np0005541913.localdomain python3[63472]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:13:00 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:00 np0005541913.localdomain systemd-rc-local-generator[63503]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:00 np0005541913.localdomain systemd-sysv-generator[63506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:00 np0005541913.localdomain systemd[1]: Starting nova_virtproxyd container...
Dec 02 08:13:00 np0005541913.localdomain tripleo-start-podman-container[63513]: Creating additional drop-in dependency for "nova_virtproxyd" (16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3)
Dec 02 08:13:00 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:00 np0005541913.localdomain systemd-sysv-generator[63571]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:00 np0005541913.localdomain systemd-rc-local-generator[63566]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:01 np0005541913.localdomain systemd[1]: Started nova_virtproxyd container.
Dec 02 08:13:01 np0005541913.localdomain sudo[63470]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:02 np0005541913.localdomain sudo[63594]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzewefuskzomqmwfvbualerovpqundzf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:13:02 np0005541913.localdomain sudo[63594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:02 np0005541913.localdomain python3[63596]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:13:02 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:02 np0005541913.localdomain systemd-sysv-generator[63627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:02 np0005541913.localdomain systemd-rc-local-generator[63622]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:02 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:03 np0005541913.localdomain systemd[1]: Starting nova_virtqemud container...
Dec 02 08:13:03 np0005541913.localdomain tripleo-start-podman-container[63636]: Creating additional drop-in dependency for "nova_virtqemud" (df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca)
Dec 02 08:13:03 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:03 np0005541913.localdomain systemd-rc-local-generator[63693]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:03 np0005541913.localdomain systemd-sysv-generator[63698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:03 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:03 np0005541913.localdomain systemd[1]: Started nova_virtqemud container.
Dec 02 08:13:03 np0005541913.localdomain sudo[63594]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:03 np0005541913.localdomain sudo[63719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjjbktikzmrzqiqderoigfmedqpniywm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:13:03 np0005541913.localdomain sudo[63719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:04 np0005541913.localdomain python3[63721]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:13:04 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:04 np0005541913.localdomain systemd-rc-local-generator[63748]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:04 np0005541913.localdomain systemd-sysv-generator[63753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:04 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:04 np0005541913.localdomain systemd[1]: Starting nova_virtsecretd container...
Dec 02 08:13:04 np0005541913.localdomain tripleo-start-podman-container[63761]: Creating additional drop-in dependency for "nova_virtsecretd" (d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f)
Dec 02 08:13:04 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:04 np0005541913.localdomain systemd-sysv-generator[63822]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:04 np0005541913.localdomain systemd-rc-local-generator[63819]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:04 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:05 np0005541913.localdomain systemd[1]: Started nova_virtsecretd container.
Dec 02 08:13:05 np0005541913.localdomain sudo[63719]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:05 np0005541913.localdomain sudo[63845]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbnxlvivudbevbkuzzctvosscirejxmc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:13:05 np0005541913.localdomain sudo[63845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:05 np0005541913.localdomain python3[63847]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:13:05 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:06 np0005541913.localdomain systemd-rc-local-generator[63876]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:06 np0005541913.localdomain systemd-sysv-generator[63880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:06 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:06 np0005541913.localdomain systemd[1]: Starting nova_virtstoraged container...
Dec 02 08:13:06 np0005541913.localdomain tripleo-start-podman-container[63887]: Creating additional drop-in dependency for "nova_virtstoraged" (4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56)
Dec 02 08:13:06 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:06 np0005541913.localdomain systemd-rc-local-generator[63936]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:06 np0005541913.localdomain systemd-sysv-generator[63943]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:08 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:08 np0005541913.localdomain systemd[1]: Started nova_virtstoraged container.
Dec 02 08:13:08 np0005541913.localdomain sudo[63845]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:08 np0005541913.localdomain sudo[63968]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ullbvupvpojylymsgfxvuigpxhwldioi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:13:08 np0005541913.localdomain sudo[63968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:08 np0005541913.localdomain python3[63970]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:13:08 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:13:08 np0005541913.localdomain systemd-rc-local-generator[63994]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:13:08 np0005541913.localdomain systemd-sysv-generator[63999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: Starting rsyslog container...
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: tmp-crun.746POh.mount: Deactivated successfully.
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:13:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:09 np0005541913.localdomain podman[64010]: 2025-12-02 08:13:09.330934147 +0000 UTC m=+0.146197950 container init a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:49Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, container_name=rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 02 08:13:09 np0005541913.localdomain podman[64010]: 2025-12-02 08:13:09.341639422 +0000 UTC m=+0.156903225 container start a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-rsyslog-container, tcib_managed=true)
Dec 02 08:13:09 np0005541913.localdomain podman[64010]: rsyslog
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: Started rsyslog container.
Dec 02 08:13:09 np0005541913.localdomain sudo[64027]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:13:09 np0005541913.localdomain sudo[64027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:13:09 np0005541913.localdomain sudo[63968]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:09 np0005541913.localdomain sudo[64027]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: libpod-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11.scope: Deactivated successfully.
Dec 02 08:13:09 np0005541913.localdomain podman[64041]: 2025-12-02 08:13:09.495804481 +0000 UTC m=+0.040605851 container died a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12)
Dec 02 08:13:09 np0005541913.localdomain podman[64041]: 2025-12-02 08:13:09.519852843 +0000 UTC m=+0.064654193 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-rsyslog, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:13:09 np0005541913.localdomain podman[64057]: 2025-12-02 08:13:09.607471048 +0000 UTC m=+0.048992222 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:13:09 np0005541913.localdomain podman[64057]: rsyslog
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 02 08:13:09 np0005541913.localdomain sudo[64083]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erruaqqunzhlkmiqxogtiheihexujnfu ; /usr/bin/python3
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: Stopped rsyslog container.
Dec 02 08:13:09 np0005541913.localdomain sudo[64083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: Starting rsyslog container...
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:13:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:09 np0005541913.localdomain podman[64085]: 2025-12-02 08:13:09.922825979 +0000 UTC m=+0.112015898 container init a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git)
Dec 02 08:13:09 np0005541913.localdomain podman[64085]: 2025-12-02 08:13:09.933138163 +0000 UTC m=+0.122328092 container start a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, version=17.1.12, container_name=rsyslog, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3)
Dec 02 08:13:09 np0005541913.localdomain podman[64085]: rsyslog
Dec 02 08:13:09 np0005541913.localdomain systemd[1]: Started rsyslog container.
Dec 02 08:13:09 np0005541913.localdomain sudo[64105]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:13:09 np0005541913.localdomain sudo[64105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:13:09 np0005541913.localdomain python3[64086]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:13:09 np0005541913.localdomain sudo[64083]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:10 np0005541913.localdomain sudo[64105]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: libpod-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11.scope: Deactivated successfully.
Dec 02 08:13:10 np0005541913.localdomain podman[64109]: 2025-12-02 08:13:10.081690977 +0000 UTC m=+0.040248030 container died a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, container_name=rsyslog, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:13:10 np0005541913.localdomain podman[64109]: 2025-12-02 08:13:10.102913062 +0000 UTC m=+0.061470105 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-rsyslog-container, version=17.1.12, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:13:10 np0005541913.localdomain podman[64123]: 2025-12-02 08:13:10.16963118 +0000 UTC m=+0.041045681 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, container_name=rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Dec 02 08:13:10 np0005541913.localdomain podman[64123]: rsyslog
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 02 08:13:10 np0005541913.localdomain sudo[64136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:13:10 np0005541913.localdomain sudo[64136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:13:10 np0005541913.localdomain sudo[64136]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336-merged.mount: Deactivated successfully.
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11-userdata-shm.mount: Deactivated successfully.
Dec 02 08:13:10 np0005541913.localdomain sudo[64151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: Stopped rsyslog container.
Dec 02 08:13:10 np0005541913.localdomain sudo[64151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: Starting rsyslog container...
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:13:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:10 np0005541913.localdomain podman[64166]: 2025-12-02 08:13:10.465637688 +0000 UTC m=+0.122496197 container init a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, distribution-scope=public)
Dec 02 08:13:10 np0005541913.localdomain podman[64166]: 2025-12-02 08:13:10.47516724 +0000 UTC m=+0.132025749 container start a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 02 08:13:10 np0005541913.localdomain podman[64166]: rsyslog
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: Started rsyslog container.
Dec 02 08:13:10 np0005541913.localdomain sudo[64199]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:13:10 np0005541913.localdomain sudo[64199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:13:10 np0005541913.localdomain sudo[64199]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: libpod-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11.scope: Deactivated successfully.
Dec 02 08:13:10 np0005541913.localdomain sudo[64234]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-getqbhamufeiwhcxbqzyffkxszfzwpco ; /usr/bin/python3
Dec 02 08:13:10 np0005541913.localdomain sudo[64234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:10 np0005541913.localdomain podman[64231]: 2025-12-02 08:13:10.639300424 +0000 UTC m=+0.060616592 container died a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-rsyslog, container_name=rsyslog, 
version=17.1.12, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 02 08:13:10 np0005541913.localdomain podman[64231]: 2025-12-02 08:13:10.660996732 +0000 UTC m=+0.082312870 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, config_id=tripleo_step3, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044)
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:13:10 np0005541913.localdomain podman[64260]: 2025-12-02 08:13:10.773742759 +0000 UTC m=+0.081355323 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 02 08:13:10 np0005541913.localdomain sudo[64234]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:10 np0005541913.localdomain podman[64259]: 2025-12-02 08:13:10.809409112 +0000 UTC m=+0.121180720 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, release=1761123044)
Dec 02 08:13:10 np0005541913.localdomain podman[64259]: rsyslog
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 02 08:13:10 np0005541913.localdomain podman[64260]: 2025-12-02 08:13:10.858399922 +0000 UTC m=+0.166012476 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:13:10 np0005541913.localdomain sudo[64151]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:10 np0005541913.localdomain sudo[64346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cggqyfoatypxmrecdnaevoykmjjowrke ; /usr/bin/python3
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: Stopped rsyslog container.
Dec 02 08:13:10 np0005541913.localdomain sudo[64346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:10 np0005541913.localdomain systemd[1]: Starting rsyslog container...
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:13:11 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:11 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:11 np0005541913.localdomain podman[64349]: 2025-12-02 08:13:11.116843755 +0000 UTC m=+0.108691116 container init a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', 
'/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, tcib_managed=true, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64)
Dec 02 08:13:11 np0005541913.localdomain podman[64349]: 2025-12-02 08:13:11.122892942 +0000 UTC m=+0.114740303 container start a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, 
name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, release=1761123044, config_id=tripleo_step3)
Dec 02 08:13:11 np0005541913.localdomain podman[64349]: rsyslog
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: Started rsyslog container.
Dec 02 08:13:11 np0005541913.localdomain sudo[64368]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:13:11 np0005541913.localdomain sudo[64368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:13:11 np0005541913.localdomain sudo[64346]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:11 np0005541913.localdomain sudo[64368]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: libpod-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11.scope: Deactivated successfully.
Dec 02 08:13:11 np0005541913.localdomain podman[64385]: 2025-12-02 08:13:11.31295717 +0000 UTC m=+0.053917938 container died a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11-userdata-shm.mount: Deactivated successfully.
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336-merged.mount: Deactivated successfully.
Dec 02 08:13:11 np0005541913.localdomain podman[64385]: 2025-12-02 08:13:11.344415377 +0000 UTC m=+0.085376115 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, container_name=rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:13:11 np0005541913.localdomain sudo[64417]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyhjxrpoltgzzvhjtispxeaxvvzchvyu ; /usr/bin/python3
Dec 02 08:13:11 np0005541913.localdomain sudo[64417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:11 np0005541913.localdomain podman[64399]: 2025-12-02 08:13:11.433179032 +0000 UTC m=+0.060230280 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_id=tripleo_step3, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 02 08:13:11 np0005541913.localdomain podman[64399]: rsyslog
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: Stopped rsyslog container.
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: Starting rsyslog container...
Dec 02 08:13:11 np0005541913.localdomain python3[64424]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005541913 step=3 update_config_hash_only=False
Dec 02 08:13:11 np0005541913.localdomain sudo[64417]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:11 np0005541913.localdomain sudo[64425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:13:11 np0005541913.localdomain sudo[64425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:13:11 np0005541913.localdomain sudo[64425]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:13:11 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:11 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 02 08:13:11 np0005541913.localdomain podman[64429]: 2025-12-02 08:13:11.756865933 +0000 UTC m=+0.138787166 container init a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, 
version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 02 08:13:11 np0005541913.localdomain podman[64429]: 2025-12-02 08:13:11.767285631 +0000 UTC m=+0.149206844 container start a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 02 08:13:11 np0005541913.localdomain podman[64429]: rsyslog
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: Started rsyslog container.
Dec 02 08:13:11 np0005541913.localdomain sudo[64459]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:13:11 np0005541913.localdomain sudo[64459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:13:11 np0005541913.localdomain sudo[64459]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: libpod-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11.scope: Deactivated successfully.
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:13:11 np0005541913.localdomain podman[64463]: 2025-12-02 08:13:11.952269739 +0000 UTC m=+0.075406410 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3)
Dec 02 08:13:11 np0005541913.localdomain podman[64463]: 2025-12-02 08:13:11.967058406 +0000 UTC m=+0.090195097 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 02 08:13:11 np0005541913.localdomain sudo[64502]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozecekqrkuruqiwgzxhvcxxavsekomlo ; /usr/bin/python3
Dec 02 08:13:11 np0005541913.localdomain sudo[64502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:11 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:13:12 np0005541913.localdomain podman[64462]: 2025-12-02 08:13:12.030766692 +0000 UTC m=+0.156271448 container died a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:49Z, url=https://www.redhat.com, config_id=tripleo_step3, container_name=rsyslog, version=17.1.12, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:13:12 np0005541913.localdomain podman[64462]: 2025-12-02 08:13:12.057142718 +0000 UTC m=+0.182647404 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, release=1761123044, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, version=17.1.12, managed_by=tripleo_ansible)
Dec 02 08:13:12 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:13:12 np0005541913.localdomain podman[64510]: 2025-12-02 08:13:12.134897731 +0000 UTC m=+0.048983541 container cleanup a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1c70cec5d3310de4d4589e1a95c8fd3c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, vcs-type=git)
Dec 02 08:13:12 np0005541913.localdomain podman[64510]: rsyslog
Dec 02 08:13:12 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 02 08:13:12 np0005541913.localdomain python3[64508]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:13:12 np0005541913.localdomain sudo[64502]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a68110bf93a17cc09560fbaa73e0aa62bc7d4bb5fb9957036f7904918efbde11-userdata-shm.mount: Deactivated successfully.
Dec 02 08:13:12 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Dec 02 08:13:12 np0005541913.localdomain systemd[1]: Stopped rsyslog container.
Dec 02 08:13:12 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Dec 02 08:13:12 np0005541913.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 02 08:13:12 np0005541913.localdomain systemd[1]: Failed to start rsyslog container.
Dec 02 08:13:12 np0005541913.localdomain sudo[64536]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpmvrbolhncthnfziccsbefzzoouqcjb ; /usr/bin/python3
Dec 02 08:13:12 np0005541913.localdomain sudo[64536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:13:12 np0005541913.localdomain python3[64538]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 02 08:13:12 np0005541913.localdomain sudo[64536]: pam_unix(sudo:session): session closed for user root
Dec 02 08:13:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:13:23 np0005541913.localdomain systemd[1]: tmp-crun.kqf6hy.mount: Deactivated successfully.
Dec 02 08:13:23 np0005541913.localdomain podman[64539]: 2025-12-02 08:13:23.463309581 +0000 UTC m=+0.095085012 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., 
name=rhosp17/openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:13:23 np0005541913.localdomain podman[64539]: 2025-12-02 08:13:23.662398557 +0000 UTC m=+0.294173948 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd)
Dec 02 08:13:23 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:13:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:13:41 np0005541913.localdomain podman[64569]: 2025-12-02 08:13:41.457029628 +0000 UTC m=+0.097655902 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd)
Dec 02 08:13:41 np0005541913.localdomain podman[64569]: 2025-12-02 08:13:41.467489177 +0000 UTC m=+0.108115401 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Dec 02 08:13:41 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:13:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:13:42 np0005541913.localdomain podman[64590]: 2025-12-02 08:13:42.44223618 +0000 UTC m=+0.085700573 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, container_name=iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:13:42 np0005541913.localdomain podman[64590]: 2025-12-02 08:13:42.478397867 +0000 UTC m=+0.121862230 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, container_name=iscsid, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:13:42 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:13:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:13:54 np0005541913.localdomain podman[64610]: 2025-12-02 08:13:54.485301615 +0000 UTC m=+0.098364693 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:13:54 np0005541913.localdomain podman[64610]: 2025-12-02 08:13:54.708091274 +0000 UTC m=+0.321154332 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, 
com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:13:54 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:14:11 np0005541913.localdomain sudo[64640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:14:11 np0005541913.localdomain sudo[64640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:14:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:14:11 np0005541913.localdomain sudo[64640]: pam_unix(sudo:session): session closed for user root
Dec 02 08:14:11 np0005541913.localdomain sudo[64661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:14:11 np0005541913.localdomain sudo[64661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:14:12 np0005541913.localdomain systemd[1]: tmp-crun.f724w8.mount: Deactivated successfully.
Dec 02 08:14:12 np0005541913.localdomain podman[64655]: 2025-12-02 08:14:12.032170658 +0000 UTC m=+0.110135896 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, architecture=x86_64, container_name=collectd, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:14:12 np0005541913.localdomain podman[64655]: 2025-12-02 08:14:12.036627961 +0000 UTC m=+0.114593199 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=collectd, config_id=tripleo_step3, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Dec 02 08:14:12 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:14:12 np0005541913.localdomain sudo[64661]: pam_unix(sudo:session): session closed for user root
Dec 02 08:14:13 np0005541913.localdomain sudo[64724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:14:13 np0005541913.localdomain sudo[64724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:14:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:14:13 np0005541913.localdomain sudo[64724]: pam_unix(sudo:session): session closed for user root
Dec 02 08:14:13 np0005541913.localdomain systemd[1]: tmp-crun.R2t2iE.mount: Deactivated successfully.
Dec 02 08:14:13 np0005541913.localdomain podman[64739]: 2025-12-02 08:14:13.268665695 +0000 UTC m=+0.073392274 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, architecture=x86_64, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 02 08:14:13 np0005541913.localdomain podman[64739]: 2025-12-02 08:14:13.284160172 +0000 UTC m=+0.088886761 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:14:13 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:14:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:14:25 np0005541913.localdomain podman[64758]: 2025-12-02 08:14:25.444963531 +0000 UTC m=+0.089411226 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Dec 02 08:14:25 np0005541913.localdomain podman[64758]: 2025-12-02 08:14:25.627008428 +0000 UTC m=+0.271456143 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=)
Dec 02 08:14:25 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:14:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:14:42 np0005541913.localdomain systemd[1]: tmp-crun.tyBjix.mount: Deactivated successfully.
Dec 02 08:14:42 np0005541913.localdomain podman[64787]: 2025-12-02 08:14:42.443865733 +0000 UTC m=+0.088809812 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64)
Dec 02 08:14:42 np0005541913.localdomain podman[64787]: 2025-12-02 08:14:42.479579563 +0000 UTC m=+0.124523652 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-collectd, version=17.1.12, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Dec 02 08:14:42 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:14:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:14:43 np0005541913.localdomain podman[64808]: 2025-12-02 08:14:43.443047819 +0000 UTC m=+0.088058201 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, distribution-scope=public, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:14:43 np0005541913.localdomain podman[64808]: 2025-12-02 08:14:43.479972832 +0000 UTC m=+0.124983114 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., container_name=iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4)
Dec 02 08:14:43 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:14:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:14:56 np0005541913.localdomain podman[64827]: 2025-12-02 08:14:56.419679306 +0000 UTC m=+0.064918930 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 02 08:14:56 np0005541913.localdomain podman[64827]: 2025-12-02 08:14:56.58008218 +0000 UTC m=+0.225321874 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:14:56 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:15:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:15:13 np0005541913.localdomain sudo[64854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:15:13 np0005541913.localdomain sudo[64854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:15:13 np0005541913.localdomain sudo[64854]: pam_unix(sudo:session): session closed for user root
Dec 02 08:15:13 np0005541913.localdomain systemd[1]: tmp-crun.zCr1XO.mount: Deactivated successfully.
Dec 02 08:15:13 np0005541913.localdomain sudo[64879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:15:13 np0005541913.localdomain sudo[64879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:15:13 np0005541913.localdomain podman[64868]: 2025-12-02 08:15:13.471689534 +0000 UTC m=+0.097541294 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:15:13 np0005541913.localdomain podman[64868]: 2025-12-02 08:15:13.484013845 +0000 UTC m=+0.109865565 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git)
Dec 02 08:15:13 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:15:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:15:13 np0005541913.localdomain systemd[1]: tmp-crun.9FjElu.mount: Deactivated successfully.
Dec 02 08:15:13 np0005541913.localdomain podman[64904]: 2025-12-02 08:15:13.621528246 +0000 UTC m=+0.097478672 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, architecture=x86_64, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:15:13 np0005541913.localdomain podman[64904]: 2025-12-02 08:15:13.65594064 +0000 UTC m=+0.131891056 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container)
Dec 02 08:15:13 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:15:14 np0005541913.localdomain sudo[64879]: pam_unix(sudo:session): session closed for user root
Dec 02 08:15:18 np0005541913.localdomain sudo[64951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:15:18 np0005541913.localdomain sudo[64951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:15:18 np0005541913.localdomain sudo[64951]: pam_unix(sudo:session): session closed for user root
Dec 02 08:15:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:15:27 np0005541913.localdomain systemd[1]: tmp-crun.uel9My.mount: Deactivated successfully.
Dec 02 08:15:27 np0005541913.localdomain podman[64966]: 2025-12-02 08:15:27.43290632 +0000 UTC m=+0.074497505 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, release=1761123044, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:15:27 np0005541913.localdomain podman[64966]: 2025-12-02 08:15:27.668069596 +0000 UTC m=+0.309660791 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:15:27 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:15:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:15:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:15:44 np0005541913.localdomain systemd[1]: tmp-crun.HyTlj4.mount: Deactivated successfully.
Dec 02 08:15:44 np0005541913.localdomain podman[64997]: 2025-12-02 08:15:44.418269882 +0000 UTC m=+0.063818580 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, distribution-scope=public)
Dec 02 08:15:44 np0005541913.localdomain podman[64997]: 2025-12-02 08:15:44.453965251 +0000 UTC m=+0.099513939 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public)
Dec 02 08:15:44 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:15:44 np0005541913.localdomain podman[64996]: 2025-12-02 08:15:44.487581182 +0000 UTC m=+0.130493727 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.openshift.expose-services=, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, release=1761123044, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:15:44 np0005541913.localdomain podman[64996]: 2025-12-02 08:15:44.497438945 +0000 UTC m=+0.140351440 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:15:44 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:15:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:15:58 np0005541913.localdomain podman[65035]: 2025-12-02 08:15:58.437771033 +0000 UTC m=+0.081953761 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:15:58 np0005541913.localdomain podman[65035]: 2025-12-02 08:15:58.631008338 +0000 UTC m=+0.275191096 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true)
Dec 02 08:15:58 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:16:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:16:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:16:15 np0005541913.localdomain systemd[1]: tmp-crun.YVI28A.mount: Deactivated successfully.
Dec 02 08:16:15 np0005541913.localdomain podman[65065]: 2025-12-02 08:16:15.486219295 +0000 UTC m=+0.129989743 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:16:15 np0005541913.localdomain podman[65065]: 2025-12-02 08:16:15.498024402 +0000 UTC m=+0.141794830 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:16:15 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:16:15 np0005541913.localdomain podman[65066]: 2025-12-02 08:16:15.488522469 +0000 UTC m=+0.128415990 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, 
Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:16:15 np0005541913.localdomain podman[65066]: 2025-12-02 08:16:15.57302525 +0000 UTC m=+0.212918751 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 02 08:16:15 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:16:18 np0005541913.localdomain sudo[65103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:16:18 np0005541913.localdomain sudo[65103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:16:18 np0005541913.localdomain sudo[65103]: pam_unix(sudo:session): session closed for user root
Dec 02 08:16:18 np0005541913.localdomain sudo[65118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:16:18 np0005541913.localdomain sudo[65118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:16:18 np0005541913.localdomain sudo[65118]: pam_unix(sudo:session): session closed for user root
Dec 02 08:16:18 np0005541913.localdomain sudo[65165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:16:18 np0005541913.localdomain sudo[65165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:16:18 np0005541913.localdomain sudo[65165]: pam_unix(sudo:session): session closed for user root
Dec 02 08:16:19 np0005541913.localdomain sudo[65180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 08:16:19 np0005541913.localdomain sudo[65180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:16:19 np0005541913.localdomain sudo[65180]: pam_unix(sudo:session): session closed for user root
Dec 02 08:16:25 np0005541913.localdomain sudo[65215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:16:25 np0005541913.localdomain sudo[65215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:16:25 np0005541913.localdomain sudo[65215]: pam_unix(sudo:session): session closed for user root
Dec 02 08:16:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:16:29 np0005541913.localdomain podman[65230]: 2025-12-02 08:16:29.417344107 +0000 UTC m=+0.061795703 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 02 08:16:29 np0005541913.localdomain podman[65230]: 2025-12-02 08:16:29.630735009 +0000 UTC m=+0.275186595 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T22:49:46Z)
Dec 02 08:16:29 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:16:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:16:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:16:46 np0005541913.localdomain podman[65259]: 2025-12-02 08:16:46.435383705 +0000 UTC m=+0.075471542 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, name=rhosp17/openstack-collectd, distribution-scope=public, batch=17.1_20251118.1)
Dec 02 08:16:46 np0005541913.localdomain podman[65259]: 2025-12-02 08:16:46.448061656 +0000 UTC m=+0.088149523 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, container_name=collectd, com.redhat.component=openstack-collectd-container)
Dec 02 08:16:46 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:16:46 np0005541913.localdomain podman[65260]: 2025-12-02 08:16:46.497130356 +0000 UTC m=+0.132793950 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Dec 02 08:16:46 np0005541913.localdomain podman[65260]: 2025-12-02 08:16:46.535288744 +0000 UTC m=+0.170952358 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4)
Dec 02 08:16:46 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:17:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:17:00 np0005541913.localdomain podman[65297]: 2025-12-02 08:17:00.448927441 +0000 UTC m=+0.087634919 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 02 08:17:00 np0005541913.localdomain podman[65297]: 2025-12-02 08:17:00.659909228 +0000 UTC m=+0.298616676 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, tcib_managed=true, version=17.1.12, config_id=tripleo_step1, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=)
Dec 02 08:17:00 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:17:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:17:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4435 writes, 20K keys, 4435 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4435 writes, 447 syncs, 9.92 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 211 writes, 571 keys, 211 commit groups, 1.0 writes per commit group, ingest: 0.53 MB, 0.00 MB/s
                                                          Interval WAL: 211 writes, 103 syncs, 2.05 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:17:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:17:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.2 total, 600.0 interval
                                                          Cumulative writes: 5176 writes, 22K keys, 5176 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5176 writes, 608 syncs, 8.51 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 184 writes, 477 keys, 184 commit groups, 1.0 writes per commit group, ingest: 0.44 MB, 0.00 MB/s
                                                          Interval WAL: 184 writes, 91 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:17:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:17:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:17:17 np0005541913.localdomain systemd[1]: tmp-crun.Oguc3s.mount: Deactivated successfully.
Dec 02 08:17:17 np0005541913.localdomain podman[65326]: 2025-12-02 08:17:17.446217204 +0000 UTC m=+0.090354854 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:17:17 np0005541913.localdomain podman[65326]: 2025-12-02 08:17:17.457044854 +0000 UTC m=+0.101182554 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 02 08:17:17 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:17:17 np0005541913.localdomain podman[65327]: 2025-12-02 08:17:17.553731903 +0000 UTC m=+0.190889040 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Dec 02 08:17:17 np0005541913.localdomain podman[65327]: 2025-12-02 08:17:17.587898319 +0000 UTC m=+0.225055456 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 02 08:17:17 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:17:17 np0005541913.localdomain sudo[65409]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfgrqvpccpeksokjwwuadkkhpxodoyaf ; /usr/bin/python3
Dec 02 08:17:17 np0005541913.localdomain sudo[65409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:18 np0005541913.localdomain python3[65411]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:18 np0005541913.localdomain sudo[65409]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:18 np0005541913.localdomain sudo[65454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkdamdczigqjhpbhvaymvjdamieyxsch ; /usr/bin/python3
Dec 02 08:17:18 np0005541913.localdomain sudo[65454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:18 np0005541913.localdomain python3[65456]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663437.7772546-107020-12998145733000/source _original_basename=tmpmxtlsuev follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:18 np0005541913.localdomain sudo[65454]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:19 np0005541913.localdomain sudo[65516]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpzfvpqzgrtsvobsxcpdokzxxnbhjllz ; /usr/bin/python3
Dec 02 08:17:19 np0005541913.localdomain sudo[65516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:19 np0005541913.localdomain python3[65518]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:19 np0005541913.localdomain sudo[65516]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:20 np0005541913.localdomain sudo[65559]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueszylyqucugsxrhsxqaaipresabtolx ; /usr/bin/python3
Dec 02 08:17:20 np0005541913.localdomain sudo[65559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:20 np0005541913.localdomain python3[65561]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663439.6405184-107118-211606194337718/source _original_basename=tmpe9b8mm4p follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:20 np0005541913.localdomain sudo[65559]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:20 np0005541913.localdomain sudo[65621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfruyscwsdfyypwdkrowjbulypztkbto ; /usr/bin/python3
Dec 02 08:17:20 np0005541913.localdomain sudo[65621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:20 np0005541913.localdomain python3[65623]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:20 np0005541913.localdomain sudo[65621]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:21 np0005541913.localdomain sudo[65664]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gixabieoeocxkrpuiraqbvauyylgeyan ; /usr/bin/python3
Dec 02 08:17:21 np0005541913.localdomain sudo[65664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:21 np0005541913.localdomain python3[65666]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663440.6335638-107174-235607584383599/source _original_basename=tmplxiaaue1 follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:21 np0005541913.localdomain sudo[65664]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:21 np0005541913.localdomain sudo[65726]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okyfxfmwluirthtahmskhhhcojtfrsmp ; /usr/bin/python3
Dec 02 08:17:21 np0005541913.localdomain sudo[65726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:22 np0005541913.localdomain python3[65728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:22 np0005541913.localdomain sudo[65726]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:22 np0005541913.localdomain sudo[65769]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpkwgbkohigohgczfysvmwcrenkhspes ; /usr/bin/python3
Dec 02 08:17:22 np0005541913.localdomain sudo[65769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:22 np0005541913.localdomain python3[65771]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663441.6692474-107229-229965608257019/source _original_basename=tmp15vm1sr1 follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:22 np0005541913.localdomain sudo[65769]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:22 np0005541913.localdomain sudo[65799]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxfgwitmmqblbefpipnejxbzvdpgqyhk ; /usr/bin/python3
Dec 02 08:17:22 np0005541913.localdomain sudo[65799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:23 np0005541913.localdomain python3[65801]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 08:17:23 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:23 np0005541913.localdomain systemd-sysv-generator[65829]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:23 np0005541913.localdomain systemd-rc-local-generator[65822]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:23 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:23 np0005541913.localdomain systemd-rc-local-generator[65866]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:23 np0005541913.localdomain systemd-sysv-generator[65871]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:23 np0005541913.localdomain sudo[65799]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:23 np0005541913.localdomain sudo[65889]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsdfgfkgkdohvzrxgoffvcfjxkblkezj ; /usr/bin/python3
Dec 02 08:17:23 np0005541913.localdomain sudo[65889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:24 np0005541913.localdomain python3[65891]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:17:24 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:24 np0005541913.localdomain systemd-rc-local-generator[65915]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:24 np0005541913.localdomain systemd-sysv-generator[65918]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:24 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:24 np0005541913.localdomain systemd-sysv-generator[65961]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:24 np0005541913.localdomain systemd-rc-local-generator[65958]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:24 np0005541913.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Dec 02 08:17:24 np0005541913.localdomain sudo[65889]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:25 np0005541913.localdomain sudo[65980]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jytiuienfrqvfhupfkyrdaujhpairgoi ; /usr/bin/python3
Dec 02 08:17:25 np0005541913.localdomain sudo[65980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:25 np0005541913.localdomain sudo[65983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:17:25 np0005541913.localdomain python3[65982]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 08:17:25 np0005541913.localdomain sudo[65983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:17:25 np0005541913.localdomain sudo[65983]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:25 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:25 np0005541913.localdomain sudo[65999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 08:17:25 np0005541913.localdomain systemd-rc-local-generator[66033]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:25 np0005541913.localdomain systemd-sysv-generator[66037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:25 np0005541913.localdomain systemd[1]: Starting dnf makecache...
Dec 02 08:17:25 np0005541913.localdomain sudo[65999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:17:25 np0005541913.localdomain sudo[65980]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:25 np0005541913.localdomain dnf[66047]: Updating Subscription Management repositories.
Dec 02 08:17:25 np0005541913.localdomain sudo[66094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulneycokgjzpjxijksznimfuypwyqokr ; /usr/bin/python3
Dec 02 08:17:25 np0005541913.localdomain sudo[66094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:25 np0005541913.localdomain sudo[65999]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:25 np0005541913.localdomain python3[66103]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:25 np0005541913.localdomain sudo[66094]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:26 np0005541913.localdomain sudo[66158]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmyfqpwpxmwnwknkmipbkrqfnitrpjel ; /usr/bin/python3
Dec 02 08:17:26 np0005541913.localdomain sudo[66158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:26 np0005541913.localdomain sudo[66161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:17:26 np0005541913.localdomain sudo[66161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:17:26 np0005541913.localdomain sudo[66161]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:26 np0005541913.localdomain python3[66160]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663445.6787555-107411-109576048817741/source _original_basename=tmp36v5jdot follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:26 np0005541913.localdomain sudo[66158]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:26 np0005541913.localdomain sudo[66176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:17:26 np0005541913.localdomain sudo[66176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:17:26 np0005541913.localdomain sudo[66218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrgqkwbhvtmcrshpoiwnxnwfttdhfeld ; /usr/bin/python3
Dec 02 08:17:26 np0005541913.localdomain sudo[66218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:26 np0005541913.localdomain python3[66220]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:17:26 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:26 np0005541913.localdomain sudo[66176]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:27 np0005541913.localdomain systemd-sysv-generator[66277]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:27 np0005541913.localdomain systemd-rc-local-generator[66274]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:27 np0005541913.localdomain sudo[66288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:17:27 np0005541913.localdomain sudo[66288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:17:27 np0005541913.localdomain sudo[66288]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:27 np0005541913.localdomain sudo[66304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- inventory --format=json-pretty --filter-for-batch
Dec 02 08:17:27 np0005541913.localdomain sudo[66304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:17:27 np0005541913.localdomain dnf[66047]: Metadata cache refreshed recently.
Dec 02 08:17:27 np0005541913.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 02 08:17:27 np0005541913.localdomain systemd[1]: Finished dnf makecache.
Dec 02 08:17:27 np0005541913.localdomain systemd[1]: dnf-makecache.service: Consumed 2.101s CPU time.
Dec 02 08:17:27 np0005541913.localdomain podman[66362]: 
Dec 02 08:17:27 np0005541913.localdomain podman[66362]: 2025-12-02 08:17:27.823851725 +0000 UTC m=+0.059205021 container create bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_heyrovsky, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 08:17:27 np0005541913.localdomain systemd[1]: Started libpod-conmon-bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0.scope.
Dec 02 08:17:27 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:17:27 np0005541913.localdomain podman[66362]: 2025-12-02 08:17:27.870895489 +0000 UTC m=+0.106248805 container init bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_heyrovsky, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1763362218, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 08:17:27 np0005541913.localdomain podman[66362]: 2025-12-02 08:17:27.881921304 +0000 UTC m=+0.117274580 container start bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_heyrovsky, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, version=7, RELEASE=main, release=1763362218, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 02 08:17:27 np0005541913.localdomain podman[66362]: 2025-12-02 08:17:27.882099329 +0000 UTC m=+0.117452595 container attach bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_heyrovsky, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, io.buildah.version=1.41.4)
Dec 02 08:17:27 np0005541913.localdomain great_heyrovsky[66377]: 167 167
Dec 02 08:17:27 np0005541913.localdomain systemd[1]: libpod-bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0.scope: Deactivated successfully.
Dec 02 08:17:27 np0005541913.localdomain podman[66362]: 2025-12-02 08:17:27.887785707 +0000 UTC m=+0.123139053 container died bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_heyrovsky, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1763362218, CEPH_POINT_RELEASE=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 08:17:27 np0005541913.localdomain podman[66362]: 2025-12-02 08:17:27.806832703 +0000 UTC m=+0.042185989 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 08:17:27 np0005541913.localdomain podman[66382]: 2025-12-02 08:17:27.958145346 +0000 UTC m=+0.063950413 container remove bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_heyrovsky, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, release=1763362218, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Dec 02 08:17:27 np0005541913.localdomain systemd[1]: libpod-conmon-bfba0677b1a42588249889bb281e5e624ad8b5d18a9ce3d666bb5bfe6a5b75f0.scope: Deactivated successfully.
Dec 02 08:17:28 np0005541913.localdomain podman[66402]: 
Dec 02 08:17:28 np0005541913.localdomain podman[66402]: 2025-12-02 08:17:28.127649713 +0000 UTC m=+0.064316663 container create 615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_taussig, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, RELEASE=main, ceph=True, architecture=x86_64, GIT_BRANCH=main, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=)
Dec 02 08:17:28 np0005541913.localdomain systemd[1]: Started libpod-conmon-615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c.scope.
Dec 02 08:17:28 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:17:28 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c0c7febe39725fab0d61a6143d3683d801789cb15e98b14674d3e7becd398c4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 08:17:28 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c0c7febe39725fab0d61a6143d3683d801789cb15e98b14674d3e7becd398c4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:17:28 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c0c7febe39725fab0d61a6143d3683d801789cb15e98b14674d3e7becd398c4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 08:17:28 np0005541913.localdomain podman[66402]: 2025-12-02 08:17:28.19609991 +0000 UTC m=+0.132766860 container init 615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_taussig, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_BRANCH=main, ceph=True, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 08:17:28 np0005541913.localdomain podman[66402]: 2025-12-02 08:17:28.09973315 +0000 UTC m=+0.036400120 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 08:17:28 np0005541913.localdomain podman[66402]: 2025-12-02 08:17:28.206324983 +0000 UTC m=+0.142991933 container start 615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_taussig, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, release=1763362218, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, version=7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Dec 02 08:17:28 np0005541913.localdomain podman[66402]: 2025-12-02 08:17:28.20658972 +0000 UTC m=+0.143256710 container attach 615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_taussig, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 08:17:28 np0005541913.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Dec 02 08:17:28 np0005541913.localdomain sudo[66218]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:28 np0005541913.localdomain sudo[66436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbwuuaafnxhpdqryxlvgvhdyxhwuclan ; /usr/bin/python3
Dec 02 08:17:28 np0005541913.localdomain sudo[66436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:28 np0005541913.localdomain python3[66440]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:17:28 np0005541913.localdomain sudo[66436]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-af9ff8a7a12f523ef36617097e6108784abba47ec48d635a283b941c3d6d20d8-merged.mount: Deactivated successfully.
Dec 02 08:17:29 np0005541913.localdomain sudo[67707]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngwslnbibnuhneztvgrfcasjzsszlbop ; /usr/bin/python3
Dec 02 08:17:29 np0005541913.localdomain sudo[67707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]: [
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:     {
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         "available": false,
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         "ceph_device": false,
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         "lsm_data": {},
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         "lvs": [],
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         "path": "/dev/sr0",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         "rejected_reasons": [
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "Insufficient space (<5GB)",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "Has a FileSystem"
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         ],
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         "sys_api": {
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "actuators": null,
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "device_nodes": "sr0",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "human_readable_size": "482.00 KB",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "id_bus": "ata",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "model": "QEMU DVD-ROM",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "nr_requests": "2",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "partitions": {},
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "path": "/dev/sr0",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "removable": "1",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "rev": "2.5+",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "ro": "0",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "rotational": "1",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "sas_address": "",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "sas_device_handle": "",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "scheduler_mode": "mq-deadline",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "sectors": 0,
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "sectorsize": "2048",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "size": 493568.0,
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "support_discard": "0",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "type": "disk",
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:             "vendor": "QEMU"
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:         }
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]:     }
Dec 02 08:17:29 np0005541913.localdomain stupefied_taussig[66416]: ]
Dec 02 08:17:29 np0005541913.localdomain systemd[1]: libpod-615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c.scope: Deactivated successfully.
Dec 02 08:17:29 np0005541913.localdomain podman[66402]: 2025-12-02 08:17:29.128519746 +0000 UTC m=+1.065186676 container died 615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_taussig, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, RELEASE=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, io.buildah.version=1.41.4, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:17:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5c0c7febe39725fab0d61a6143d3683d801789cb15e98b14674d3e7becd398c4-merged.mount: Deactivated successfully.
Dec 02 08:17:29 np0005541913.localdomain podman[68053]: 2025-12-02 08:17:29.234135833 +0000 UTC m=+0.092232697 container remove 615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_taussig, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-type=git, RELEASE=main, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Dec 02 08:17:29 np0005541913.localdomain systemd[1]: libpod-conmon-615be9ead2c5dcd4403460bc5a56ec910cfcefc1a40bf2568e8f2ae7777b810c.scope: Deactivated successfully.
Dec 02 08:17:29 np0005541913.localdomain sudo[66304]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:29 np0005541913.localdomain sudo[67707]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:29 np0005541913.localdomain sudo[68081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhmkpvtbupxzopspxlvskjzshxerzcuu ; /usr/bin/python3
Dec 02 08:17:29 np0005541913.localdomain sudo[68081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:29 np0005541913.localdomain sudo[68081]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:30 np0005541913.localdomain sudo[68170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:17:30 np0005541913.localdomain sudo[68170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:17:30 np0005541913.localdomain sudo[68170]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:30 np0005541913.localdomain sudo[68198]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muojgrbuzwmipkydlxuihbozcdoxwcip ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663449.7532995-107547-160712638006330/async_wrapper.py 107953783535 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663449.7532995-107547-160712638006330/AnsiballZ_command.py _
Dec 02 08:17:30 np0005541913.localdomain sudo[68198]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 08:17:30 np0005541913.localdomain ansible-async_wrapper.py[68202]: Invoked with 107953783535 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663449.7532995-107547-160712638006330/AnsiballZ_command.py _
Dec 02 08:17:30 np0005541913.localdomain ansible-async_wrapper.py[68205]: Starting module and watcher
Dec 02 08:17:30 np0005541913.localdomain ansible-async_wrapper.py[68205]: Start watching 68206 (3600)
Dec 02 08:17:30 np0005541913.localdomain ansible-async_wrapper.py[68206]: Start module (68206)
Dec 02 08:17:30 np0005541913.localdomain ansible-async_wrapper.py[68202]: Return async_wrapper task started.
Dec 02 08:17:30 np0005541913.localdomain sudo[68198]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:30 np0005541913.localdomain sudo[68221]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uioffbxtyuyobawgiggkhxcqigpuavyg ; /usr/bin/python3
Dec 02 08:17:30 np0005541913.localdomain sudo[68221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:30 np0005541913.localdomain python3[68226]: ansible-ansible.legacy.async_status Invoked with jid=107953783535.68202 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:17:30 np0005541913.localdomain sudo[68221]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:17:31 np0005541913.localdomain podman[68240]: 2025-12-02 08:17:31.450233068 +0000 UTC m=+0.084532763 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Dec 02 08:17:31 np0005541913.localdomain podman[68240]: 2025-12-02 08:17:31.645926131 +0000 UTC m=+0.280225806 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd)
Dec 02 08:17:31 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (file & line not available)
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (file & line not available)
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 02 08:17:34 np0005541913.localdomain puppet-user[68225]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.21 seconds
Dec 02 08:17:35 np0005541913.localdomain ansible-async_wrapper.py[68205]: 68206 still running (3600)
Dec 02 08:17:40 np0005541913.localdomain ansible-async_wrapper.py[68205]: 68206 still running (3595)
Dec 02 08:17:40 np0005541913.localdomain sudo[68453]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbohabgnkqptphlbndtcwwgvxpjijxbj ; /usr/bin/python3
Dec 02 08:17:40 np0005541913.localdomain sudo[68453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:40 np0005541913.localdomain python3[68455]: ansible-ansible.legacy.async_status Invoked with jid=107953783535.68202 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:17:40 np0005541913.localdomain sudo[68453]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:41 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 08:17:41 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 08:17:41 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:41 np0005541913.localdomain systemd-sysv-generator[68535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:41 np0005541913.localdomain systemd-rc-local-generator[68528]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:41 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 08:17:42 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 08:17:42 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 08:17:42 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.217s CPU time.
Dec 02 08:17:42 np0005541913.localdomain systemd[1]: run-rf94fc1652d754b0fbd559e1a9ee4d21e.service: Deactivated successfully.
Dec 02 08:17:43 np0005541913.localdomain puppet-user[68225]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Dec 02 08:17:43 np0005541913.localdomain puppet-user[68225]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}73f5e19a2a837d39ae500d3423960b5d4f0c8c0d1d962d4648bd104a53eb0bb3'
Dec 02 08:17:43 np0005541913.localdomain puppet-user[68225]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Dec 02 08:17:43 np0005541913.localdomain puppet-user[68225]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Dec 02 08:17:43 np0005541913.localdomain puppet-user[68225]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Dec 02 08:17:43 np0005541913.localdomain puppet-user[68225]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Dec 02 08:17:45 np0005541913.localdomain ansible-async_wrapper.py[68205]: 68206 still running (3590)
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:17:48 np0005541913.localdomain puppet-user[68225]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: tmp-crun.gm6PYz.mount: Deactivated successfully.
Dec 02 08:17:48 np0005541913.localdomain podman[69559]: 2025-12-02 08:17:48.456410944 +0000 UTC m=+0.086378403 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:48 np0005541913.localdomain podman[69559]: 2025-12-02 08:17:48.492775672 +0000 UTC m=+0.122743171 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3)
Dec 02 08:17:48 np0005541913.localdomain podman[69558]: 2025-12-02 08:17:48.509339161 +0000 UTC m=+0.139351681 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z)
Dec 02 08:17:48 np0005541913.localdomain podman[69558]: 2025-12-02 08:17:48.52086531 +0000 UTC m=+0.150877810 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, release=1761123044)
Dec 02 08:17:48 np0005541913.localdomain systemd-sysv-generator[69627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:48 np0005541913.localdomain systemd-rc-local-generator[69620]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Dec 02 08:17:48 np0005541913.localdomain snmpd[69635]: Can't find directory of RPM packages
Dec 02 08:17:48 np0005541913.localdomain snmpd[69635]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Dec 02 08:17:48 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:49 np0005541913.localdomain systemd-rc-local-generator[69658]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:49 np0005541913.localdomain systemd-sysv-generator[69661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:49 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:17:49 np0005541913.localdomain systemd-rc-local-generator[69695]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:17:49 np0005541913.localdomain systemd-sysv-generator[69700]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:17:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]: Notice: Applied catalog in 15.20 seconds
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]: Application:
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:    Initial environment: production
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:    Converged environment: production
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:          Run mode: user
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]: Changes:
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:             Total: 8
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]: Events:
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:           Success: 8
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:             Total: 8
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]: Resources:
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:         Restarted: 1
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:           Changed: 8
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:       Out of sync: 8
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:             Total: 19
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]: Time:
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:        Filebucket: 0.00
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:          Schedule: 0.00
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:            Augeas: 0.01
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:              File: 0.06
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:    Config retrieval: 0.27
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:           Service: 1.19
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:    Transaction evaluation: 15.19
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:    Catalog application: 15.20
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:          Last run: 1764663469
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:              Exec: 5.05
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:           Package: 8.75
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:             Total: 15.21
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]: Version:
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:            Config: 1764663454
Dec 02 08:17:49 np0005541913.localdomain puppet-user[68225]:            Puppet: 7.10.0
Dec 02 08:17:49 np0005541913.localdomain ansible-async_wrapper.py[68206]: Module complete (68206)
Dec 02 08:17:50 np0005541913.localdomain ansible-async_wrapper.py[68205]: Done in kid B.
Dec 02 08:17:51 np0005541913.localdomain sudo[69720]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmhsibbqcnxjwqgddpnbowkfvdotxatq ; /usr/bin/python3
Dec 02 08:17:51 np0005541913.localdomain sudo[69720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:51 np0005541913.localdomain python3[69722]: ansible-ansible.legacy.async_status Invoked with jid=107953783535.68202 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:17:51 np0005541913.localdomain sudo[69720]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:51 np0005541913.localdomain sudo[69736]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scfusfxlchexgemruavjvwvdamxfohgu ; /usr/bin/python3
Dec 02 08:17:51 np0005541913.localdomain sudo[69736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:51 np0005541913.localdomain python3[69738]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 08:17:51 np0005541913.localdomain sudo[69736]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:52 np0005541913.localdomain sudo[69752]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdckrnlpkahiuetbfkvguvxuergujzzg ; /usr/bin/python3
Dec 02 08:17:52 np0005541913.localdomain sudo[69752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:52 np0005541913.localdomain python3[69754]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:17:52 np0005541913.localdomain sudo[69752]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:52 np0005541913.localdomain sudo[69802]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tirakmmhzthqhemhlozhlaropkrvtuzn ; /usr/bin/python3
Dec 02 08:17:52 np0005541913.localdomain sudo[69802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:52 np0005541913.localdomain python3[69804]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:52 np0005541913.localdomain sudo[69802]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:52 np0005541913.localdomain sudo[69820]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynneibvobdsvqnyrvdxccgniltqkhgsa ; /usr/bin/python3
Dec 02 08:17:52 np0005541913.localdomain sudo[69820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:53 np0005541913.localdomain python3[69822]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp92itetqj recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 08:17:53 np0005541913.localdomain sudo[69820]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:53 np0005541913.localdomain sudo[69850]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujvtorpbeywavxqowrbvbqllkbjxdzri ; /usr/bin/python3
Dec 02 08:17:53 np0005541913.localdomain sudo[69850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:53 np0005541913.localdomain python3[69852]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:53 np0005541913.localdomain sudo[69850]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:53 np0005541913.localdomain sudo[69866]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhtyizxstujzgxiwznthjqtpndneldem ; /usr/bin/python3
Dec 02 08:17:53 np0005541913.localdomain sudo[69866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:54 np0005541913.localdomain sudo[69866]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:54 np0005541913.localdomain sudo[69953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpcqgfvsjlshcftindesqbzdnuljpbii ; /usr/bin/python3
Dec 02 08:17:54 np0005541913.localdomain sudo[69953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:54 np0005541913.localdomain python3[69955]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 02 08:17:54 np0005541913.localdomain sudo[69953]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:55 np0005541913.localdomain sudo[69972]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fylupetpxywcqlwejtgftdbcbboqmofm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:55 np0005541913.localdomain sudo[69972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:55 np0005541913.localdomain python3[69974]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:55 np0005541913.localdomain sudo[69972]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:55 np0005541913.localdomain sudo[69988]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clakcizklelxzgstkimcqpmysbrpsxbf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:55 np0005541913.localdomain sudo[69988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:55 np0005541913.localdomain sudo[69988]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:56 np0005541913.localdomain sudo[70004]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjwjcyhxcvtdzibhwcakjudciwbqayut ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:56 np0005541913.localdomain sudo[70004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:56 np0005541913.localdomain python3[70006]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:17:56 np0005541913.localdomain sudo[70004]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:56 np0005541913.localdomain sudo[70054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgrzqvxtptfixdwywbijrrcxhzgfofwg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:56 np0005541913.localdomain sudo[70054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:56 np0005541913.localdomain python3[70056]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:56 np0005541913.localdomain sudo[70054]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:56 np0005541913.localdomain sudo[70072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecwmwfunktbvvmslaxqwkmgyyystkccp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:56 np0005541913.localdomain sudo[70072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:57 np0005541913.localdomain python3[70074]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:57 np0005541913.localdomain sudo[70072]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:57 np0005541913.localdomain sudo[70134]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twalwnyvtpkjdoxwqajyvivpsjmolcvc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:57 np0005541913.localdomain sudo[70134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:57 np0005541913.localdomain python3[70136]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:57 np0005541913.localdomain sudo[70134]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:57 np0005541913.localdomain sudo[70152]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czoitdhcbrapcgnltwceezobrfysifzs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:57 np0005541913.localdomain sudo[70152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:57 np0005541913.localdomain python3[70154]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:57 np0005541913.localdomain sudo[70152]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:58 np0005541913.localdomain sudo[70214]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvladdxweuetxlltyogvxhhtlfqsvexm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:58 np0005541913.localdomain sudo[70214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:58 np0005541913.localdomain python3[70216]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:58 np0005541913.localdomain sudo[70214]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:58 np0005541913.localdomain sudo[70232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xitoyeyineozbblsxsjanvjwdwxmdlcc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:58 np0005541913.localdomain sudo[70232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:58 np0005541913.localdomain python3[70234]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:58 np0005541913.localdomain sudo[70232]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:59 np0005541913.localdomain sudo[70294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amebvzdbprptrovchtsszdscgfvmyusp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:59 np0005541913.localdomain sudo[70294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:59 np0005541913.localdomain python3[70296]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:17:59 np0005541913.localdomain sudo[70294]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:59 np0005541913.localdomain sudo[70312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fffbjpuuknsvrfrfnupxvvoaieexzoht ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:59 np0005541913.localdomain sudo[70312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:17:59 np0005541913.localdomain python3[70314]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:17:59 np0005541913.localdomain sudo[70312]: pam_unix(sudo:session): session closed for user root
Dec 02 08:17:59 np0005541913.localdomain sudo[70342]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywlhvzclsbskfcfoufbakfxmwgcpbtdx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:17:59 np0005541913.localdomain sudo[70342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:00 np0005541913.localdomain python3[70344]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:18:00 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:00 np0005541913.localdomain systemd-sysv-generator[70372]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:00 np0005541913.localdomain systemd-rc-local-generator[70369]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:00 np0005541913.localdomain sudo[70342]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:00 np0005541913.localdomain sudo[70428]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lswkldhnelwuirszgwafatifeomkzqzv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:00 np0005541913.localdomain sudo[70428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:00 np0005541913.localdomain python3[70430]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:18:00 np0005541913.localdomain sudo[70428]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:01 np0005541913.localdomain sudo[70446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztuozjsmvjztwctlqcfwthulumhfjvir ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:01 np0005541913.localdomain sudo[70446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:01 np0005541913.localdomain python3[70448]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:01 np0005541913.localdomain sudo[70446]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:01 np0005541913.localdomain sudo[70508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrdxwcxnuiccbfdzakyokmzsopcqgubp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:01 np0005541913.localdomain sudo[70508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:01 np0005541913.localdomain python3[70510]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:18:01 np0005541913.localdomain sudo[70508]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:01 np0005541913.localdomain sudo[70526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjmbcbyunukkaioffwnpgibbwuxzfrow ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:01 np0005541913.localdomain sudo[70526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:18:01 np0005541913.localdomain systemd[1]: tmp-crun.qtjLhd.mount: Deactivated successfully.
Dec 02 08:18:01 np0005541913.localdomain podman[70528]: 2025-12-02 08:18:01.959517567 +0000 UTC m=+0.092621608 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:18:02 np0005541913.localdomain python3[70529]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:02 np0005541913.localdomain sudo[70526]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:02 np0005541913.localdomain podman[70528]: 2025-12-02 08:18:02.140150762 +0000 UTC m=+0.273254823 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:18:02 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:18:02 np0005541913.localdomain sudo[70586]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bthnhzelviazqkxttrpzhcaclzhzevyr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:02 np0005541913.localdomain sudo[70586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:02 np0005541913.localdomain python3[70588]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:18:02 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:02 np0005541913.localdomain systemd-rc-local-generator[70610]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:02 np0005541913.localdomain systemd-sysv-generator[70613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:02 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:02 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 08:18:02 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 08:18:02 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 08:18:03 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 08:18:03 np0005541913.localdomain sudo[70586]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:03 np0005541913.localdomain sudo[70642]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ianxwamdkhsqafapuiyszdticrpcptbz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:03 np0005541913.localdomain sudo[70642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:03 np0005541913.localdomain python3[70644]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 02 08:18:03 np0005541913.localdomain sudo[70642]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:03 np0005541913.localdomain sudo[70658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgyjayqxifgtriawkbsvifrmvmkvdomu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:03 np0005541913.localdomain sudo[70658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:04 np0005541913.localdomain sudo[70658]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:05 np0005541913.localdomain sudo[70700]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aucpatkamaqvimpsvgtmrugwkhnbavnp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:05 np0005541913.localdomain sudo[70700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:05 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 02 08:18:05 np0005541913.localdomain podman[70877]: 2025-12-02 08:18:05.939162467 +0000 UTC m=+0.070232637 container create 656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, container_name=configure_cms_options, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:18:05 np0005541913.localdomain podman[70906]: 2025-12-02 08:18:05.977773617 +0000 UTC m=+0.086209290 container create 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 02 08:18:05 np0005541913.localdomain podman[70876]: 2025-12-02 08:18:05.899856528 +0000 UTC m=+0.031661258 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 02 08:18:06 np0005541913.localdomain podman[70881]: 2025-12-02 08:18:06.003409717 +0000 UTC m=+0.127751441 container create 5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, 
config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_libvirt_init_secret, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0.scope.
Dec 02 08:18:06 np0005541913.localdomain podman[70877]: 2025-12-02 08:18:05.906581994 +0000 UTC m=+0.037652154 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.scope.
Dec 02 08:18:06 np0005541913.localdomain podman[70881]: 2025-12-02 08:18:05.911687355 +0000 UTC m=+0.036029149 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fddcd6dd4df186203ff55efce1dca7750680c9de7878dc7d77dfefe109af9b62/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:06 np0005541913.localdomain podman[70906]: 2025-12-02 08:18:05.929438808 +0000 UTC m=+0.037874471 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe.scope.
Dec 02 08:18:06 np0005541913.localdomain podman[70878]: 2025-12-02 08:18:06.050397909 +0000 UTC m=+0.177622833 container create 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b94aabaeee0e41c77050836c47e281aafe3b0b49cec59de508354f5d2967adc/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b94aabaeee0e41c77050836c47e281aafe3b0b49cec59de508354f5d2967adc/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b94aabaeee0e41c77050836c47e281aafe3b0b49cec59de508354f5d2967adc/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:18:06 np0005541913.localdomain podman[70906]: 2025-12-02 08:18:06.06485581 +0000 UTC m=+0.173291503 container init 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:18:06 np0005541913.localdomain podman[70881]: 2025-12-02 08:18:06.068886001 +0000 UTC m=+0.193227755 container init 5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_libvirt_init_secret, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:18:06 np0005541913.localdomain podman[70881]: 2025-12-02 08:18:06.076097141 +0000 UTC m=+0.200438865 container start 5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:18:06 np0005541913.localdomain podman[70881]: 2025-12-02 08:18:06.076233485 +0000 UTC m=+0.200575269 container attach 5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, batch=17.1_20251118.1, container_name=nova_libvirt_init_secret, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, architecture=x86_64)
Dec 02 08:18:06 np0005541913.localdomain podman[70877]: 2025-12-02 08:18:06.080391621 +0000 UTC m=+0.211461801 container init 656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, release=1761123044, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Dec 02 08:18:06 np0005541913.localdomain sudo[70959]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.scope.
Dec 02 08:18:06 np0005541913.localdomain podman[70876]: 2025-12-02 08:18:06.088785403 +0000 UTC m=+0.220590103 container create 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:18:06 np0005541913.localdomain sudo[70959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:18:06 np0005541913.localdomain podman[70906]: 2025-12-02 08:18:06.104794466 +0000 UTC m=+0.213230129 container start 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Dec 02 08:18:06 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=72848ce4d815e5b4e89ff3e01c5f9f7e --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:06 np0005541913.localdomain podman[70878]: 2025-12-02 08:18:06.014106004 +0000 UTC m=+0.141330978 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.scope.
Dec 02 08:18:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5dc9262725001f2f73a799452ce705d444359a7e34fc5a93c05c8a39696c355/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8a416db81901f96d6fd72f5969e70208d019cecbe75cef9d1ed7630b319da67/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:06 np0005541913.localdomain sudo[70959]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:06 np0005541913.localdomain podman[70877]: 2025-12-02 08:18:06.140256449 +0000 UTC m=+0.271326609 container start 656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, version=17.1.12, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:18:06 np0005541913.localdomain podman[70877]: 2025-12-02 08:18:06.140955758 +0000 UTC m=+0.272025938 container attach 656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, container_name=configure_cms_options, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:18:06 np0005541913.localdomain podman[70878]: 2025-12-02 08:18:06.154057982 +0000 UTC m=+0.281282896 container init 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
architecture=x86_64, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron)
Dec 02 08:18:06 np0005541913.localdomain sudo[71003]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:18:06 np0005541913.localdomain sudo[71003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:18:06 np0005541913.localdomain podman[70878]: 2025-12-02 08:18:06.184134425 +0000 UTC m=+0.311359339 container start 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:18:06 np0005541913.localdomain podman[70876]: 2025-12-02 08:18:06.18866579 +0000 UTC m=+0.320470510 container init 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:18:06 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 02 08:18:06 np0005541913.localdomain sudo[71028]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:18:06 np0005541913.localdomain sudo[71028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:18:06 np0005541913.localdomain ovs-vsctl[71035]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Dec 02 08:18:06 np0005541913.localdomain sudo[71003]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: libpod-656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0.scope: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain podman[70876]: 2025-12-02 08:18:06.229973545 +0000 UTC m=+0.361778245 container start 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: libpod-5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe.scope: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain podman[70877]: 2025-12-02 08:18:06.230747246 +0000 UTC m=+0.361817426 container died 656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:18:06 np0005541913.localdomain crond[71000]: (CRON) STARTUP (1.5.7)
Dec 02 08:18:06 np0005541913.localdomain crond[71000]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 41% if used.)
Dec 02 08:18:06 np0005541913.localdomain crond[71000]: (CRON) INFO (running with inotify support)
Dec 02 08:18:06 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=72848ce4d815e5b4e89ff3e01c5f9f7e --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 02 08:18:06 np0005541913.localdomain podman[70969]: 2025-12-02 08:18:06.253751623 +0000 UTC m=+0.150633254 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:18:06 np0005541913.localdomain sudo[71028]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:06 np0005541913.localdomain podman[70881]: 2025-12-02 08:18:06.284943009 +0000 UTC m=+0.409284753 container died 5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, container_name=nova_libvirt_init_secret, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 02 08:18:06 np0005541913.localdomain podman[71044]: 2025-12-02 08:18:06.385547865 +0000 UTC m=+0.148211117 container cleanup 656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, container_name=configure_cms_options, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: libpod-conmon-656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0.scope: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764661676 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Dec 02 08:18:06 np0005541913.localdomain podman[70969]: 2025-12-02 08:18:06.398519585 +0000 UTC m=+0.295401226 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:18:06 np0005541913.localdomain podman[70969]: unhealthy
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed with result 'exit-code'.
Dec 02 08:18:06 np0005541913.localdomain podman[71057]: 2025-12-02 08:18:06.437149526 +0000 UTC m=+0.190865980 container cleanup 5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: libpod-conmon-5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe.scope: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Dec 02 08:18:06 np0005541913.localdomain podman[71015]: 2025-12-02 08:18:06.370293273 +0000 UTC m=+0.181361457 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:32Z, 
config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:18:06 np0005541913.localdomain podman[71039]: 2025-12-02 08:18:06.520575707 +0000 UTC m=+0.285340897 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, 
url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, vcs-type=git)
Dec 02 08:18:06 np0005541913.localdomain podman[71039]: 2025-12-02 08:18:06.533829134 +0000 UTC m=+0.298594334 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Dec 02 08:18:06 np0005541913.localdomain podman[71039]: unhealthy
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Failed with result 'exit-code'.
Dec 02 08:18:06 np0005541913.localdomain podman[71015]: 2025-12-02 08:18:06.554478987 +0000 UTC m=+0.365547161 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain podman[71239]: 2025-12-02 08:18:06.666703646 +0000 UTC m=+0.072111359 container create 8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, version=17.1.12, container_name=setup_ovs_manager, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e.scope.
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:06 np0005541913.localdomain podman[71239]: 2025-12-02 08:18:06.623831539 +0000 UTC m=+0.029239282 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 02 08:18:06 np0005541913.localdomain podman[71239]: 2025-12-02 08:18:06.728886899 +0000 UTC m=+0.134294622 container init 8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=setup_ovs_manager, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:18:06 np0005541913.localdomain podman[71239]: 2025-12-02 08:18:06.73686712 +0000 UTC m=+0.142274823 container start 8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:14:25Z, container_name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 02 08:18:06 np0005541913.localdomain podman[71239]: 2025-12-02 08:18:06.736999764 +0000 UTC m=+0.142407467 container attach 8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, container_name=setup_ovs_manager, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 02 08:18:06 np0005541913.localdomain podman[71265]: 2025-12-02 08:18:06.76212194 +0000 UTC m=+0.070352540 container create 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.scope.
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aed02a8eef27d7fad5076c16a3501516599cfd6963ae4f4d75e8f0b164242bc5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:06 np0005541913.localdomain podman[71265]: 2025-12-02 08:18:06.719252283 +0000 UTC m=+0.027482883 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:18:06 np0005541913.localdomain podman[71265]: 2025-12-02 08:18:06.848066332 +0000 UTC m=+0.156296922 container init 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:18:06 np0005541913.localdomain sudo[71294]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:18:06 np0005541913.localdomain sudo[71294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:18:06 np0005541913.localdomain podman[71265]: 2025-12-02 08:18:06.891761052 +0000 UTC m=+0.199991682 container start 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4)
Dec 02 08:18:06 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro 
--volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0b94aabaeee0e41c77050836c47e281aafe3b0b49cec59de508354f5d2967adc-merged.mount: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5417040510e8bae38145b491ecca5f879da783fbeaa8939a882be5d9417513fe-userdata-shm.mount: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a7c14a8989e4d415fd166e88f713e89b4166ed5e691e2325b8968269ca1a9aa5-merged.mount: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-656b9b116d635a442ed314ff9c3944ec3c499c0f90954d885e293495308ceba0-userdata-shm.mount: Deactivated successfully.
Dec 02 08:18:06 np0005541913.localdomain podman[71296]: 2025-12-02 08:18:06.964688773 +0000 UTC m=+0.070168266 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:18:06 np0005541913.localdomain sudo[71294]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:06 np0005541913.localdomain sshd[71327]: Server listening on 0.0.0.0 port 2022.
Dec 02 08:18:06 np0005541913.localdomain sshd[71327]: Server listening on :: port 2022.
Dec 02 08:18:07 np0005541913.localdomain sudo[71344]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxmx1g3t9/privsep.sock
Dec 02 08:18:07 np0005541913.localdomain sudo[71344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 02 08:18:07 np0005541913.localdomain podman[71296]: 2025-12-02 08:18:07.125967632 +0000 UTC m=+0.231447145 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:18:07 np0005541913.localdomain podman[71296]: unhealthy
Dec 02 08:18:07 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:18:07 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Failed with result 'exit-code'.
Dec 02 08:18:07 np0005541913.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 02 08:18:07 np0005541913.localdomain sudo[71344]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:09 np0005541913.localdomain ovs-vsctl[71471]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 02 08:18:09 np0005541913.localdomain systemd[1]: libpod-8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e.scope: Deactivated successfully.
Dec 02 08:18:09 np0005541913.localdomain systemd[1]: libpod-8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e.scope: Consumed 2.929s CPU time.
Dec 02 08:18:09 np0005541913.localdomain podman[71239]: 2025-12-02 08:18:09.697208718 +0000 UTC m=+3.102616531 container died 8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, container_name=setup_ovs_manager, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:18:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e-userdata-shm.mount: Deactivated successfully.
Dec 02 08:18:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a423cc2ecc4b4a7a413eebe91da3e0f5986adaa9cffa0acabc604ad76a95339a-merged.mount: Deactivated successfully.
Dec 02 08:18:09 np0005541913.localdomain podman[71473]: 2025-12-02 08:18:09.797402813 +0000 UTC m=+0.082920618 container cleanup 8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, container_name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:18:09 np0005541913.localdomain systemd[1]: libpod-conmon-8b8f203b1106160e8add055392f6e9894b323cfd37e74c53036d6af648edb28e.scope: Deactivated successfully.
Dec 02 08:18:09 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764661676 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764661676'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Dec 02 08:18:10 np0005541913.localdomain podman[71586]: 2025-12-02 08:18:10.203044113 +0000 UTC m=+0.065827694 container create 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, distribution-scope=public, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com)
Dec 02 08:18:10 np0005541913.localdomain podman[71585]: 2025-12-02 08:18:10.234405792 +0000 UTC m=+0.098581502 container create e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1)
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started libpod-conmon-1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.scope.
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started libpod-conmon-e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.scope.
Dec 02 08:18:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1af3edb87ae84c24194878020e22370aba8355c75888d8a0972cd3b1ac86c8/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1af3edb87ae84c24194878020e22370aba8355c75888d8a0972cd3b1ac86c8/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1af3edb87ae84c24194878020e22370aba8355c75888d8a0972cd3b1ac86c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:10 np0005541913.localdomain podman[71586]: 2025-12-02 08:18:10.165865323 +0000 UTC m=+0.028648924 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 02 08:18:10 np0005541913.localdomain podman[71585]: 2025-12-02 08:18:10.178285007 +0000 UTC m=+0.042460787 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:18:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa2735d70b4229c33d88157dc663cc996128839f7744195fee819ab923e68e6b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa2735d70b4229c33d88157dc663cc996128839f7744195fee819ab923e68e6b/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa2735d70b4229c33d88157dc663cc996128839f7744195fee819ab923e68e6b/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:18:10 np0005541913.localdomain podman[71586]: 2025-12-02 08:18:10.300503264 +0000 UTC m=+0.163286835 container init 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:18:10 np0005541913.localdomain podman[71585]: 2025-12-02 08:18:10.319715866 +0000 UTC m=+0.183891586 container init e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:18:10 np0005541913.localdomain sudo[71623]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:18:10 np0005541913.localdomain sudo[71623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:18:10 np0005541913.localdomain podman[71586]: 2025-12-02 08:18:10.344444862 +0000 UTC m=+0.207228453 container start 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Dec 02 08:18:10 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d1544001d5773d0045aaf61439ef5e02 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True 
--volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 02 08:18:10 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:18:10 np0005541913.localdomain podman[71585]: 2025-12-02 08:18:10.388538043 +0000 UTC m=+0.252713753 container start e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 02 08:18:10 np0005541913.localdomain python3[70702]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 02 08:18:10 np0005541913.localdomain sudo[71623]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:18:10 np0005541913.localdomain podman[71628]: 2025-12-02 08:18:10.448808293 +0000 UTC m=+0.100900066 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, io.openshift.expose-services=, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack 
TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:18:10 np0005541913.localdomain podman[71647]: 2025-12-02 08:18:10.533736336 +0000 UTC m=+0.143654001 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Queued start job for default target Main User Target.
Dec 02 08:18:10 np0005541913.localdomain podman[71628]: 2025-12-02 08:18:10.562260006 +0000 UTC m=+0.214351669 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_metadata_agent, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Created slice User Application Slice.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Reached target Paths.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Reached target Timers.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Starting D-Bus User Message Bus Socket...
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Starting Create User's Volatile Files and Directories...
Dec 02 08:18:10 np0005541913.localdomain podman[71647]: 2025-12-02 08:18:10.572965973 +0000 UTC m=+0.182883648 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ovn-controller)
Dec 02 08:18:10 np0005541913.localdomain podman[71628]: unhealthy
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:18:10 np0005541913.localdomain podman[71647]: unhealthy
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Listening on D-Bus User Message Bus Socket.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Reached target Sockets.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Finished Create User's Volatile Files and Directories.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Reached target Basic System.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Reached target Main User Target.
Dec 02 08:18:10 np0005541913.localdomain systemd[71655]: Startup finished in 148ms.
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started User Manager for UID 0.
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: Started Session c9 of User root.
Dec 02 08:18:10 np0005541913.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Dec 02 08:18:10 np0005541913.localdomain kernel: device br-int entered promiscuous mode
Dec 02 08:18:10 np0005541913.localdomain NetworkManager[5965]: <info>  [1764663490.6957] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Dec 02 08:18:10 np0005541913.localdomain sudo[70700]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:10 np0005541913.localdomain systemd-udevd[71739]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 08:18:10 np0005541913.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Dec 02 08:18:10 np0005541913.localdomain NetworkManager[5965]: <info>  [1764663490.7337] device (genev_sys_6081): carrier: link connected
Dec 02 08:18:10 np0005541913.localdomain NetworkManager[5965]: <info>  [1764663490.7343] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Dec 02 08:18:10 np0005541913.localdomain sudo[71759]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayizmwpjpjygnbvdklviatipqwqjciyq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:10 np0005541913.localdomain sudo[71759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:11 np0005541913.localdomain python3[71761]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:11 np0005541913.localdomain sudo[71759]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:11 np0005541913.localdomain sudo[71775]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhkkikpogugewvyqhpeotedauowfyoib ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:11 np0005541913.localdomain sudo[71775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:11 np0005541913.localdomain python3[71777]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:11 np0005541913.localdomain sudo[71775]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:11 np0005541913.localdomain sudo[71791]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usymqfgbposszvzjslszkzpmazatmthz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:11 np0005541913.localdomain sudo[71791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:11 np0005541913.localdomain python3[71793]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:11 np0005541913.localdomain sudo[71791]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:11 np0005541913.localdomain sudo[71807]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byfrtoezuynfesiyfnjsmmxffsyruoek ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:11 np0005541913.localdomain sudo[71807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:11 np0005541913.localdomain python3[71809]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:11 np0005541913.localdomain sudo[71807]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:12 np0005541913.localdomain sudo[71823]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgseautqbgqkuotpfkfuwepmqqisqeob ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:12 np0005541913.localdomain sudo[71823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:12 np0005541913.localdomain sudo[71826]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmp_n98mxbl/privsep.sock
Dec 02 08:18:12 np0005541913.localdomain sudo[71826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Dec 02 08:18:12 np0005541913.localdomain python3[71827]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:12 np0005541913.localdomain sudo[71823]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:12 np0005541913.localdomain sudo[71843]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrdmkptsurmrpgaqoeihrznaoblkqkvd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:12 np0005541913.localdomain sudo[71843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:12 np0005541913.localdomain python3[71845]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:12 np0005541913.localdomain sudo[71843]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:12 np0005541913.localdomain sudo[71859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqgrpwkbcqaiorkvkycfocpverwcccvg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:12 np0005541913.localdomain sudo[71859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:12 np0005541913.localdomain sudo[71826]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:12 np0005541913.localdomain python3[71861]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:18:12 np0005541913.localdomain sudo[71859]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:12 np0005541913.localdomain sudo[71877]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqazoivqrmvzfhnjgweiylqrgngxpwmd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:12 np0005541913.localdomain sudo[71877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:12 np0005541913.localdomain python3[71879]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:18:12 np0005541913.localdomain sudo[71877]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:13 np0005541913.localdomain sudo[71895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkmpmawnamloelnfniwwrcjofpkufghj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:13 np0005541913.localdomain sudo[71895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:13 np0005541913.localdomain python3[71897]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:18:13 np0005541913.localdomain sudo[71895]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:13 np0005541913.localdomain sudo[71911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcruwuwlyetykeniavdqicliawuwoygm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:13 np0005541913.localdomain sudo[71911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:13 np0005541913.localdomain python3[71913]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:18:13 np0005541913.localdomain sudo[71911]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:13 np0005541913.localdomain sudo[71927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chjecvfkbrgcqwrbenbgrbsuhdisnrct ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:13 np0005541913.localdomain sudo[71927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:13 np0005541913.localdomain python3[71929]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:18:13 np0005541913.localdomain sudo[71927]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:13 np0005541913.localdomain sudo[71943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncccrxmmfkotqydvksiwthgyppbiovhu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:13 np0005541913.localdomain sudo[71943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:13 np0005541913.localdomain python3[71945]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:18:13 np0005541913.localdomain sudo[71943]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:14 np0005541913.localdomain sudo[72004]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjxecvkbonnmxbolzkchyvmpfmxwbiqv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:14 np0005541913.localdomain sudo[72004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:14 np0005541913.localdomain python3[72006]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663494.054966-108778-16394698894446/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:14 np0005541913.localdomain sudo[72004]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:14 np0005541913.localdomain sudo[72033]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbucfrlavithhtxvrslanqbbxnthprvv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:14 np0005541913.localdomain sudo[72033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:15 np0005541913.localdomain python3[72035]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663494.054966-108778-16394698894446/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:15 np0005541913.localdomain sudo[72033]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:15 np0005541913.localdomain sudo[72062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esejmpqcnrmocawojcdrblwnsxpzmeba ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:15 np0005541913.localdomain sudo[72062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:15 np0005541913.localdomain python3[72064]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663494.054966-108778-16394698894446/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:15 np0005541913.localdomain sudo[72062]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:16 np0005541913.localdomain sudo[72091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzlsvcyucntkeatmnqgysqgxismskjjf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:16 np0005541913.localdomain sudo[72091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:16 np0005541913.localdomain python3[72093]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663494.054966-108778-16394698894446/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:16 np0005541913.localdomain sudo[72091]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:16 np0005541913.localdomain sudo[72120]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qreqoiwmepqskmviezaiufrhcpqlwrae ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:16 np0005541913.localdomain sudo[72120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:16 np0005541913.localdomain python3[72122]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663494.054966-108778-16394698894446/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:16 np0005541913.localdomain sudo[72120]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:17 np0005541913.localdomain sudo[72149]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haxrdfvhmtacpfjuqiugmckmtmgpmnty ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:17 np0005541913.localdomain sudo[72149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:17 np0005541913.localdomain python3[72151]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663494.054966-108778-16394698894446/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:17 np0005541913.localdomain sudo[72149]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:17 np0005541913.localdomain sudo[72165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erfjzyhbbotpiwrnbfrsybfocekhbdti ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:17 np0005541913.localdomain sudo[72165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:17 np0005541913.localdomain python3[72167]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 08:18:17 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:17 np0005541913.localdomain systemd-sysv-generator[72194]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:17 np0005541913.localdomain systemd-rc-local-generator[72191]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:17 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:18 np0005541913.localdomain sudo[72165]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:18 np0005541913.localdomain sudo[72217]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmdemwspmdmzvlvidnwjsxtjahdiotnf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:18 np0005541913.localdomain sudo[72217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:18 np0005541913.localdomain python3[72219]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:18:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:18:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:18:18 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:18 np0005541913.localdomain podman[72222]: 2025-12-02 08:18:18.962940798 +0000 UTC m=+0.088199944 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:18:19 np0005541913.localdomain systemd-sysv-generator[72271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:19 np0005541913.localdomain systemd-rc-local-generator[72268]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:19 np0005541913.localdomain podman[72221]: 2025-12-02 08:18:19.044722984 +0000 UTC m=+0.168636773 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 02 08:18:19 np0005541913.localdomain podman[72222]: 2025-12-02 08:18:19.049695502 +0000 UTC m=+0.174954698 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:18:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:19 np0005541913.localdomain podman[72221]: 2025-12-02 08:18:19.104242314 +0000 UTC m=+0.228156103 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, distribution-scope=public, container_name=collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc.)
Dec 02 08:18:19 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:18:19 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:18:19 np0005541913.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 02 08:18:19 np0005541913.localdomain tripleo-start-podman-container[72295]: Creating additional drop-in dependency for "ceilometer_agent_compute" (4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be)
Dec 02 08:18:19 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:19 np0005541913.localdomain systemd-sysv-generator[72358]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:19 np0005541913.localdomain systemd-rc-local-generator[72355]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:19 np0005541913.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 02 08:18:19 np0005541913.localdomain sudo[72217]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:20 np0005541913.localdomain sudo[72378]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adaezgubzunnnrlsrjjmhnmipisnnvus ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:20 np0005541913.localdomain sudo[72378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:20 np0005541913.localdomain python3[72380]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:20 np0005541913.localdomain systemd-sysv-generator[72409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:20 np0005541913.localdomain systemd-rc-local-generator[72406]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Activating special unit Exit the Session...
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Stopped target Main User Target.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Stopped target Basic System.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Stopped target Paths.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Stopped target Sockets.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Stopped target Timers.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Closed D-Bus User Message Bus Socket.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Stopped Create User's Volatile Files and Directories.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Removed slice User Application Slice.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Reached target Shutdown.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Finished Exit the Session.
Dec 02 08:18:20 np0005541913.localdomain systemd[71655]: Reached target Exit the Session.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 02 08:18:20 np0005541913.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Dec 02 08:18:20 np0005541913.localdomain sudo[72378]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:21 np0005541913.localdomain sudo[72449]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgjnmxatyvehsflxhvdgdcbcfmcivoxk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:21 np0005541913.localdomain sudo[72449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:21 np0005541913.localdomain python3[72451]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:18:21 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:21 np0005541913.localdomain systemd-rc-local-generator[72474]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:21 np0005541913.localdomain systemd-sysv-generator[72480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:21 np0005541913.localdomain systemd[1]: Starting logrotate_crond container...
Dec 02 08:18:21 np0005541913.localdomain systemd[1]: Started logrotate_crond container.
Dec 02 08:18:21 np0005541913.localdomain sudo[72449]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:22 np0005541913.localdomain sudo[72515]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlrninznxssaqmhccatuoomhrwjwixsc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:22 np0005541913.localdomain sudo[72515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:22 np0005541913.localdomain python3[72517]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:18:23 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:23 np0005541913.localdomain systemd-rc-local-generator[72543]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:23 np0005541913.localdomain systemd-sysv-generator[72547]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:23 np0005541913.localdomain systemd[1]: Starting nova_migration_target container...
Dec 02 08:18:23 np0005541913.localdomain systemd[1]: Started nova_migration_target container.
Dec 02 08:18:23 np0005541913.localdomain sudo[72515]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:24 np0005541913.localdomain sudo[72583]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfupdjrqofksyymtpmrjjhozloaxytdj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:24 np0005541913.localdomain sudo[72583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:24 np0005541913.localdomain python3[72585]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:18:24 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:24 np0005541913.localdomain systemd-sysv-generator[72617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:24 np0005541913.localdomain systemd-rc-local-generator[72612]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:25 np0005541913.localdomain systemd[1]: Starting ovn_controller container...
Dec 02 08:18:25 np0005541913.localdomain tripleo-start-podman-container[72625]: Creating additional drop-in dependency for "ovn_controller" (e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b)
Dec 02 08:18:25 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:25 np0005541913.localdomain systemd-sysv-generator[72682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:25 np0005541913.localdomain systemd-rc-local-generator[72677]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:25 np0005541913.localdomain systemd[1]: Started ovn_controller container.
Dec 02 08:18:25 np0005541913.localdomain sudo[72583]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:26 np0005541913.localdomain sudo[72707]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhlpxebjunyvrruowvqurzkoharsrvex ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:18:26 np0005541913.localdomain sudo[72707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:26 np0005541913.localdomain python3[72709]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:18:26 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:18:26 np0005541913.localdomain systemd-rc-local-generator[72734]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:18:26 np0005541913.localdomain systemd-sysv-generator[72739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:18:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:18:26 np0005541913.localdomain systemd[1]: Starting ovn_metadata_agent container...
Dec 02 08:18:26 np0005541913.localdomain systemd[1]: Started ovn_metadata_agent container.
Dec 02 08:18:26 np0005541913.localdomain sudo[72707]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:27 np0005541913.localdomain sudo[72789]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdfjnqyfuyssawlgjrhfbvfylkfdtldc ; /usr/bin/python3
Dec 02 08:18:27 np0005541913.localdomain sudo[72789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:27 np0005541913.localdomain python3[72791]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:27 np0005541913.localdomain sudo[72789]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:27 np0005541913.localdomain sudo[72837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uphyiexxangkdgzsydqmbnjztvyedqva ; /usr/bin/python3
Dec 02 08:18:27 np0005541913.localdomain sudo[72837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:28 np0005541913.localdomain sudo[72837]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:28 np0005541913.localdomain sudo[72880]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcjoauhumxccpoirajcmlofkhhqubqow ; /usr/bin/python3
Dec 02 08:18:28 np0005541913.localdomain sudo[72880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:28 np0005541913.localdomain sudo[72880]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:28 np0005541913.localdomain sudo[72911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruqeebewrpujrkdmskzbeexakqjvxlml ; /usr/bin/python3
Dec 02 08:18:28 np0005541913.localdomain sudo[72911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:29 np0005541913.localdomain python3[72913]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005541913 step=4 update_config_hash_only=False
Dec 02 08:18:29 np0005541913.localdomain sudo[72911]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:29 np0005541913.localdomain sudo[72927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aodmbtkhluxhhmkuvtemqnjkcwzsjrkd ; /usr/bin/python3
Dec 02 08:18:29 np0005541913.localdomain sudo[72927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:29 np0005541913.localdomain python3[72929]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:18:29 np0005541913.localdomain sudo[72927]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:29 np0005541913.localdomain sudo[72943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtbitwwnqngrjfhpidwsfbhuimwhokjf ; /usr/bin/python3
Dec 02 08:18:29 np0005541913.localdomain sudo[72943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:18:30 np0005541913.localdomain python3[72945]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 02 08:18:30 np0005541913.localdomain sudo[72943]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:30 np0005541913.localdomain sudo[72946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:18:30 np0005541913.localdomain sudo[72946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:18:30 np0005541913.localdomain sudo[72946]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:30 np0005541913.localdomain sudo[72961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:18:30 np0005541913.localdomain sudo[72961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:18:31 np0005541913.localdomain sudo[72961]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:32 np0005541913.localdomain sudo[73007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:18:32 np0005541913.localdomain sudo[73007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:18:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:18:32 np0005541913.localdomain sudo[73007]: pam_unix(sudo:session): session closed for user root
Dec 02 08:18:32 np0005541913.localdomain podman[73022]: 2025-12-02 08:18:32.428117402 +0000 UTC m=+0.078308111 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr)
Dec 02 08:18:32 np0005541913.localdomain podman[73022]: 2025-12-02 08:18:32.62040788 +0000 UTC m=+0.270598569 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1)
Dec 02 08:18:32 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:18:37 np0005541913.localdomain podman[73052]: 2025-12-02 08:18:37.450229547 +0000 UTC m=+0.091252029 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=logrotate_crond, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:18:37 np0005541913.localdomain podman[73052]: 2025-12-02 08:18:37.458570928 +0000 UTC m=+0.099593480 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Dec 02 08:18:37 np0005541913.localdomain podman[73053]: 2025-12-02 08:18:37.496646014 +0000 UTC m=+0.137832290 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: tmp-crun.GVqAEm.mount: Deactivated successfully.
Dec 02 08:18:37 np0005541913.localdomain podman[73054]: 2025-12-02 08:18:37.578574844 +0000 UTC m=+0.212296063 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.openshift.expose-services=)
Dec 02 08:18:37 np0005541913.localdomain podman[73054]: 2025-12-02 08:18:37.60801168 +0000 UTC m=+0.241732899 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, tcib_managed=true, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute)
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:18:37 np0005541913.localdomain podman[73062]: 2025-12-02 08:18:37.662682424 +0000 UTC m=+0.292810904 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:18:37 np0005541913.localdomain podman[73062]: 2025-12-02 08:18:37.711547239 +0000 UTC m=+0.341675729 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:18:37 np0005541913.localdomain podman[73053]: 2025-12-02 08:18:37.929214309 +0000 UTC m=+0.570400505 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, io.buildah.version=1.41.4)
Dec 02 08:18:37 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:18:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:18:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:18:41 np0005541913.localdomain podman[73146]: 2025-12-02 08:18:41.441279286 +0000 UTC m=+0.079720750 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:18:41 np0005541913.localdomain systemd[1]: tmp-crun.gL0p5M.mount: Deactivated successfully.
Dec 02 08:18:41 np0005541913.localdomain podman[73145]: 2025-12-02 08:18:41.49197417 +0000 UTC m=+0.130551967 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, config_id=tripleo_step4)
Dec 02 08:18:41 np0005541913.localdomain podman[73146]: 2025-12-02 08:18:41.517510688 +0000 UTC m=+0.155952182 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com)
Dec 02 08:18:41 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:18:41 np0005541913.localdomain podman[73145]: 2025-12-02 08:18:41.550982115 +0000 UTC m=+0.189559892 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true)
Dec 02 08:18:41 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:18:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 08:18:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 08:18:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:18:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:18:49 np0005541913.localdomain podman[73194]: 2025-12-02 08:18:49.432268316 +0000 UTC m=+0.074094205 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container)
Dec 02 08:18:49 np0005541913.localdomain podman[73194]: 2025-12-02 08:18:49.446325735 +0000 UTC m=+0.088151634 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 02 08:18:49 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:18:49 np0005541913.localdomain podman[73193]: 2025-12-02 08:18:49.48369709 +0000 UTC m=+0.124293524 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64)
Dec 02 08:18:49 np0005541913.localdomain podman[73193]: 2025-12-02 08:18:49.490598391 +0000 UTC m=+0.131194895 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3)
Dec 02 08:18:49 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:19:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:19:03 np0005541913.localdomain systemd[1]: tmp-crun.0OCfyl.mount: Deactivated successfully.
Dec 02 08:19:03 np0005541913.localdomain podman[73232]: 2025-12-02 08:19:03.439091604 +0000 UTC m=+0.084254282 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:19:03 np0005541913.localdomain podman[73232]: 2025-12-02 08:19:03.623937287 +0000 UTC m=+0.269099945 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red 
Hat, Inc., name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:19:03 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:19:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:19:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:19:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:19:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:19:08 np0005541913.localdomain podman[73263]: 2025-12-02 08:19:08.450898646 +0000 UTC m=+0.085357293 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1)
Dec 02 08:19:08 np0005541913.localdomain podman[73262]: 2025-12-02 08:19:08.498356849 +0000 UTC m=+0.135398007 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Dec 02 08:19:08 np0005541913.localdomain podman[73265]: 2025-12-02 08:19:08.552291681 +0000 UTC m=+0.181447471 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com)
Dec 02 08:19:08 np0005541913.localdomain podman[73265]: 2025-12-02 08:19:08.587958067 +0000 UTC m=+0.217113787 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 02 08:19:08 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:19:08 np0005541913.localdomain podman[73264]: 2025-12-02 08:19:08.613657998 +0000 UTC m=+0.243679002 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute)
Dec 02 08:19:08 np0005541913.localdomain podman[73262]: 2025-12-02 08:19:08.635109191 +0000 UTC m=+0.272150429 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., 
architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, container_name=logrotate_crond, name=rhosp17/openstack-cron, batch=17.1_20251118.1, distribution-scope=public)
Dec 02 08:19:08 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:19:08 np0005541913.localdomain podman[73264]: 2025-12-02 08:19:08.666290094 +0000 UTC m=+0.296311128 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:19:08 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:19:08 np0005541913.localdomain podman[73263]: 2025-12-02 08:19:08.822907356 +0000 UTC m=+0.457365953 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 02 08:19:08 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:19:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:19:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:19:12 np0005541913.localdomain podman[73357]: 2025-12-02 08:19:12.477242417 +0000 UTC m=+0.068349192 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:19:12 np0005541913.localdomain podman[73358]: 2025-12-02 08:19:12.486736379 +0000 UTC m=+0.076340002 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-type=git, container_name=ovn_controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:19:12 np0005541913.localdomain podman[73358]: 2025-12-02 08:19:12.509652144 +0000 UTC m=+0.099255777 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, version=17.1.12, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:19:12 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:19:12 np0005541913.localdomain podman[73357]: 2025-12-02 08:19:12.545189406 +0000 UTC m=+0.136296161 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, 
build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:19:12 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:19:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:19:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:19:20 np0005541913.localdomain systemd[1]: tmp-crun.J6tsXh.mount: Deactivated successfully.
Dec 02 08:19:20 np0005541913.localdomain podman[73406]: 2025-12-02 08:19:20.451008955 +0000 UTC m=+0.084562850 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack 
Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:19:20 np0005541913.localdomain podman[73406]: 2025-12-02 08:19:20.466037361 +0000 UTC m=+0.099591316 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 
17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=)
Dec 02 08:19:20 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:19:20 np0005541913.localdomain podman[73405]: 2025-12-02 08:19:20.555881297 +0000 UTC m=+0.188622440 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:19:20 np0005541913.localdomain podman[73405]: 2025-12-02 08:19:20.596261133 +0000 UTC m=+0.229002286 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:51:28Z, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, description=Red 
Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 02 08:19:20 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:19:32 np0005541913.localdomain sudo[73446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:19:32 np0005541913.localdomain sudo[73446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:19:32 np0005541913.localdomain sudo[73446]: pam_unix(sudo:session): session closed for user root
Dec 02 08:19:32 np0005541913.localdomain sudo[73461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 08:19:32 np0005541913.localdomain sudo[73461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:19:33 np0005541913.localdomain podman[73546]: 2025-12-02 08:19:33.46885287 +0000 UTC m=+0.090064903 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.openshift.expose-services=)
Dec 02 08:19:33 np0005541913.localdomain podman[73546]: 2025-12-02 08:19:33.574247636 +0000 UTC m=+0.195459699 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, architecture=x86_64, release=1763362218)
Dec 02 08:19:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:19:33 np0005541913.localdomain podman[73593]: 2025-12-02 08:19:33.75191193 +0000 UTC m=+0.083235233 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container)
Dec 02 08:19:33 np0005541913.localdomain sudo[73461]: pam_unix(sudo:session): session closed for user root
Dec 02 08:19:33 np0005541913.localdomain podman[73593]: 2025-12-02 08:19:33.938908043 +0000 UTC m=+0.270231396 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., distribution-scope=public)
Dec 02 08:19:33 np0005541913.localdomain sudo[73643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:19:33 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:19:33 np0005541913.localdomain sudo[73643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:19:33 np0005541913.localdomain sudo[73643]: pam_unix(sudo:session): session closed for user root
Dec 02 08:19:34 np0005541913.localdomain sudo[73658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:19:34 np0005541913.localdomain sudo[73658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:19:34 np0005541913.localdomain sudo[73658]: pam_unix(sudo:session): session closed for user root
Dec 02 08:19:35 np0005541913.localdomain sudo[73705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:19:35 np0005541913.localdomain sudo[73705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:19:35 np0005541913.localdomain sudo[73705]: pam_unix(sudo:session): session closed for user root
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: tmp-crun.fGqmJh.mount: Deactivated successfully.
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: tmp-crun.PPpyBa.mount: Deactivated successfully.
Dec 02 08:19:39 np0005541913.localdomain podman[73721]: 2025-12-02 08:19:39.512824736 +0000 UTC m=+0.144226771 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:19:39 np0005541913.localdomain podman[73733]: 2025-12-02 08:19:39.473844248 +0000 UTC m=+0.094728702 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc.)
Dec 02 08:19:39 np0005541913.localdomain podman[73722]: 2025-12-02 08:19:39.540716208 +0000 UTC m=+0.163598558 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1)
Dec 02 08:19:39 np0005541913.localdomain podman[73733]: 2025-12-02 08:19:39.55311364 +0000 UTC m=+0.173998054 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 
17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:19:39 np0005541913.localdomain podman[73722]: 2025-12-02 08:19:39.595245176 +0000 UTC m=+0.218127546 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64)
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:19:39 np0005541913.localdomain podman[73720]: 2025-12-02 08:19:39.557175443 +0000 UTC m=+0.191445487 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Dec 02 08:19:39 np0005541913.localdomain podman[73720]: 2025-12-02 08:19:39.640002624 +0000 UTC m=+0.274272658 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, architecture=x86_64, distribution-scope=public, vcs-type=git, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:19:39 np0005541913.localdomain podman[73721]: 2025-12-02 08:19:39.86726174 +0000 UTC m=+0.498663725 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:19:39 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:19:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:19:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:19:43 np0005541913.localdomain podman[73811]: 2025-12-02 08:19:43.438299817 +0000 UTC m=+0.076783815 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:19:43 np0005541913.localdomain podman[73811]: 2025-12-02 08:19:43.475128005 +0000 UTC m=+0.113611953 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:19:43 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:19:43 np0005541913.localdomain podman[73812]: 2025-12-02 08:19:43.495280263 +0000 UTC m=+0.133546255 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:19:43 np0005541913.localdomain podman[73812]: 2025-12-02 08:19:43.518829685 +0000 UTC m=+0.157095667 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1761123044, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 02 08:19:43 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:19:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:19:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:19:51 np0005541913.localdomain systemd[1]: tmp-crun.IGoZUe.mount: Deactivated successfully.
Dec 02 08:19:51 np0005541913.localdomain podman[73859]: 2025-12-02 08:19:51.50407197 +0000 UTC m=+0.102102894 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, release=1761123044, 
architecture=x86_64, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4)
Dec 02 08:19:51 np0005541913.localdomain podman[73858]: 2025-12-02 08:19:51.464806745 +0000 UTC m=+0.104379128 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Dec 02 08:19:51 np0005541913.localdomain podman[73859]: 2025-12-02 08:19:51.535459289 +0000 UTC m=+0.133490203 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, architecture=x86_64, container_name=iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:19:51 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:19:51 np0005541913.localdomain podman[73858]: 2025-12-02 08:19:51.545399474 +0000 UTC m=+0.184971847 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public)
Dec 02 08:19:51 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:20:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:20:04 np0005541913.localdomain podman[73897]: 2025-12-02 08:20:04.434501627 +0000 UTC m=+0.079043787 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git)
Dec 02 08:20:04 np0005541913.localdomain podman[73897]: 2025-12-02 08:20:04.637534103 +0000 UTC m=+0.282076263 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, 
container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:20:04 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:20:10 np0005541913.localdomain podman[73925]: 2025-12-02 08:20:10.443829754 +0000 UTC m=+0.084792597 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, container_name=logrotate_crond, vcs-type=git, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4)
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: tmp-crun.NbERkn.mount: Deactivated successfully.
Dec 02 08:20:10 np0005541913.localdomain podman[73926]: 2025-12-02 08:20:10.453244514 +0000 UTC m=+0.086977377 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, config_id=tripleo_step4)
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: tmp-crun.ax10b0.mount: Deactivated successfully.
Dec 02 08:20:10 np0005541913.localdomain podman[73928]: 2025-12-02 08:20:10.508505943 +0000 UTC m=+0.139129950 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, 
com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:20:10 np0005541913.localdomain podman[73925]: 2025-12-02 08:20:10.534950265 +0000 UTC m=+0.175913058 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:20:10 np0005541913.localdomain podman[73927]: 2025-12-02 08:20:10.490792523 +0000 UTC m=+0.124657480 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, release=1761123044, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:20:10 np0005541913.localdomain podman[73928]: 2025-12-02 08:20:10.556036408 +0000 UTC m=+0.186660445 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z)
Dec 02 08:20:10 np0005541913.localdomain podman[73927]: 2025-12-02 08:20:10.573880721 +0000 UTC m=+0.207745638 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:20:10 np0005541913.localdomain podman[73926]: 2025-12-02 08:20:10.807744671 +0000 UTC m=+0.441477584 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, 
description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:20:10 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:20:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:20:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:20:14 np0005541913.localdomain podman[74015]: 2025-12-02 08:20:14.441857041 +0000 UTC m=+0.082918324 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Dec 02 08:20:14 np0005541913.localdomain podman[74016]: 2025-12-02 08:20:14.489420977 +0000 UTC m=+0.127448716 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:20:14 np0005541913.localdomain podman[74015]: 2025-12-02 08:20:14.495505635 +0000 UTC m=+0.136566958 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:20:14 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:20:14 np0005541913.localdomain podman[74016]: 2025-12-02 08:20:14.533279511 +0000 UTC m=+0.171307320 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, 
architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:20:14 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:20:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:20:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:20:22 np0005541913.localdomain podman[74064]: 2025-12-02 08:20:22.420753323 +0000 UTC m=+0.062394227 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:20:22 np0005541913.localdomain systemd[1]: tmp-crun.zWsRgX.mount: Deactivated successfully.
Dec 02 08:20:22 np0005541913.localdomain podman[74063]: 2025-12-02 08:20:22.491848841 +0000 UTC m=+0.131898100 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:20:22 np0005541913.localdomain podman[74064]: 2025-12-02 08:20:22.510694732 +0000 UTC m=+0.152335626 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true)
Dec 02 08:20:22 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:20:22 np0005541913.localdomain podman[74063]: 2025-12-02 08:20:22.531147658 +0000 UTC m=+0.171196907 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1)
Dec 02 08:20:22 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:20:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:20:35 np0005541913.localdomain podman[74101]: 2025-12-02 08:20:35.437880089 +0000 UTC m=+0.076836696 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 02 08:20:35 np0005541913.localdomain sudo[74130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:20:35 np0005541913.localdomain sudo[74130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:20:35 np0005541913.localdomain sudo[74130]: pam_unix(sudo:session): session closed for user root
Dec 02 08:20:35 np0005541913.localdomain podman[74101]: 2025-12-02 08:20:35.665146945 +0000 UTC m=+0.304103542 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com)
Dec 02 08:20:35 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:20:35 np0005541913.localdomain sudo[74145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:20:35 np0005541913.localdomain sudo[74145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:20:36 np0005541913.localdomain sudo[74145]: pam_unix(sudo:session): session closed for user root
Dec 02 08:20:37 np0005541913.localdomain sudo[74192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:20:37 np0005541913.localdomain sudo[74192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:20:37 np0005541913.localdomain sudo[74192]: pam_unix(sudo:session): session closed for user root
Dec 02 08:20:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:20:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:20:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:20:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:20:41 np0005541913.localdomain podman[74208]: 2025-12-02 08:20:41.458019265 +0000 UTC m=+0.092264233 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_migration_target)
Dec 02 08:20:41 np0005541913.localdomain podman[74207]: 2025-12-02 08:20:41.499845182 +0000 UTC m=+0.135672143 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:20:41 np0005541913.localdomain podman[74207]: 2025-12-02 08:20:41.511288509 +0000 UTC m=+0.147115550 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, 
com.redhat.component=openstack-cron-container, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12)
Dec 02 08:20:41 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:20:41 np0005541913.localdomain podman[74209]: 2025-12-02 08:20:41.552931801 +0000 UTC m=+0.185210505 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:20:41 np0005541913.localdomain podman[74210]: 2025-12-02 08:20:41.620584042 +0000 UTC m=+0.248116844 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO 
Team, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:20:41 np0005541913.localdomain podman[74209]: 2025-12-02 08:20:41.633468639 +0000 UTC m=+0.265747353 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:20:41 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:20:41 np0005541913.localdomain podman[74210]: 2025-12-02 08:20:41.647512127 +0000 UTC m=+0.275044949 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:20:41 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:20:41 np0005541913.localdomain podman[74208]: 2025-12-02 08:20:41.794857583 +0000 UTC m=+0.429102521 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12)
Dec 02 08:20:41 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:20:42 np0005541913.localdomain systemd[1]: tmp-crun.CwrRNC.mount: Deactivated successfully.
Dec 02 08:20:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:20:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:20:45 np0005541913.localdomain podman[74295]: 2025-12-02 08:20:45.426488096 +0000 UTC m=+0.068091104 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:20:45 np0005541913.localdomain systemd[1]: tmp-crun.ofKjDb.mount: Deactivated successfully.
Dec 02 08:20:45 np0005541913.localdomain podman[74294]: 2025-12-02 08:20:45.453301357 +0000 UTC m=+0.092096888 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:20:45 np0005541913.localdomain podman[74295]: 2025-12-02 08:20:45.506965851 +0000 UTC m=+0.148568859 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, container_name=ovn_controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, 
com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Dec 02 08:20:45 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:20:45 np0005541913.localdomain podman[74294]: 2025-12-02 08:20:45.563203257 +0000 UTC m=+0.201998778 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 02 08:20:45 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:20:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:20:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:20:53 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:20:53 np0005541913.localdomain recover_tripleo_nova_virtqemud[74345]: 62312
Dec 02 08:20:53 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:20:53 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:20:53 np0005541913.localdomain podman[74343]: 2025-12-02 08:20:53.423633491 +0000 UTC m=+0.061596484 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:20:53 np0005541913.localdomain podman[74342]: 2025-12-02 08:20:53.436056625 +0000 UTC m=+0.076805105 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, container_name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:20:53 np0005541913.localdomain podman[74343]: 2025-12-02 08:20:53.439894651 +0000 UTC m=+0.077857664 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid)
Dec 02 08:20:53 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:20:53 np0005541913.localdomain podman[74342]: 2025-12-02 08:20:53.494684387 +0000 UTC m=+0.135432937 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:20:53 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:21:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:21:06 np0005541913.localdomain podman[74382]: 2025-12-02 08:21:06.446582318 +0000 UTC m=+0.085237349 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 02 08:21:06 np0005541913.localdomain podman[74382]: 2025-12-02 08:21:06.634076014 +0000 UTC m=+0.272731005 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:21:06 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: tmp-crun.7V6GiO.mount: Deactivated successfully.
Dec 02 08:21:12 np0005541913.localdomain podman[74410]: 2025-12-02 08:21:12.448304525 +0000 UTC m=+0.086270348 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, 
config_id=tripleo_step4, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Dec 02 08:21:12 np0005541913.localdomain podman[74411]: 2025-12-02 08:21:12.48935801 +0000 UTC m=+0.123144678 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git)
Dec 02 08:21:12 np0005541913.localdomain podman[74417]: 2025-12-02 08:21:12.495184251 +0000 UTC m=+0.120867425 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, 
com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:21:12 np0005541913.localdomain podman[74412]: 2025-12-02 08:21:12.424763103 +0000 UTC m=+0.058924601 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Dec 02 08:21:12 np0005541913.localdomain podman[74417]: 2025-12-02 08:21:12.538068348 +0000 UTC m=+0.163751522 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:21:12 np0005541913.localdomain podman[74412]: 2025-12-02 08:21:12.559972063 +0000 UTC m=+0.194133581 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, container_name=ceilometer_agent_compute, version=17.1.12)
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:21:12 np0005541913.localdomain podman[74410]: 2025-12-02 08:21:12.582838586 +0000 UTC m=+0.220804409 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, distribution-scope=public, name=rhosp17/openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:21:12 np0005541913.localdomain podman[74411]: 2025-12-02 08:21:12.828835771 +0000 UTC m=+0.462622419 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true)
Dec 02 08:21:12 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:21:16 np0005541913.localdomain sudo[74549]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzjlypimiidcfytbrgvlpbbgitrvgbzu ; /usr/bin/python3
Dec 02 08:21:16 np0005541913.localdomain sudo[74549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:21:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:21:16 np0005541913.localdomain podman[74553]: 2025-12-02 08:21:16.137794738 +0000 UTC m=+0.082753061 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 08:21:16 np0005541913.localdomain podman[74551]: 2025-12-02 08:21:16.176836847 +0000 UTC m=+0.121889282 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 02 08:21:16 np0005541913.localdomain python3[74552]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:21:16 np0005541913.localdomain podman[74553]: 2025-12-02 08:21:16.210487979 +0000 UTC m=+0.155446262 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, release=1761123044, vcs-type=git, 
url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public)
Dec 02 08:21:16 np0005541913.localdomain sudo[74549]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:16 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:21:16 np0005541913.localdomain podman[74551]: 2025-12-02 08:21:16.229952396 +0000 UTC m=+0.175004851 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:21:16 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:21:16 np0005541913.localdomain sudo[74639]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apghykgtcngegorkbwwzdwravemxffha ; /usr/bin/python3
Dec 02 08:21:16 np0005541913.localdomain sudo[74639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:16 np0005541913.localdomain python3[74641]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663675.8658788-113047-29496742702015/source _original_basename=tmpe817b0h8 follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:16 np0005541913.localdomain sudo[74639]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:17 np0005541913.localdomain sudo[74669]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyteybqeyolqhpwdjcqgqmxietosjumn ; /usr/bin/python3
Dec 02 08:21:17 np0005541913.localdomain sudo[74669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:17 np0005541913.localdomain python3[74671]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:21:17 np0005541913.localdomain sudo[74669]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:18 np0005541913.localdomain sudo[74719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oshimgpbmwsxbgtzkibwjlnuadofeggl ; /usr/bin/python3
Dec 02 08:21:18 np0005541913.localdomain sudo[74719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:18 np0005541913.localdomain sudo[74719]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:18 np0005541913.localdomain sudo[74737]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqramkvmzoosljedmcnyzpkcbwvxgsdx ; /usr/bin/python3
Dec 02 08:21:18 np0005541913.localdomain sudo[74737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:18 np0005541913.localdomain sudo[74737]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:19 np0005541913.localdomain sudo[74841]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-giosunwbqisgcpwdbqaqabhuergvqpxg ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663678.78743-113240-203508478189784/async_wrapper.py 943560368233 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663678.78743-113240-203508478189784/AnsiballZ_command.py _
Dec 02 08:21:19 np0005541913.localdomain sudo[74841]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 08:21:19 np0005541913.localdomain ansible-async_wrapper.py[74843]: Invoked with 943560368233 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663678.78743-113240-203508478189784/AnsiballZ_command.py _
Dec 02 08:21:19 np0005541913.localdomain ansible-async_wrapper.py[74846]: Starting module and watcher
Dec 02 08:21:19 np0005541913.localdomain ansible-async_wrapper.py[74846]: Start watching 74847 (3600)
Dec 02 08:21:19 np0005541913.localdomain ansible-async_wrapper.py[74847]: Start module (74847)
Dec 02 08:21:19 np0005541913.localdomain ansible-async_wrapper.py[74843]: Return async_wrapper task started.
Dec 02 08:21:19 np0005541913.localdomain sudo[74841]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:19 np0005541913.localdomain sudo[74865]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vduiisksjoleoshefmzzhaqerpmcfuzq ; /usr/bin/python3
Dec 02 08:21:19 np0005541913.localdomain sudo[74865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:19 np0005541913.localdomain python3[74867]: ansible-ansible.legacy.async_status Invoked with jid=943560368233.74843 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:21:19 np0005541913.localdomain sudo[74865]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (file: /etc/puppet/hiera.yaml)
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: Undefined variable '::deploy_config_name';
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (file & line not available)
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (file & line not available)
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Notice: Compiled catalog for np0005541913.localdomain in environment production in 0.22 seconds
Dec 02 08:21:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:21:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Notice: Applied catalog in 0.34 seconds
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Application:
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    Initial environment: production
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    Converged environment: production
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:          Run mode: user
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Changes:
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Events:
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Resources:
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:             Total: 19
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Time:
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:          Schedule: 0.00
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:           Package: 0.00
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:              Exec: 0.01
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:            Augeas: 0.01
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:              File: 0.02
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:           Service: 0.10
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    Config retrieval: 0.28
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    Transaction evaluation: 0.34
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:    Catalog application: 0.34
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:          Last run: 1764663683
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:        Filebucket: 0.00
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:             Total: 0.35
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]: Version:
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:            Config: 1764663683
Dec 02 08:21:23 np0005541913.localdomain puppet-user[74852]:            Puppet: 7.10.0
Dec 02 08:21:23 np0005541913.localdomain systemd[1]: tmp-crun.kJS4s8.mount: Deactivated successfully.
Dec 02 08:21:23 np0005541913.localdomain podman[74988]: 2025-12-02 08:21:23.695757284 +0000 UTC m=+0.084991212 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:21:23 np0005541913.localdomain podman[74988]: 2025-12-02 08:21:23.709557946 +0000 UTC m=+0.098791934 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1761123044)
Dec 02 08:21:23 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:21:23 np0005541913.localdomain ansible-async_wrapper.py[74847]: Module complete (74847)
Dec 02 08:21:23 np0005541913.localdomain podman[74987]: 2025-12-02 08:21:23.795791911 +0000 UTC m=+0.188235338 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, container_name=collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Dec 02 08:21:23 np0005541913.localdomain podman[74987]: 2025-12-02 08:21:23.832009943 +0000 UTC m=+0.224453310 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:21:23 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:21:24 np0005541913.localdomain ansible-async_wrapper.py[74846]: Done in kid B.
Dec 02 08:21:29 np0005541913.localdomain sudo[75040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxnfgavmziveqtygfqrnbqzdsbvpoibo ; /usr/bin/python3
Dec 02 08:21:29 np0005541913.localdomain sudo[75040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:30 np0005541913.localdomain python3[75042]: ansible-ansible.legacy.async_status Invoked with jid=943560368233.74843 mode=status _async_dir=/tmp/.ansible_async
Dec 02 08:21:30 np0005541913.localdomain sudo[75040]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:30 np0005541913.localdomain sudo[75056]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzryqjaipjulqypgyskyvkagpygedxst ; /usr/bin/python3
Dec 02 08:21:30 np0005541913.localdomain sudo[75056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:30 np0005541913.localdomain python3[75058]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 08:21:30 np0005541913.localdomain sudo[75056]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:31 np0005541913.localdomain sudo[75072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwfncsaessoguuvchbygvecbmjxkxeoj ; /usr/bin/python3
Dec 02 08:21:31 np0005541913.localdomain sudo[75072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:31 np0005541913.localdomain python3[75074]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:21:31 np0005541913.localdomain sudo[75072]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:31 np0005541913.localdomain sudo[75122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwktqtsahiexsesnxclxefweltvzajnc ; /usr/bin/python3
Dec 02 08:21:31 np0005541913.localdomain sudo[75122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:31 np0005541913.localdomain python3[75124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:21:31 np0005541913.localdomain sudo[75122]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:31 np0005541913.localdomain sudo[75140]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chngynhmfitudtdorninefrmsmmaxwov ; /usr/bin/python3
Dec 02 08:21:31 np0005541913.localdomain sudo[75140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:32 np0005541913.localdomain python3[75142]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpas6n_xmd recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 08:21:32 np0005541913.localdomain sudo[75140]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:32 np0005541913.localdomain sudo[75170]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhwhsflwmnkdkfjgwaoyzdlqcscvsmuj ; /usr/bin/python3
Dec 02 08:21:32 np0005541913.localdomain sudo[75170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:32 np0005541913.localdomain python3[75172]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:32 np0005541913.localdomain sudo[75170]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:32 np0005541913.localdomain sudo[75186]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebwtincntdxzbvhdrjhaiqozbrzmlvrs ; /usr/bin/python3
Dec 02 08:21:32 np0005541913.localdomain sudo[75186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:33 np0005541913.localdomain sudo[75186]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:33 np0005541913.localdomain sudo[75275]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsnbhghuuclavysmbcaqymjwxzpudbsi ; /usr/bin/python3
Dec 02 08:21:33 np0005541913.localdomain sudo[75275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:33 np0005541913.localdomain python3[75277]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 02 08:21:33 np0005541913.localdomain sudo[75275]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:34 np0005541913.localdomain sudo[75294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zywjschwyghulcbjgixgzczfqcuoardw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:34 np0005541913.localdomain sudo[75294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:34 np0005541913.localdomain python3[75296]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:34 np0005541913.localdomain sudo[75294]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:34 np0005541913.localdomain sudo[75310]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhxzcskizpgrqmhjqpznuuxoygyjgbck ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:34 np0005541913.localdomain sudo[75310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:34 np0005541913.localdomain sudo[75310]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:35 np0005541913.localdomain sudo[75326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxfrkgzbtgxmodynmcqbpqntpefqlhmo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:35 np0005541913.localdomain sudo[75326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:35 np0005541913.localdomain python3[75328]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:21:35 np0005541913.localdomain sudo[75326]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:35 np0005541913.localdomain sudo[75376]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rilkbrjvflkbdtjtlyuvkprkolgfniwc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:35 np0005541913.localdomain sudo[75376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:35 np0005541913.localdomain python3[75378]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:21:36 np0005541913.localdomain sudo[75376]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:36 np0005541913.localdomain sudo[75394]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddohwojbhftmhxxnucqjhfoafhmnoajz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:36 np0005541913.localdomain sudo[75394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:36 np0005541913.localdomain python3[75396]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:36 np0005541913.localdomain sudo[75394]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:36 np0005541913.localdomain sudo[75456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yineyvqelzrxcerbomkwoxbctbhtkqff ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:36 np0005541913.localdomain sudo[75456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:36 np0005541913.localdomain python3[75458]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:21:36 np0005541913.localdomain sudo[75456]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:36 np0005541913.localdomain sudo[75474]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qydpbutcnlkoxxbpkxxrbmbrouncfhsy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:36 np0005541913.localdomain sudo[75474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:21:36 np0005541913.localdomain podman[75476]: 2025-12-02 08:21:36.955724328 +0000 UTC m=+0.077823614 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, tcib_managed=true, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 02 08:21:37 np0005541913.localdomain python3[75477]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:37 np0005541913.localdomain sudo[75474]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:37 np0005541913.localdomain podman[75476]: 2025-12-02 08:21:37.219051912 +0000 UTC m=+0.341151188 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com)
Dec 02 08:21:37 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:21:37 np0005541913.localdomain sudo[75534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:21:37 np0005541913.localdomain sudo[75534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:21:37 np0005541913.localdomain sudo[75534]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:37 np0005541913.localdomain sudo[75568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:21:37 np0005541913.localdomain sudo[75595]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epyomsgminnphfbdkaapzfrziziwkbho ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:37 np0005541913.localdomain sudo[75568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:21:37 np0005541913.localdomain sudo[75595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:37 np0005541913.localdomain python3[75598]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:21:37 np0005541913.localdomain sudo[75595]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:37 np0005541913.localdomain sudo[75614]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjspbqkyvgihbniyyukvrpoqusvgoqgf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:37 np0005541913.localdomain sudo[75614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:37 np0005541913.localdomain python3[75623]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:37 np0005541913.localdomain sudo[75614]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:38 np0005541913.localdomain sudo[75568]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:38 np0005541913.localdomain sudo[75709]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxapzacvcttpywrikpsabofjzyjqljbz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:38 np0005541913.localdomain sudo[75709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:38 np0005541913.localdomain python3[75711]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:21:38 np0005541913.localdomain sudo[75709]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:38 np0005541913.localdomain sudo[75727]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eekdcozslvvwdbvkfsgxfwtdpnzutwwn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:38 np0005541913.localdomain sudo[75727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:38 np0005541913.localdomain sudo[75730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:21:38 np0005541913.localdomain sudo[75730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:21:38 np0005541913.localdomain sudo[75730]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:38 np0005541913.localdomain python3[75729]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:38 np0005541913.localdomain sudo[75727]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:38 np0005541913.localdomain sudo[75772]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktlobvtwakzjvunivlltiimqanwijbqz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:38 np0005541913.localdomain sudo[75772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:39 np0005541913.localdomain python3[75774]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:21:39 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:21:39 np0005541913.localdomain systemd-sysv-generator[75802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:21:39 np0005541913.localdomain systemd-rc-local-generator[75799]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:21:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:21:39 np0005541913.localdomain sudo[75772]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:39 np0005541913.localdomain sudo[75858]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmweeoyfvjmedtigmnsuubpxpvojsixd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:39 np0005541913.localdomain sudo[75858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:40 np0005541913.localdomain python3[75860]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:21:40 np0005541913.localdomain sudo[75858]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:40 np0005541913.localdomain sudo[75876]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klmczfauybnpkffqaexeiqdmlmkczrlb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:40 np0005541913.localdomain sudo[75876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:40 np0005541913.localdomain python3[75878]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:40 np0005541913.localdomain sudo[75876]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:40 np0005541913.localdomain sudo[75938]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdwxsiqwakyavxkjtctbzxxlxcbfnajg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:40 np0005541913.localdomain sudo[75938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:40 np0005541913.localdomain python3[75940]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 08:21:40 np0005541913.localdomain sudo[75938]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:41 np0005541913.localdomain sudo[75956]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycpwzmnqvyakkxacgeraavwxfhxuyarp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:41 np0005541913.localdomain sudo[75956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:41 np0005541913.localdomain python3[75958]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:21:41 np0005541913.localdomain sudo[75956]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:41 np0005541913.localdomain sudo[75986]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vphnlynltkhxqpnijsvygaauhfmbopki ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:41 np0005541913.localdomain sudo[75986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:41 np0005541913.localdomain python3[75988]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:21:41 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:21:41 np0005541913.localdomain systemd-rc-local-generator[76011]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:21:41 np0005541913.localdomain systemd-sysv-generator[76015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:21:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 08:21:42 np0005541913.localdomain sudo[75986]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:42 np0005541913.localdomain sudo[76043]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcvnjfonphhlyypmvfalllkdmwhcrlna ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:42 np0005541913.localdomain sudo[76043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:42 np0005541913.localdomain python3[76045]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 02 08:21:42 np0005541913.localdomain sudo[76043]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:42 np0005541913.localdomain sudo[76059]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsfosazuyswbcrbfgtnwcjnqjnixxfel ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:42 np0005541913.localdomain sudo[76059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:21:42 np0005541913.localdomain systemd[1]: tmp-crun.vjBZPD.mount: Deactivated successfully.
Dec 02 08:21:43 np0005541913.localdomain podman[76063]: 2025-12-02 08:21:43.030975348 +0000 UTC m=+0.126374017 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, version=17.1.12, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4)
Dec 02 08:21:43 np0005541913.localdomain podman[76070]: 2025-12-02 08:21:43.043650118 +0000 UTC m=+0.129602635 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:21:43 np0005541913.localdomain podman[76064]: 2025-12-02 08:21:43.083769587 +0000 UTC m=+0.171810022 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1)
Dec 02 08:21:43 np0005541913.localdomain podman[76070]: 2025-12-02 08:21:43.092854869 +0000 UTC m=+0.178807376 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true)
Dec 02 08:21:43 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:21:43 np0005541913.localdomain podman[76064]: 2025-12-02 08:21:43.107901775 +0000 UTC m=+0.195942210 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 02 08:21:43 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:21:43 np0005541913.localdomain systemd[1]: tmp-crun.ARtC1X.mount: Deactivated successfully.
Dec 02 08:21:43 np0005541913.localdomain podman[76062]: 2025-12-02 08:21:43.006687586 +0000 UTC m=+0.104474160 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, 
vcs-type=git, io.buildah.version=1.41.4, container_name=logrotate_crond, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible)
Dec 02 08:21:43 np0005541913.localdomain podman[76062]: 2025-12-02 08:21:43.188954588 +0000 UTC m=+0.286741172 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:21:43 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:21:43 np0005541913.localdomain podman[76063]: 2025-12-02 08:21:43.367039843 +0000 UTC m=+0.462438502 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:21:43 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:21:43 np0005541913.localdomain sudo[76059]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:44 np0005541913.localdomain sudo[76193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdumpeeregqosserameayqflgpcgrqne ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:21:44 np0005541913.localdomain sudo[76193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:21:44 np0005541913.localdomain python3[76195]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 02 08:21:44 np0005541913.localdomain podman[76233]: 2025-12-02 08:21:44.951585347 +0000 UTC m=+0.065789601 container create 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:21:44 np0005541913.localdomain systemd[1]: Started libpod-conmon-1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.scope.
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:21:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0399317fe788e77a051163f65a715baa05b56d1254753267f43144269e89c7fb/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:21:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0399317fe788e77a051163f65a715baa05b56d1254753267f43144269e89c7fb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:21:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0399317fe788e77a051163f65a715baa05b56d1254753267f43144269e89c7fb/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 08:21:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0399317fe788e77a051163f65a715baa05b56d1254753267f43144269e89c7fb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 08:21:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0399317fe788e77a051163f65a715baa05b56d1254753267f43144269e89c7fb/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:21:45 np0005541913.localdomain podman[76233]: 2025-12-02 08:21:44.917271318 +0000 UTC m=+0.031475582 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:21:45 np0005541913.localdomain podman[76233]: 2025-12-02 08:21:45.045574207 +0000 UTC m=+0.159778491 container init 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_compute, name=rhosp17/openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: tmp-crun.zwBKVi.mount: Deactivated successfully.
Dec 02 08:21:45 np0005541913.localdomain sudo[76254]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:21:45 np0005541913.localdomain podman[76233]: 2025-12-02 08:21:45.086401456 +0000 UTC m=+0.200605710 container start 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute)
Dec 02 08:21:45 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:21:45 np0005541913.localdomain python3[76195]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:21:45 np0005541913.localdomain podman[76255]: 2025-12-02 08:21:45.184083539 +0000 UTC m=+0.087587634 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true)
Dec 02 08:21:45 np0005541913.localdomain podman[76255]: 2025-12-02 08:21:45.244945772 +0000 UTC m=+0.148449907 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:21:45 np0005541913.localdomain podman[76255]: unhealthy
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Queued start job for default target Main User Target.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Created slice User Application Slice.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Reached target Paths.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Reached target Timers.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Starting D-Bus User Message Bus Socket...
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Starting Create User's Volatile Files and Directories...
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Listening on D-Bus User Message Bus Socket.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Reached target Sockets.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Finished Create User's Volatile Files and Directories.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Reached target Basic System.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Reached target Main User Target.
Dec 02 08:21:45 np0005541913.localdomain systemd[76273]: Startup finished in 137ms.
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Started User Manager for UID 0.
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Started Session c10 of User root.
Dec 02 08:21:45 np0005541913.localdomain sudo[76254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Dec 02 08:21:45 np0005541913.localdomain sudo[76254]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Dec 02 08:21:45 np0005541913.localdomain podman[76355]: 2025-12-02 08:21:45.569301635 +0000 UTC m=+0.091350478 container create bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:21:45 np0005541913.localdomain podman[76355]: 2025-12-02 08:21:45.50621039 +0000 UTC m=+0.028259283 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Started libpod-conmon-bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d.scope.
Dec 02 08:21:45 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:21:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c52eb2917af814f67bf9757f04611b4867e02cd94735e31ef932542a90a8de8/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 02 08:21:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c52eb2917af814f67bf9757f04611b4867e02cd94735e31ef932542a90a8de8/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 08:21:45 np0005541913.localdomain podman[76355]: 2025-12-02 08:21:45.648568278 +0000 UTC m=+0.170617121 container init bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, release=1761123044, version=17.1.12, name=rhosp17/openstack-nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, tcib_managed=true, config_id=tripleo_step5, io.openshift.expose-services=)
Dec 02 08:21:45 np0005541913.localdomain podman[76355]: 2025-12-02 08:21:45.65586025 +0000 UTC m=+0.177909123 container start bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:21:45 np0005541913.localdomain podman[76355]: 2025-12-02 08:21:45.656181429 +0000 UTC m=+0.178230312 container attach bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, container_name=nova_wait_for_compute_service, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true)
Dec 02 08:21:45 np0005541913.localdomain sudo[76375]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 08:21:45 np0005541913.localdomain sudo[76375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Dec 02 08:21:45 np0005541913.localdomain sudo[76375]: pam_unix(sudo:session): session closed for user root
Dec 02 08:21:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:21:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:21:46 np0005541913.localdomain systemd[1]: tmp-crun.YZYJQz.mount: Deactivated successfully.
Dec 02 08:21:46 np0005541913.localdomain podman[76380]: 2025-12-02 08:21:46.467381918 +0000 UTC m=+0.101396455 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, build-date=2025-11-18T23:34:05Z, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:21:46 np0005541913.localdomain podman[76380]: 2025-12-02 08:21:46.523071249 +0000 UTC m=+0.157085776 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12)
Dec 02 08:21:46 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:21:46 np0005541913.localdomain podman[76379]: 2025-12-02 08:21:46.526176505 +0000 UTC m=+0.160179192 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Dec 02 08:21:46 np0005541913.localdomain podman[76379]: 2025-12-02 08:21:46.607974928 +0000 UTC m=+0.241977685 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:21:46 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:21:46 np0005541913.localdomain systemd[1]: tmp-crun.U1n95X.mount: Deactivated successfully.
Dec 02 08:21:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:21:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:21:54 np0005541913.localdomain systemd[1]: tmp-crun.gzaz3Z.mount: Deactivated successfully.
Dec 02 08:21:54 np0005541913.localdomain podman[76425]: 2025-12-02 08:21:54.453941382 +0000 UTC m=+0.094289590 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:21:54 np0005541913.localdomain podman[76425]: 2025-12-02 08:21:54.48709807 +0000 UTC m=+0.127446198 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-type=git, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step3, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:21:54 np0005541913.localdomain podman[76426]: 2025-12-02 08:21:54.494278379 +0000 UTC m=+0.133452234 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Dec 02 08:21:54 np0005541913.localdomain podman[76426]: 2025-12-02 08:21:54.526066448 +0000 UTC m=+0.165240303 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, container_name=iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3)
Dec 02 08:21:54 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:21:54 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:21:55 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Activating special unit Exit the Session...
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Stopped target Main User Target.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Stopped target Basic System.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Stopped target Paths.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Stopped target Sockets.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Stopped target Timers.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Closed D-Bus User Message Bus Socket.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Stopped Create User's Volatile Files and Directories.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Removed slice User Application Slice.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Reached target Shutdown.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Finished Exit the Session.
Dec 02 08:21:55 np0005541913.localdomain systemd[76273]: Reached target Exit the Session.
Dec 02 08:21:55 np0005541913.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 02 08:21:55 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 02 08:21:55 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 02 08:21:55 np0005541913.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 02 08:21:55 np0005541913.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 02 08:21:55 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 02 08:21:55 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 02 08:22:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:22:08 np0005541913.localdomain podman[76468]: 2025-12-02 08:22:08.022835471 +0000 UTC m=+0.059718584 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Dec 02 08:22:08 np0005541913.localdomain podman[76468]: 2025-12-02 08:22:08.187853356 +0000 UTC m=+0.224736579 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:46Z)
Dec 02 08:22:08 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:22:12 np0005541913.localdomain sshd[35537]: Received disconnect from 192.168.122.100 port 45012:11: disconnected by user
Dec 02 08:22:12 np0005541913.localdomain sshd[35537]: Disconnected from user zuul 192.168.122.100 port 45012
Dec 02 08:22:12 np0005541913.localdomain sshd[35534]: pam_unix(sshd:session): session closed for user zuul
Dec 02 08:22:12 np0005541913.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Dec 02 08:22:12 np0005541913.localdomain systemd[1]: session-28.scope: Consumed 2.898s CPU time.
Dec 02 08:22:12 np0005541913.localdomain systemd-logind[757]: Session 28 logged out. Waiting for processes to exit.
Dec 02 08:22:12 np0005541913.localdomain systemd-logind[757]: Removed session 28.
Dec 02 08:22:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:22:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:22:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:22:13 np0005541913.localdomain podman[76496]: 2025-12-02 08:22:13.454159648 +0000 UTC m=+0.090360221 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:22:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:22:13 np0005541913.localdomain podman[76496]: 2025-12-02 08:22:13.514133377 +0000 UTC m=+0.150333950 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 02 08:22:13 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:22:13 np0005541913.localdomain podman[76495]: 2025-12-02 08:22:13.520565244 +0000 UTC m=+0.159287617 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:22:13 np0005541913.localdomain podman[76495]: 2025-12-02 08:22:13.603031426 +0000 UTC m=+0.241753809 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, 
managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond)
Dec 02 08:22:13 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:22:13 np0005541913.localdomain podman[76497]: 2025-12-02 08:22:13.614991136 +0000 UTC m=+0.243668081 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, 
com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Dec 02 08:22:13 np0005541913.localdomain podman[76497]: 2025-12-02 08:22:13.666136531 +0000 UTC m=+0.294813486 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Dec 02 08:22:13 np0005541913.localdomain podman[76537]: 2025-12-02 08:22:13.666399678 +0000 UTC m=+0.188122535 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:22:13 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:22:14 np0005541913.localdomain podman[76537]: 2025-12-02 08:22:14.035146929 +0000 UTC m=+0.556869786 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 02 08:22:14 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:22:14 np0005541913.localdomain systemd[1]: tmp-crun.f6U4Me.mount: Deactivated successfully.
Dec 02 08:22:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:22:15 np0005541913.localdomain systemd[1]: tmp-crun.6lYgM2.mount: Deactivated successfully.
Dec 02 08:22:15 np0005541913.localdomain podman[76587]: 2025-12-02 08:22:15.437116872 +0000 UTC m=+0.081649100 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:22:15 np0005541913.localdomain podman[76587]: 2025-12-02 08:22:15.494704675 +0000 UTC m=+0.139236863 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_compute, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12)
Dec 02 08:22:15 np0005541913.localdomain podman[76587]: unhealthy
Dec 02 08:22:15 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:22:15 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 08:22:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:22:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:22:17 np0005541913.localdomain systemd[1]: tmp-crun.sO7Oi5.mount: Deactivated successfully.
Dec 02 08:22:17 np0005541913.localdomain podman[76610]: 2025-12-02 08:22:17.450272182 +0000 UTC m=+0.092701325 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, release=1761123044, architecture=x86_64)
Dec 02 08:22:17 np0005541913.localdomain podman[76611]: 2025-12-02 08:22:17.499492364 +0000 UTC m=+0.140825657 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Dec 02 08:22:17 np0005541913.localdomain podman[76610]: 2025-12-02 08:22:17.552879401 +0000 UTC m=+0.195308574 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 02 08:22:17 np0005541913.localdomain podman[76611]: 2025-12-02 08:22:17.569132 +0000 UTC m=+0.210465283 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12)
Dec 02 08:22:17 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:22:17 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:22:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:22:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:22:25 np0005541913.localdomain systemd[1]: tmp-crun.wmzfwv.mount: Deactivated successfully.
Dec 02 08:22:25 np0005541913.localdomain podman[76654]: 2025-12-02 08:22:25.444331684 +0000 UTC m=+0.088176241 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 02 08:22:25 np0005541913.localdomain podman[76654]: 2025-12-02 08:22:25.456025967 +0000 UTC m=+0.099870564 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true)
Dec 02 08:22:25 np0005541913.localdomain systemd[1]: tmp-crun.dYGM9c.mount: Deactivated successfully.
Dec 02 08:22:25 np0005541913.localdomain podman[76655]: 2025-12-02 08:22:25.494962454 +0000 UTC m=+0.131963631 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, container_name=iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git)
Dec 02 08:22:25 np0005541913.localdomain podman[76655]: 2025-12-02 08:22:25.505991989 +0000 UTC m=+0.142993196 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:22:25 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:22:25 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:22:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:22:38 np0005541913.localdomain podman[76693]: 2025-12-02 08:22:38.442105862 +0000 UTC m=+0.075050466 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:22:38 np0005541913.localdomain podman[76693]: 2025-12-02 08:22:38.743265214 +0000 UTC m=+0.376209898 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:22:38 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:22:38 np0005541913.localdomain sudo[76721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:22:38 np0005541913.localdomain sudo[76721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:22:38 np0005541913.localdomain sudo[76721]: pam_unix(sudo:session): session closed for user root
Dec 02 08:22:38 np0005541913.localdomain sudo[76736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:22:38 np0005541913.localdomain sudo[76736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:22:39 np0005541913.localdomain sudo[76736]: pam_unix(sudo:session): session closed for user root
Dec 02 08:22:40 np0005541913.localdomain sudo[76782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:22:40 np0005541913.localdomain sudo[76782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:22:40 np0005541913.localdomain sudo[76782]: pam_unix(sudo:session): session closed for user root
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: tmp-crun.48ndkU.mount: Deactivated successfully.
Dec 02 08:22:44 np0005541913.localdomain podman[76800]: 2025-12-02 08:22:44.423021363 +0000 UTC m=+0.062642124 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, tcib_managed=true)
Dec 02 08:22:44 np0005541913.localdomain podman[76797]: 2025-12-02 08:22:44.428474074 +0000 UTC m=+0.069335189 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:22:44 np0005541913.localdomain podman[76798]: 2025-12-02 08:22:44.492719811 +0000 UTC m=+0.130000157 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com)
Dec 02 08:22:44 np0005541913.localdomain podman[76797]: 2025-12-02 08:22:44.511963323 +0000 UTC m=+0.152824458 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z)
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:22:44 np0005541913.localdomain podman[76799]: 2025-12-02 08:22:44.475953957 +0000 UTC m=+0.116416311 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:22:44 np0005541913.localdomain podman[76800]: 2025-12-02 08:22:44.527882174 +0000 UTC m=+0.167502935 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true)
Dec 02 08:22:44 np0005541913.localdomain podman[76799]: 2025-12-02 08:22:44.560214858 +0000 UTC m=+0.200677212 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4)
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:22:44 np0005541913.localdomain podman[76798]: 2025-12-02 08:22:44.863013934 +0000 UTC m=+0.500294350 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
tcib_managed=true, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4)
Dec 02 08:22:44 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:22:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:22:46 np0005541913.localdomain podman[76895]: 2025-12-02 08:22:46.43569162 +0000 UTC m=+0.081391972 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:22:46 np0005541913.localdomain podman[76895]: 2025-12-02 08:22:46.496377419 +0000 UTC m=+0.142077811 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_id=tripleo_step5, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z)
Dec 02 08:22:46 np0005541913.localdomain podman[76895]: unhealthy
Dec 02 08:22:46 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:22:46 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 08:22:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:22:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:22:48 np0005541913.localdomain podman[76917]: 2025-12-02 08:22:48.421104133 +0000 UTC m=+0.062446239 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 08:22:48 np0005541913.localdomain podman[76918]: 2025-12-02 08:22:48.467715882 +0000 UTC m=+0.099187394 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public)
Dec 02 08:22:48 np0005541913.localdomain podman[76918]: 2025-12-02 08:22:48.513844228 +0000 UTC m=+0.145315690 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:22:48 np0005541913.localdomain podman[76917]: 2025-12-02 08:22:48.520208134 +0000 UTC m=+0.161550250 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:22:48 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:22:48 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:22:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:22:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:22:56 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:22:56 np0005541913.localdomain recover_tripleo_nova_virtqemud[76976]: 62312
Dec 02 08:22:56 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:22:56 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:22:56 np0005541913.localdomain systemd[1]: tmp-crun.dIyW6n.mount: Deactivated successfully.
Dec 02 08:22:56 np0005541913.localdomain podman[76962]: 2025-12-02 08:22:56.451955711 +0000 UTC m=+0.091928834 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd)
Dec 02 08:22:56 np0005541913.localdomain podman[76962]: 2025-12-02 08:22:56.460846016 +0000 UTC m=+0.100819179 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true)
Dec 02 08:22:56 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:22:56 np0005541913.localdomain podman[76963]: 2025-12-02 08:22:56.503645601 +0000 UTC m=+0.141374212 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, version=17.1.12, vcs-type=git)
Dec 02 08:22:56 np0005541913.localdomain podman[76963]: 2025-12-02 08:22:56.535131832 +0000 UTC m=+0.172860423 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:22:56 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:23:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:23:09 np0005541913.localdomain systemd[1]: tmp-crun.zIixDL.mount: Deactivated successfully.
Dec 02 08:23:09 np0005541913.localdomain podman[77001]: 2025-12-02 08:23:09.452287311 +0000 UTC m=+0.091463971 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:23:09 np0005541913.localdomain podman[77001]: 2025-12-02 08:23:09.645361982 +0000 UTC m=+0.284538723 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:23:09 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:23:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:23:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:23:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:23:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:23:15 np0005541913.localdomain podman[77030]: 2025-12-02 08:23:15.456548598 +0000 UTC m=+0.090031762 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:23:15 np0005541913.localdomain podman[77030]: 2025-12-02 08:23:15.493673706 +0000 UTC m=+0.127156780 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:23:15 np0005541913.localdomain podman[77031]: 2025-12-02 08:23:15.506769047 +0000 UTC m=+0.139394187 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Dec 02 08:23:15 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:23:15 np0005541913.localdomain podman[77033]: 2025-12-02 08:23:15.554806676 +0000 UTC m=+0.183774894 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:23:15 np0005541913.localdomain podman[77033]: 2025-12-02 08:23:15.589520777 +0000 UTC m=+0.218488995 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git)
Dec 02 08:23:15 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:23:15 np0005541913.localdomain podman[77032]: 2025-12-02 08:23:15.658840175 +0000 UTC m=+0.287945297 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, 
name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Dec 02 08:23:15 np0005541913.localdomain podman[77032]: 2025-12-02 08:23:15.710106202 +0000 UTC m=+0.339211314 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 02 08:23:15 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:23:15 np0005541913.localdomain podman[77031]: 2025-12-02 08:23:15.855085673 +0000 UTC m=+0.487710833 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, 
io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:23:15 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:23:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:23:17 np0005541913.localdomain podman[77125]: 2025-12-02 08:23:17.427336496 +0000 UTC m=+0.068896287 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step5)
Dec 02 08:23:17 np0005541913.localdomain podman[77125]: 2025-12-02 08:23:17.480397214 +0000 UTC m=+0.121957035 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=nova_compute, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5)
Dec 02 08:23:17 np0005541913.localdomain podman[77125]: unhealthy
Dec 02 08:23:17 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:23:17 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 08:23:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:23:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:23:19 np0005541913.localdomain podman[77147]: 2025-12-02 08:23:19.43883894 +0000 UTC m=+0.081588903 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, tcib_managed=true, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team)
Dec 02 08:23:19 np0005541913.localdomain systemd[1]: tmp-crun.fBo5nI.mount: Deactivated successfully.
Dec 02 08:23:19 np0005541913.localdomain podman[77148]: 2025-12-02 08:23:19.50224621 +0000 UTC m=+0.142024313 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:23:19 np0005541913.localdomain podman[77147]: 2025-12-02 08:23:19.50925685 +0000 UTC m=+0.152006773 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, 
build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Dec 02 08:23:19 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:23:19 np0005541913.localdomain podman[77148]: 2025-12-02 08:23:19.525672423 +0000 UTC m=+0.165450586 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:23:19 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:23:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:23:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:23:27 np0005541913.localdomain podman[77195]: 2025-12-02 08:23:27.432501957 +0000 UTC m=+0.077359448 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 02 08:23:27 np0005541913.localdomain systemd[1]: tmp-crun.CWbXkE.mount: Deactivated successfully.
Dec 02 08:23:27 np0005541913.localdomain podman[77196]: 2025-12-02 08:23:27.493229647 +0000 UTC m=+0.132280271 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3)
Dec 02 08:23:27 np0005541913.localdomain podman[77196]: 2025-12-02 08:23:27.501336325 +0000 UTC m=+0.140386919 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, distribution-scope=public, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:23:27 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:23:27 np0005541913.localdomain podman[77195]: 2025-12-02 08:23:27.522303531 +0000 UTC m=+0.167161022 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3)
Dec 02 08:23:27 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:23:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:23:40 np0005541913.localdomain sudo[77234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:23:40 np0005541913.localdomain sudo[77234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:23:40 np0005541913.localdomain sudo[77234]: pam_unix(sudo:session): session closed for user root
Dec 02 08:23:40 np0005541913.localdomain sudo[77257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:23:40 np0005541913.localdomain systemd[1]: tmp-crun.h2Jzrn.mount: Deactivated successfully.
Dec 02 08:23:40 np0005541913.localdomain sudo[77257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:23:40 np0005541913.localdomain podman[77248]: 2025-12-02 08:23:40.432112897 +0000 UTC m=+0.072186790 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:46Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, architecture=x86_64)
Dec 02 08:23:40 np0005541913.localdomain podman[77248]: 2025-12-02 08:23:40.597919982 +0000 UTC m=+0.237993885 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:23:40 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:23:40 np0005541913.localdomain sudo[77257]: pam_unix(sudo:session): session closed for user root
Dec 02 08:23:41 np0005541913.localdomain sudo[77324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:23:41 np0005541913.localdomain sudo[77324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:23:41 np0005541913.localdomain sudo[77324]: pam_unix(sudo:session): session closed for user root
Dec 02 08:23:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:23:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:23:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:23:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:23:46 np0005541913.localdomain podman[77341]: 2025-12-02 08:23:46.45626782 +0000 UTC m=+0.091846420 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:23:46 np0005541913.localdomain podman[77341]: 2025-12-02 08:23:46.50813488 +0000 UTC m=+0.143713500 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:23:46 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:23:46 np0005541913.localdomain podman[77340]: 2025-12-02 08:23:46.422235041 +0000 UTC m=+0.063926336 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Dec 02 08:23:46 np0005541913.localdomain podman[77342]: 2025-12-02 08:23:46.554257944 +0000 UTC m=+0.185282061 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=)
Dec 02 08:23:46 np0005541913.localdomain podman[77339]: 2025-12-02 08:23:46.514512082 +0000 UTC m=+0.153925995 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:23:46 np0005541913.localdomain podman[77342]: 2025-12-02 08:23:46.573973166 +0000 UTC m=+0.204997263 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, url=https://www.redhat.com, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Dec 02 08:23:46 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:23:46 np0005541913.localdomain podman[77339]: 2025-12-02 08:23:46.598996342 +0000 UTC m=+0.238410225 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron)
Dec 02 08:23:46 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:23:46 np0005541913.localdomain podman[77340]: 2025-12-02 08:23:46.773597994 +0000 UTC m=+0.415289259 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:23:46 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:23:47 np0005541913.localdomain systemd[1]: tmp-crun.ApqL8s.mount: Deactivated successfully.
Dec 02 08:23:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:23:48 np0005541913.localdomain podman[77431]: 2025-12-02 08:23:48.443688018 +0000 UTC m=+0.082907179 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:23:48 np0005541913.localdomain podman[77431]: 2025-12-02 08:23:48.514056137 +0000 UTC m=+0.153275248 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:23:48 np0005541913.localdomain podman[77431]: unhealthy
Dec 02 08:23:48 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:23:48 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 08:23:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:23:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:23:50 np0005541913.localdomain podman[77451]: 2025-12-02 08:23:50.439396018 +0000 UTC m=+0.074198943 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Dec 02 08:23:50 np0005541913.localdomain podman[77452]: 2025-12-02 08:23:50.503005015 +0000 UTC m=+0.133561735 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.openshift.expose-services=)
Dec 02 08:23:50 np0005541913.localdomain podman[77451]: 2025-12-02 08:23:50.527376202 +0000 UTC m=+0.162179087 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 02 08:23:50 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:23:50 np0005541913.localdomain podman[77452]: 2025-12-02 08:23:50.54914954 +0000 UTC m=+0.179706250 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 02 08:23:50 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:23:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:23:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:23:58 np0005541913.localdomain podman[77499]: 2025-12-02 08:23:58.442540851 +0000 UTC m=+0.078300964 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:23:58 np0005541913.localdomain podman[77499]: 2025-12-02 08:23:58.477372601 +0000 UTC m=+0.113132674 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, container_name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:23:58 np0005541913.localdomain systemd[1]: tmp-crun.WN5DBv.mount: Deactivated successfully.
Dec 02 08:23:58 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:23:58 np0005541913.localdomain podman[77500]: 2025-12-02 08:23:58.491831391 +0000 UTC m=+0.125517788 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:23:58 np0005541913.localdomain podman[77500]: 2025-12-02 08:23:58.501926804 +0000 UTC m=+0.135613211 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 02 08:23:58 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:24:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:24:11 np0005541913.localdomain systemd[1]: tmp-crun.HLMax1.mount: Deactivated successfully.
Dec 02 08:24:11 np0005541913.localdomain podman[77537]: 2025-12-02 08:24:11.419844267 +0000 UTC m=+0.065270472 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:46Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 02 08:24:11 np0005541913.localdomain podman[77537]: 2025-12-02 08:24:11.643976396 +0000 UTC m=+0.289402521 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-type=git)
Dec 02 08:24:11 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:24:17 np0005541913.localdomain podman[77567]: 2025-12-02 08:24:17.460901316 +0000 UTC m=+0.091062780 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 08:24:17 np0005541913.localdomain podman[77567]: 2025-12-02 08:24:17.490798842 +0000 UTC m=+0.120960366 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=ceilometer_agent_compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: tmp-crun.fegpL0.mount: Deactivated successfully.
Dec 02 08:24:17 np0005541913.localdomain podman[77566]: 2025-12-02 08:24:17.506507116 +0000 UTC m=+0.141993693 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:24:17 np0005541913.localdomain podman[77574]: 2025-12-02 08:24:17.570397671 +0000 UTC m=+0.195447566 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi)
Dec 02 08:24:17 np0005541913.localdomain podman[77565]: 2025-12-02 08:24:17.623577886 +0000 UTC m=+0.259531606 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, url=https://www.redhat.com, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Dec 02 08:24:17 np0005541913.localdomain podman[77565]: 2025-12-02 08:24:17.638050936 +0000 UTC m=+0.274004656 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, architecture=x86_64, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true)
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:24:17 np0005541913.localdomain podman[77574]: 2025-12-02 08:24:17.65339773 +0000 UTC m=+0.278447655 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi)
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:24:17 np0005541913.localdomain podman[77566]: 2025-12-02 08:24:17.854234301 +0000 UTC m=+0.489720838 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1)
Dec 02 08:24:17 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:24:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:24:19 np0005541913.localdomain systemd[1]: tmp-crun.JWQkzb.mount: Deactivated successfully.
Dec 02 08:24:19 np0005541913.localdomain podman[77660]: 2025-12-02 08:24:19.436950185 +0000 UTC m=+0.078861160 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:24:19 np0005541913.localdomain podman[77660]: 2025-12-02 08:24:19.498029183 +0000 UTC m=+0.139940208 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team)
Dec 02 08:24:19 np0005541913.localdomain podman[77660]: unhealthy
Dec 02 08:24:19 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:24:19 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 08:24:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:24:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:24:21 np0005541913.localdomain podman[77683]: 2025-12-02 08:24:21.452290517 +0000 UTC m=+0.088778467 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:24:21 np0005541913.localdomain podman[77683]: 2025-12-02 08:24:21.489970773 +0000 UTC m=+0.126458733 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 08:24:21 np0005541913.localdomain podman[77684]: 2025-12-02 08:24:21.502298416 +0000 UTC m=+0.135335763 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:24:21 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:24:21 np0005541913.localdomain podman[77684]: 2025-12-02 08:24:21.557121226 +0000 UTC m=+0.190158163 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller)
Dec 02 08:24:21 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:24:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:24:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:24:29 np0005541913.localdomain podman[77729]: 2025-12-02 08:24:29.449825596 +0000 UTC m=+0.088915560 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, version=17.1.12, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z)
Dec 02 08:24:29 np0005541913.localdomain podman[77729]: 2025-12-02 08:24:29.462969611 +0000 UTC m=+0.102059565 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-type=git, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., version=17.1.12, container_name=collectd, distribution-scope=public)
Dec 02 08:24:29 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:24:29 np0005541913.localdomain systemd[1]: tmp-crun.UVCJfQ.mount: Deactivated successfully.
Dec 02 08:24:29 np0005541913.localdomain podman[77730]: 2025-12-02 08:24:29.554088161 +0000 UTC m=+0.190789270 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Dec 02 08:24:29 np0005541913.localdomain podman[77730]: 2025-12-02 08:24:29.593934186 +0000 UTC m=+0.230635335 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, url=https://www.redhat.com)
Dec 02 08:24:29 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:24:41 np0005541913.localdomain sudo[77768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:24:41 np0005541913.localdomain sudo[77768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:24:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:24:41 np0005541913.localdomain sudo[77768]: pam_unix(sudo:session): session closed for user root
Dec 02 08:24:42 np0005541913.localdomain sudo[77791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:24:42 np0005541913.localdomain systemd[1]: tmp-crun.AWoR36.mount: Deactivated successfully.
Dec 02 08:24:42 np0005541913.localdomain sudo[77791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:24:42 np0005541913.localdomain podman[77783]: 2025-12-02 08:24:42.025459423 +0000 UTC m=+0.095154798 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:24:42 np0005541913.localdomain podman[77783]: 2025-12-02 08:24:42.222101721 +0000 UTC m=+0.291797066 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z)
Dec 02 08:24:42 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:24:42 np0005541913.localdomain sudo[77791]: pam_unix(sudo:session): session closed for user root
Dec 02 08:24:43 np0005541913.localdomain sudo[77860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:24:43 np0005541913.localdomain sudo[77860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:24:43 np0005541913.localdomain sudo[77860]: pam_unix(sudo:session): session closed for user root
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:24:48 np0005541913.localdomain recover_tripleo_nova_virtqemud[77936]: 62312
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: tmp-crun.kAfyqm.mount: Deactivated successfully.
Dec 02 08:24:48 np0005541913.localdomain podman[77908]: 2025-12-02 08:24:48.358197656 +0000 UTC m=+0.091449610 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: tmp-crun.pLC2Jz.mount: Deactivated successfully.
Dec 02 08:24:48 np0005541913.localdomain podman[77901]: 2025-12-02 08:24:48.410661742 +0000 UTC m=+0.145686003 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container)
Dec 02 08:24:48 np0005541913.localdomain podman[77908]: 2025-12-02 08:24:48.418891744 +0000 UTC m=+0.152143688 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, 
build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:24:48 np0005541913.localdomain podman[77902]: 2025-12-02 08:24:48.394757582 +0000 UTC m=+0.126866224 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 02 08:24:48 np0005541913.localdomain podman[77901]: 2025-12-02 08:24:48.469633233 +0000 UTC m=+0.204657494 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO 
Team, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:24:48 np0005541913.localdomain podman[77909]: 2025-12-02 08:24:48.542395107 +0000 UTC m=+0.271287263 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, 
batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:24:48 np0005541913.localdomain podman[77909]: 2025-12-02 08:24:48.576044225 +0000 UTC m=+0.304936391 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:24:48 np0005541913.localdomain podman[77902]: 2025-12-02 08:24:48.744035389 +0000 UTC m=+0.476144041 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team)
Dec 02 08:24:48 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:24:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:24:50 np0005541913.localdomain systemd[1]: tmp-crun.SFRoDW.mount: Deactivated successfully.
Dec 02 08:24:50 np0005541913.localdomain podman[78037]: 2025-12-02 08:24:50.402362065 +0000 UTC m=+0.049908668 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:24:50 np0005541913.localdomain podman[78037]: 2025-12-02 08:24:50.423859174 +0000 UTC m=+0.071405767 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=)
Dec 02 08:24:50 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:24:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:24:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:24:52 np0005541913.localdomain systemd[1]: tmp-crun.HrrkOl.mount: Deactivated successfully.
Dec 02 08:24:52 np0005541913.localdomain podman[78085]: 2025-12-02 08:24:52.443719177 +0000 UTC m=+0.085700294 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 02 08:24:52 np0005541913.localdomain systemd[1]: tmp-crun.FPgeRa.mount: Deactivated successfully.
Dec 02 08:24:52 np0005541913.localdomain podman[78086]: 2025-12-02 08:24:52.497281303 +0000 UTC m=+0.135830477 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:24:52 np0005541913.localdomain podman[78085]: 2025-12-02 08:24:52.512931965 +0000 UTC m=+0.154913072 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Dec 02 08:24:52 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:24:52 np0005541913.localdomain podman[78086]: 2025-12-02 08:24:52.5712924 +0000 UTC m=+0.209841554 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible)
Dec 02 08:24:52 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:24:58 np0005541913.localdomain systemd[1]: libpod-bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d.scope: Deactivated successfully.
Dec 02 08:24:58 np0005541913.localdomain podman[78134]: 2025-12-02 08:24:58.544778515 +0000 UTC m=+0.049855836 container died bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_wait_for_compute_service, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044)
Dec 02 08:24:58 np0005541913.localdomain systemd[1]: tmp-crun.OSxXXO.mount: Deactivated successfully.
Dec 02 08:24:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d-userdata-shm.mount: Deactivated successfully.
Dec 02 08:24:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8c52eb2917af814f67bf9757f04611b4867e02cd94735e31ef932542a90a8de8-merged.mount: Deactivated successfully.
Dec 02 08:24:58 np0005541913.localdomain podman[78134]: 2025-12-02 08:24:58.586245314 +0000 UTC m=+0.091322575 container cleanup bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, container_name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute)
Dec 02 08:24:58 np0005541913.localdomain systemd[1]: libpod-conmon-bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d.scope: Deactivated successfully.
Dec 02 08:24:58 np0005541913.localdomain python3[76195]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=ff8ff724cb5f0d02131158e2fae849b6 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 02 08:24:58 np0005541913.localdomain sudo[76193]: pam_unix(sudo:session): session closed for user root
Dec 02 08:24:59 np0005541913.localdomain sudo[78186]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcmdpbmvcygqnnaahcmambkzigiyyovb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:24:59 np0005541913.localdomain sudo[78186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:24:59 np0005541913.localdomain python3[78188]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:24:59 np0005541913.localdomain sudo[78186]: pam_unix(sudo:session): session closed for user root
Dec 02 08:24:59 np0005541913.localdomain sudo[78202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emegnnetkcmwobflarwzlqzwolocqblq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:24:59 np0005541913.localdomain sudo[78202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:24:59 np0005541913.localdomain python3[78204]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 08:24:59 np0005541913.localdomain sudo[78202]: pam_unix(sudo:session): session closed for user root
Dec 02 08:24:59 np0005541913.localdomain sudo[78263]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsxlfamxffhyuamjihavnnhydalsvqlx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:24:59 np0005541913.localdomain sudo[78263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:24:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:24:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:25:00 np0005541913.localdomain systemd[1]: tmp-crun.nCQip1.mount: Deactivated successfully.
Dec 02 08:25:00 np0005541913.localdomain podman[78267]: 2025-12-02 08:25:00.040943943 +0000 UTC m=+0.112993050 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-iscsid-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:25:00 np0005541913.localdomain python3[78265]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764663899.4964612-118006-217783275407979/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:25:00 np0005541913.localdomain podman[78267]: 2025-12-02 08:25:00.046565955 +0000 UTC m=+0.118615062 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z)
Dec 02 08:25:00 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:25:00 np0005541913.localdomain sudo[78263]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:00 np0005541913.localdomain podman[78266]: 2025-12-02 08:25:00.022335922 +0000 UTC m=+0.093612808 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:25:00 np0005541913.localdomain podman[78266]: 2025-12-02 08:25:00.103860172 +0000 UTC m=+0.175136998 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com)
Dec 02 08:25:00 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:25:00 np0005541913.localdomain sudo[78317]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmorvphwodjvobhpzazyvgkrbsmyjwmc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:25:00 np0005541913.localdomain sudo[78317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:25:00 np0005541913.localdomain python3[78319]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 08:25:00 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:25:00 np0005541913.localdomain systemd-rc-local-generator[78344]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:25:00 np0005541913.localdomain systemd-sysv-generator[78347]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:25:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:25:00 np0005541913.localdomain sudo[78317]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:01 np0005541913.localdomain sudo[78369]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkwlrazhuhmyijyqlduxhtwtgritjvqy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 02 08:25:01 np0005541913.localdomain sudo[78369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:25:01 np0005541913.localdomain python3[78371]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 08:25:01 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:25:01 np0005541913.localdomain systemd-rc-local-generator[78397]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:25:01 np0005541913.localdomain systemd-sysv-generator[78400]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:25:01 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:25:01 np0005541913.localdomain systemd[1]: Starting nova_compute container...
Dec 02 08:25:01 np0005541913.localdomain tripleo-start-podman-container[78411]: Creating additional drop-in dependency for "nova_compute" (1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1)
Dec 02 08:25:01 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 08:25:01 np0005541913.localdomain systemd-sysv-generator[78472]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 08:25:01 np0005541913.localdomain systemd-rc-local-generator[78467]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 08:25:02 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 08:25:02 np0005541913.localdomain systemd[1]: Started nova_compute container.
Dec 02 08:25:02 np0005541913.localdomain sudo[78369]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:02 np0005541913.localdomain sudo[78507]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfewuudrxuolnnebygldmaeffkehzwwc ; /usr/bin/python3
Dec 02 08:25:02 np0005541913.localdomain sudo[78507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:25:02 np0005541913.localdomain python3[78509]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:25:02 np0005541913.localdomain sudo[78507]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:03 np0005541913.localdomain sudo[78555]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsjdrjtjdyjfinchmzketiglzqkmpxxz ; /usr/bin/python3
Dec 02 08:25:03 np0005541913.localdomain sudo[78555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:25:03 np0005541913.localdomain sudo[78555]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:03 np0005541913.localdomain sudo[78598]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iabtrregxyaqfiydcrvueisdmsmvwkix ; /usr/bin/python3
Dec 02 08:25:03 np0005541913.localdomain sudo[78598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:25:03 np0005541913.localdomain sudo[78598]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:03 np0005541913.localdomain sudo[78628]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kawmdlekdltkvubjrzlzayizritlxqpc ; /usr/bin/python3
Dec 02 08:25:03 np0005541913.localdomain sudo[78628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:25:03 np0005541913.localdomain python3[78630]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005541913 step=5 update_config_hash_only=False
Dec 02 08:25:03 np0005541913.localdomain sudo[78628]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:04 np0005541913.localdomain sudo[78644]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dblshweopsqacjaddfrpnpkgacqovlcd ; /usr/bin/python3
Dec 02 08:25:04 np0005541913.localdomain sudo[78644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:25:04 np0005541913.localdomain python3[78646]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 08:25:04 np0005541913.localdomain sudo[78644]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:04 np0005541913.localdomain sudo[78660]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfwbaxzcuxqfxowdnngeuvmqwjcwlurn ; /usr/bin/python3
Dec 02 08:25:04 np0005541913.localdomain sudo[78660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 02 08:25:04 np0005541913.localdomain python3[78662]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 02 08:25:04 np0005541913.localdomain sudo[78660]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:25:12 np0005541913.localdomain podman[78663]: 2025-12-02 08:25:12.443862159 +0000 UTC m=+0.086188507 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:25:12 np0005541913.localdomain podman[78663]: 2025-12-02 08:25:12.641094942 +0000 UTC m=+0.283421340 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, architecture=x86_64, managed_by=tripleo_ansible)
Dec 02 08:25:12 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:25:19 np0005541913.localdomain podman[78692]: 2025-12-02 08:25:19.469975363 +0000 UTC m=+0.106893216 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64)
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: tmp-crun.Kb6RBg.mount: Deactivated successfully.
Dec 02 08:25:19 np0005541913.localdomain podman[78694]: 2025-12-02 08:25:19.522211623 +0000 UTC m=+0.150537514 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute)
Dec 02 08:25:19 np0005541913.localdomain podman[78693]: 2025-12-02 08:25:19.556693083 +0000 UTC m=+0.187933703 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:25:19 np0005541913.localdomain podman[78692]: 2025-12-02 08:25:19.583431274 +0000 UTC m=+0.220349117 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:25:19 np0005541913.localdomain podman[78694]: 2025-12-02 08:25:19.60620872 +0000 UTC m=+0.234534621 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:25:19 np0005541913.localdomain podman[78700]: 2025-12-02 08:25:19.661037109 +0000 UTC m=+0.287668875 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Dec 02 08:25:19 np0005541913.localdomain podman[78700]: 2025-12-02 08:25:19.673332981 +0000 UTC m=+0.299964727 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible)
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:25:19 np0005541913.localdomain podman[78693]: 2025-12-02 08:25:19.932005072 +0000 UTC m=+0.563245612 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:36:58Z, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Dec 02 08:25:19 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:25:20 np0005541913.localdomain systemd[1]: tmp-crun.DlxCwn.mount: Deactivated successfully.
Dec 02 08:25:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:25:20 np0005541913.localdomain systemd[1]: tmp-crun.Pgqgca.mount: Deactivated successfully.
Dec 02 08:25:20 np0005541913.localdomain podman[78787]: 2025-12-02 08:25:20.549766594 +0000 UTC m=+0.067963945 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044)
Dec 02 08:25:20 np0005541913.localdomain podman[78787]: 2025-12-02 08:25:20.599118617 +0000 UTC m=+0.117315928 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 02 08:25:20 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:25:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:25:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:25:23 np0005541913.localdomain podman[78812]: 2025-12-02 08:25:23.45087326 +0000 UTC m=+0.090091692 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:25:23 np0005541913.localdomain podman[78813]: 2025-12-02 08:25:23.510836518 +0000 UTC m=+0.145575180 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 02 08:25:23 np0005541913.localdomain podman[78812]: 2025-12-02 08:25:23.526214633 +0000 UTC m=+0.165433085 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent)
Dec 02 08:25:23 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:25:23 np0005541913.localdomain podman[78813]: 2025-12-02 08:25:23.55979958 +0000 UTC m=+0.194538302 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, container_name=ovn_controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4)
Dec 02 08:25:23 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:25:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:25:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:25:30 np0005541913.localdomain podman[78857]: 2025-12-02 08:25:30.417805255 +0000 UTC m=+0.061979404 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 02 08:25:30 np0005541913.localdomain podman[78857]: 2025-12-02 08:25:30.43837906 +0000 UTC m=+0.082553249 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, container_name=collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 02 08:25:30 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:25:30 np0005541913.localdomain podman[78858]: 2025-12-02 08:25:30.435275996 +0000 UTC m=+0.072632621 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 02 08:25:30 np0005541913.localdomain podman[78858]: 2025-12-02 08:25:30.51432707 +0000 UTC m=+0.151683715 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:25:30 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:25:30 np0005541913.localdomain sshd[78895]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 08:25:31 np0005541913.localdomain sshd[78895]: Accepted publickey for zuul from 192.168.122.100 port 34488 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 08:25:31 np0005541913.localdomain systemd-logind[757]: New session 34 of user zuul.
Dec 02 08:25:31 np0005541913.localdomain systemd[1]: Started Session 34 of User zuul.
Dec 02 08:25:31 np0005541913.localdomain sshd[78895]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 08:25:31 np0005541913.localdomain sudo[79002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ashfbwngdwfpraucmgsuknjyxjpomjxt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764663931.2049224-40123-110204323011314/AnsiballZ_setup.py
Dec 02 08:25:31 np0005541913.localdomain sudo[79002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 08:25:32 np0005541913.localdomain python3[79004]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 08:25:34 np0005541913.localdomain sudo[79002]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:39 np0005541913.localdomain sudo[79265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzbyssxdltwubkewszbywaawwatnvydn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764663939.1942763-40212-207221734948271/AnsiballZ_dnf.py
Dec 02 08:25:39 np0005541913.localdomain sudo[79265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 08:25:39 np0005541913.localdomain python3[79267]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Dec 02 08:25:42 np0005541913.localdomain sudo[79265]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:25:43 np0005541913.localdomain podman[79284]: 2025-12-02 08:25:43.421680718 +0000 UTC m=+0.063715980 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-type=git, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com)
Dec 02 08:25:43 np0005541913.localdomain sudo[79302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:25:43 np0005541913.localdomain sudo[79302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:25:43 np0005541913.localdomain sudo[79302]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:43 np0005541913.localdomain sudo[79328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:25:43 np0005541913.localdomain sudo[79328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:25:43 np0005541913.localdomain podman[79284]: 2025-12-02 08:25:43.608689326 +0000 UTC m=+0.250724628 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:25:43 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:25:44 np0005541913.localdomain sudo[79328]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:44 np0005541913.localdomain sudo[79374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:25:44 np0005541913.localdomain sudo[79374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:25:44 np0005541913.localdomain sudo[79374]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:47 np0005541913.localdomain sudo[79463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uosvzxnrwyavdtunjsffmqvbacqsfejr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764663946.7594306-40266-64891137474897/AnsiballZ_iptables.py
Dec 02 08:25:47 np0005541913.localdomain sudo[79463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 08:25:47 np0005541913.localdomain python3[79465]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Dec 02 08:25:47 np0005541913.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 02 08:25:47 np0005541913.localdomain systemd-journald[47611]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Dec 02 08:25:47 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 08:25:47 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 08:25:47 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 08:25:47 np0005541913.localdomain sudo[79463]: pam_unix(sudo:session): session closed for user root
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:25:50 np0005541913.localdomain podman[79535]: 2025-12-02 08:25:50.44919462 +0000 UTC m=+0.085984991 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_migration_target, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible)
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: tmp-crun.R09bUh.mount: Deactivated successfully.
Dec 02 08:25:50 np0005541913.localdomain podman[79536]: 2025-12-02 08:25:50.529264981 +0000 UTC m=+0.160741338 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1)
Dec 02 08:25:50 np0005541913.localdomain podman[79546]: 2025-12-02 08:25:50.581978174 +0000 UTC m=+0.210106881 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Dec 02 08:25:50 np0005541913.localdomain podman[79536]: 2025-12-02 08:25:50.60480279 +0000 UTC m=+0.236279157 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4)
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:25:50 np0005541913.localdomain podman[79546]: 2025-12-02 08:25:50.643200667 +0000 UTC m=+0.271329364 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:25:50 np0005541913.localdomain podman[79534]: 2025-12-02 08:25:50.506940789 +0000 UTC m=+0.142970740 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, com.redhat.component=openstack-cron-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Dec 02 08:25:50 np0005541913.localdomain podman[79534]: 2025-12-02 08:25:50.691189592 +0000 UTC m=+0.327219563 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:25:50 np0005541913.localdomain podman[79624]: 2025-12-02 08:25:50.692038424 +0000 UTC m=+0.060480592 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-nova-compute)
Dec 02 08:25:50 np0005541913.localdomain podman[79624]: 2025-12-02 08:25:50.778109038 +0000 UTC m=+0.146551236 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:25:50 np0005541913.localdomain podman[79535]: 2025-12-02 08:25:50.800155463 +0000 UTC m=+0.436945864 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:25:50 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:25:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:25:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:25:54 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:25:54 np0005541913.localdomain recover_tripleo_nova_virtqemud[79663]: 62312
Dec 02 08:25:54 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:25:54 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:25:54 np0005541913.localdomain podman[79651]: 2025-12-02 08:25:54.423679235 +0000 UTC m=+0.066774133 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:25:54 np0005541913.localdomain podman[79650]: 2025-12-02 08:25:54.482714398 +0000 UTC m=+0.124741687 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:25:54 np0005541913.localdomain podman[79650]: 2025-12-02 08:25:54.512365769 +0000 UTC m=+0.154393048 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Dec 02 08:25:54 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:25:54 np0005541913.localdomain podman[79651]: 2025-12-02 08:25:54.568316729 +0000 UTC m=+0.211411597 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4)
Dec 02 08:25:54 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:26:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:26:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:26:01 np0005541913.localdomain podman[79699]: 2025-12-02 08:26:01.445401612 +0000 UTC m=+0.087875194 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, managed_by=tripleo_ansible)
Dec 02 08:26:01 np0005541913.localdomain podman[79699]: 2025-12-02 08:26:01.460223241 +0000 UTC m=+0.102696893 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, version=17.1.12, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid)
Dec 02 08:26:01 np0005541913.localdomain podman[79698]: 2025-12-02 08:26:01.477845487 +0000 UTC m=+0.120527804 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=)
Dec 02 08:26:01 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:26:01 np0005541913.localdomain podman[79698]: 2025-12-02 08:26:01.488042972 +0000 UTC m=+0.130725299 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git)
Dec 02 08:26:01 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:26:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:26:14 np0005541913.localdomain podman[79740]: 2025-12-02 08:26:14.446321766 +0000 UTC m=+0.083374652 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, vendor=Red Hat, Inc., description=Red Hat 
OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com)
Dec 02 08:26:14 np0005541913.localdomain podman[79740]: 2025-12-02 08:26:14.662058998 +0000 UTC m=+0.299111754 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:26:14 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:26:21 np0005541913.localdomain podman[79772]: 2025-12-02 08:26:21.485771279 +0000 UTC m=+0.097640525 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git)
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: tmp-crun.8JDPRH.mount: Deactivated successfully.
Dec 02 08:26:21 np0005541913.localdomain podman[79773]: 2025-12-02 08:26:21.544826634 +0000 UTC m=+0.151498861 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public)
Dec 02 08:26:21 np0005541913.localdomain podman[79773]: 2025-12-02 08:26:21.567938097 +0000 UTC m=+0.174610334 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:26:21 np0005541913.localdomain podman[79769]: 2025-12-02 08:26:21.522724927 +0000 UTC m=+0.138864979 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 02 08:26:21 np0005541913.localdomain podman[79769]: 2025-12-02 08:26:21.605813109 +0000 UTC m=+0.221953131 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:26:21 np0005541913.localdomain podman[79772]: 2025-12-02 08:26:21.618864381 +0000 UTC m=+0.230733567 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1)
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:26:21 np0005541913.localdomain podman[79770]: 2025-12-02 08:26:21.670975048 +0000 UTC m=+0.286952726 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, version=17.1.12, managed_by=tripleo_ansible, container_name=nova_migration_target, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:26:21 np0005541913.localdomain podman[79771]: 2025-12-02 08:26:21.620592829 +0000 UTC m=+0.234201993 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:26:21 np0005541913.localdomain podman[79771]: 2025-12-02 08:26:21.751144911 +0000 UTC m=+0.364754075 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4)
Dec 02 08:26:21 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:26:22 np0005541913.localdomain podman[79770]: 2025-12-02 08:26:22.025947378 +0000 UTC m=+0.641925066 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 02 08:26:22 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:26:22 np0005541913.localdomain systemd[1]: tmp-crun.zVmcNf.mount: Deactivated successfully.
Dec 02 08:26:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:26:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:26:25 np0005541913.localdomain podman[79890]: 2025-12-02 08:26:25.442630599 +0000 UTC m=+0.084753899 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, container_name=ovn_metadata_agent, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 02 08:26:25 np0005541913.localdomain podman[79890]: 2025-12-02 08:26:25.477180221 +0000 UTC m=+0.119303511 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:26:25 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:26:25 np0005541913.localdomain podman[79891]: 2025-12-02 08:26:25.48829413 +0000 UTC m=+0.130174294 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, vcs-type=git, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:26:25 np0005541913.localdomain podman[79891]: 2025-12-02 08:26:25.538095155 +0000 UTC m=+0.179975338 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 
17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:26:25 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:26:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:26:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:26:32 np0005541913.localdomain podman[79940]: 2025-12-02 08:26:32.443637504 +0000 UTC m=+0.083467704 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 02 08:26:32 np0005541913.localdomain podman[79940]: 2025-12-02 08:26:32.479455001 +0000 UTC m=+0.119285181 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044)
Dec 02 08:26:32 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:26:32 np0005541913.localdomain podman[79939]: 2025-12-02 08:26:32.496163682 +0000 UTC m=+0.138340815 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd)
Dec 02 08:26:32 np0005541913.localdomain podman[79939]: 2025-12-02 08:26:32.50796282 +0000 UTC m=+0.150139943 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 08:26:32 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:26:44 np0005541913.localdomain sudo[79976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:26:44 np0005541913.localdomain sudo[79976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:26:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:26:44 np0005541913.localdomain sudo[79976]: pam_unix(sudo:session): session closed for user root
Dec 02 08:26:45 np0005541913.localdomain sudo[79992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:26:45 np0005541913.localdomain sudo[79992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:26:45 np0005541913.localdomain podman[79991]: 2025-12-02 08:26:45.042761507 +0000 UTC m=+0.075438047 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, description=Red Hat OpenStack Platform 
17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, vcs-type=git)
Dec 02 08:26:45 np0005541913.localdomain podman[79991]: 2025-12-02 08:26:45.196530716 +0000 UTC m=+0.229207196 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, 
container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true)
Dec 02 08:26:45 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:26:45 np0005541913.localdomain sudo[79992]: pam_unix(sudo:session): session closed for user root
Dec 02 08:26:46 np0005541913.localdomain sshd[78895]: pam_unix(sshd:session): session closed for user zuul
Dec 02 08:26:46 np0005541913.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Dec 02 08:26:46 np0005541913.localdomain systemd[1]: session-34.scope: Consumed 5.803s CPU time.
Dec 02 08:26:46 np0005541913.localdomain systemd-logind[757]: Session 34 logged out. Waiting for processes to exit.
Dec 02 08:26:46 np0005541913.localdomain systemd-logind[757]: Removed session 34.
Dec 02 08:26:48 np0005541913.localdomain sudo[80067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:26:48 np0005541913.localdomain sudo[80067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:26:48 np0005541913.localdomain sudo[80067]: pam_unix(sudo:session): session closed for user root
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:26:52 np0005541913.localdomain podman[80127]: 2025-12-02 08:26:52.458996309 +0000 UTC m=+0.090511603 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, 
config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Dec 02 08:26:52 np0005541913.localdomain podman[80127]: 2025-12-02 08:26:52.471965719 +0000 UTC m=+0.103481033 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: tmp-crun.mKKSF1.mount: Deactivated successfully.
Dec 02 08:26:52 np0005541913.localdomain podman[80130]: 2025-12-02 08:26:52.484075336 +0000 UTC m=+0.103292869 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, tcib_managed=true)
Dec 02 08:26:52 np0005541913.localdomain podman[80128]: 2025-12-02 08:26:52.488150386 +0000 UTC m=+0.120204895 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, 
release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:26:52 np0005541913.localdomain podman[80130]: 2025-12-02 08:26:52.511859816 +0000 UTC m=+0.131077319 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:26:52 np0005541913.localdomain podman[80129]: 2025-12-02 08:26:52.566265864 +0000 UTC m=+0.193175614 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 02 08:26:52 np0005541913.localdomain podman[80141]: 2025-12-02 08:26:52.609773669 +0000 UTC m=+0.228179110 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:26:52 np0005541913.localdomain podman[80129]: 2025-12-02 08:26:52.621190507 +0000 UTC m=+0.248100287 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team)
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:26:52 np0005541913.localdomain podman[80141]: 2025-12-02 08:26:52.639464159 +0000 UTC m=+0.257869600 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, version=17.1.12)
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:26:52 np0005541913.localdomain podman[80128]: 2025-12-02 08:26:52.855031367 +0000 UTC m=+0.487085896 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4)
Dec 02 08:26:52 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:26:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:26:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:26:56 np0005541913.localdomain systemd[1]: tmp-crun.GLYFPr.mount: Deactivated successfully.
Dec 02 08:26:56 np0005541913.localdomain podman[80245]: 2025-12-02 08:26:56.437815071 +0000 UTC m=+0.082352743 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=)
Dec 02 08:26:56 np0005541913.localdomain systemd[1]: tmp-crun.gCejVv.mount: Deactivated successfully.
Dec 02 08:26:56 np0005541913.localdomain podman[80246]: 2025-12-02 08:26:56.493059502 +0000 UTC m=+0.130487273 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T23:34:05Z, tcib_managed=true, release=1761123044)
Dec 02 08:26:56 np0005541913.localdomain podman[80245]: 2025-12-02 08:26:56.501965283 +0000 UTC m=+0.146502955 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 02 08:26:56 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:26:56 np0005541913.localdomain podman[80246]: 2025-12-02 08:26:56.543098213 +0000 UTC m=+0.180525994 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller)
Dec 02 08:26:56 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:26:59 np0005541913.localdomain sshd[80292]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 08:26:59 np0005541913.localdomain sshd[80292]: Accepted publickey for zuul from 38.102.83.114 port 39524 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 08:26:59 np0005541913.localdomain systemd-logind[757]: New session 35 of user zuul.
Dec 02 08:26:59 np0005541913.localdomain systemd[1]: Started Session 35 of User zuul.
Dec 02 08:26:59 np0005541913.localdomain sshd[80292]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 08:26:59 np0005541913.localdomain sudo[80309]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buwdxquwswyxjkeniayisnlpwexrviyi ; /usr/bin/python3
Dec 02 08:26:59 np0005541913.localdomain sudo[80309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 08:26:59 np0005541913.localdomain python3[80311]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 08:27:02 np0005541913.localdomain sudo[80309]: pam_unix(sudo:session): session closed for user root
Dec 02 08:27:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:27:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:27:03 np0005541913.localdomain podman[80314]: 2025-12-02 08:27:03.436412023 +0000 UTC m=+0.074611324 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public)
Dec 02 08:27:03 np0005541913.localdomain podman[80314]: 2025-12-02 08:27:03.447894233 +0000 UTC m=+0.086093534 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:27:03 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:27:03 np0005541913.localdomain podman[80313]: 2025-12-02 08:27:03.530345858 +0000 UTC m=+0.168501068 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, release=1761123044, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:27:03 np0005541913.localdomain podman[80313]: 2025-12-02 08:27:03.566070653 +0000 UTC m=+0.204225883 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:27:03 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:27:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:27:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4435 writes, 20K keys, 4435 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4435 writes, 447 syncs, 9.92 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:27:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:27:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.2 total, 600.0 interval
                                                          Cumulative writes: 5176 writes, 22K keys, 5176 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5176 writes, 608 syncs, 8.51 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:27:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:27:15 np0005541913.localdomain podman[80354]: 2025-12-02 08:27:15.441148742 +0000 UTC m=+0.081597913 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64, url=https://www.redhat.com, container_name=metrics_qdr, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_id=tripleo_step1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true)
Dec 02 08:27:15 np0005541913.localdomain podman[80354]: 2025-12-02 08:27:15.650010798 +0000 UTC m=+0.290459949 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, config_id=tripleo_step1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1)
Dec 02 08:27:15 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: tmp-crun.5e00ty.mount: Deactivated successfully.
Dec 02 08:27:23 np0005541913.localdomain podman[80383]: 2025-12-02 08:27:23.503462101 +0000 UTC m=+0.140510813 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, managed_by=tripleo_ansible)
Dec 02 08:27:23 np0005541913.localdomain podman[80387]: 2025-12-02 08:27:23.462837754 +0000 UTC m=+0.093143985 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:27:23 np0005541913.localdomain podman[80385]: 2025-12-02 08:27:23.43079629 +0000 UTC m=+0.064323958 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, config_id=tripleo_step5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:27:23 np0005541913.localdomain podman[80387]: 2025-12-02 08:27:23.543702336 +0000 UTC m=+0.174008577 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:27:23 np0005541913.localdomain podman[80385]: 2025-12-02 08:27:23.564015084 +0000 UTC m=+0.197542772 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step5, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:27:23 np0005541913.localdomain podman[80386]: 2025-12-02 08:27:23.542991967 +0000 UTC m=+0.175917968 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:27:23 np0005541913.localdomain podman[80383]: 2025-12-02 08:27:23.615083593 +0000 UTC m=+0.252132355 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:27:23 np0005541913.localdomain podman[80384]: 2025-12-02 08:27:23.694419594 +0000 UTC m=+0.331821817 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, 
build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:27:23 np0005541913.localdomain podman[80386]: 2025-12-02 08:27:23.724000082 +0000 UTC m=+0.356926133 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:27:23 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:27:24 np0005541913.localdomain podman[80384]: 2025-12-02 08:27:24.075137779 +0000 UTC m=+0.712540002 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step4, release=1761123044, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:27:24 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:27:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:27:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:27:27 np0005541913.localdomain systemd[1]: tmp-crun.FBQzCU.mount: Deactivated successfully.
Dec 02 08:27:27 np0005541913.localdomain podman[80500]: 2025-12-02 08:27:27.427842484 +0000 UTC m=+0.070266258 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Dec 02 08:27:27 np0005541913.localdomain sudo[80534]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdyukzrmeemtytcqdjkgdvrhpkyutctv ; /usr/bin/python3
Dec 02 08:27:27 np0005541913.localdomain sudo[80534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 08:27:27 np0005541913.localdomain systemd[1]: tmp-crun.Da34Gi.mount: Deactivated successfully.
Dec 02 08:27:27 np0005541913.localdomain podman[80500]: 2025-12-02 08:27:27.504058501 +0000 UTC m=+0.146482295 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:27:27 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:27:27 np0005541913.localdomain podman[80501]: 2025-12-02 08:27:27.506814065 +0000 UTC m=+0.143750301 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:27:27 np0005541913.localdomain podman[80501]: 2025-12-02 08:27:27.591129561 +0000 UTC m=+0.228065837 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1)
Dec 02 08:27:27 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:27:27 np0005541913.localdomain python3[80545]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 08:27:31 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 08:27:31 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 08:27:31 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 08:27:31 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 08:27:31 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 08:27:31 np0005541913.localdomain systemd[1]: run-ra022a57c9fd94d61b5edd3b255f82757.service: Deactivated successfully.
Dec 02 08:27:31 np0005541913.localdomain systemd[1]: run-r51c86fcc3b5f49f8a74ae00e936a7bd1.service: Deactivated successfully.
Dec 02 08:27:32 np0005541913.localdomain sudo[80534]: pam_unix(sudo:session): session closed for user root
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:27:34 np0005541913.localdomain recover_tripleo_nova_virtqemud[80716]: 62312
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: tmp-crun.Y9alkA.mount: Deactivated successfully.
Dec 02 08:27:34 np0005541913.localdomain podman[80713]: 2025-12-02 08:27:34.447861502 +0000 UTC m=+0.085264722 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, container_name=collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:27:34 np0005541913.localdomain podman[80713]: 2025-12-02 08:27:34.456792004 +0000 UTC m=+0.094195304 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, tcib_managed=true)
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: tmp-crun.IqvZqF.mount: Deactivated successfully.
Dec 02 08:27:34 np0005541913.localdomain podman[80714]: 2025-12-02 08:27:34.513242817 +0000 UTC m=+0.146538225 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, tcib_managed=true)
Dec 02 08:27:34 np0005541913.localdomain podman[80714]: 2025-12-02 08:27:34.550336558 +0000 UTC m=+0.183631946 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:27:34 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:27:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:27:46 np0005541913.localdomain systemd[1]: tmp-crun.PmXE1g.mount: Deactivated successfully.
Dec 02 08:27:46 np0005541913.localdomain podman[80753]: 2025-12-02 08:27:46.451318197 +0000 UTC m=+0.095756966 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Dec 02 08:27:46 np0005541913.localdomain podman[80753]: 2025-12-02 08:27:46.621441523 +0000 UTC m=+0.265880172 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12)
Dec 02 08:27:46 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:27:48 np0005541913.localdomain sudo[80803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:27:48 np0005541913.localdomain sudo[80803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:27:48 np0005541913.localdomain sudo[80803]: pam_unix(sudo:session): session closed for user root
Dec 02 08:27:48 np0005541913.localdomain sudo[80818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 08:27:48 np0005541913.localdomain sudo[80818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:27:49 np0005541913.localdomain sudo[80818]: pam_unix(sudo:session): session closed for user root
Dec 02 08:27:49 np0005541913.localdomain sudo[80857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:27:49 np0005541913.localdomain sudo[80857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:27:49 np0005541913.localdomain sudo[80857]: pam_unix(sudo:session): session closed for user root
Dec 02 08:27:49 np0005541913.localdomain sudo[80872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:27:49 np0005541913.localdomain sudo[80872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:27:49 np0005541913.localdomain sudo[80872]: pam_unix(sudo:session): session closed for user root
Dec 02 08:27:50 np0005541913.localdomain sudo[80941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:27:50 np0005541913.localdomain sudo[80941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:27:50 np0005541913.localdomain sudo[80941]: pam_unix(sudo:session): session closed for user root
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: tmp-crun.JMA51J.mount: Deactivated successfully.
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: tmp-crun.VwICoc.mount: Deactivated successfully.
Dec 02 08:27:54 np0005541913.localdomain podman[80958]: 2025-12-02 08:27:54.543571196 +0000 UTC m=+0.149879043 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step5)
Dec 02 08:27:54 np0005541913.localdomain podman[80956]: 2025-12-02 08:27:54.50988008 +0000 UTC m=+0.119149882 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, distribution-scope=public, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, container_name=logrotate_crond, version=17.1.12)
Dec 02 08:27:54 np0005541913.localdomain podman[80960]: 2025-12-02 08:27:54.585965264 +0000 UTC m=+0.185674722 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:27:54 np0005541913.localdomain podman[80958]: 2025-12-02 08:27:54.594887731 +0000 UTC m=+0.201195578 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:27:54 np0005541913.localdomain podman[80960]: 2025-12-02 08:27:54.636901299 +0000 UTC m=+0.236610757 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi)
Dec 02 08:27:54 np0005541913.localdomain podman[80959]: 2025-12-02 08:27:54.64245466 +0000 UTC m=+0.247420822 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12)
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:27:54 np0005541913.localdomain podman[80957]: 2025-12-02 08:27:54.67740101 +0000 UTC m=+0.286750144 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, 
tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:27:54 np0005541913.localdomain podman[80959]: 2025-12-02 08:27:54.689108347 +0000 UTC m=+0.294074509 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:27:54 np0005541913.localdomain podman[80956]: 2025-12-02 08:27:54.743158011 +0000 UTC m=+0.352427753 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, container_name=logrotate_crond, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, batch=17.1_20251118.1)
Dec 02 08:27:54 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:27:54 np0005541913.localdomain podman[80957]: 2025-12-02 08:27:54.993124238 +0000 UTC m=+0.602473432 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:27:55 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:27:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:27:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:27:58 np0005541913.localdomain podman[81078]: 2025-12-02 08:27:58.439142587 +0000 UTC m=+0.073883241 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1761123044, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 02 08:27:58 np0005541913.localdomain podman[81077]: 2025-12-02 08:27:58.479547194 +0000 UTC m=+0.115726654 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Dec 02 08:27:58 np0005541913.localdomain podman[81078]: 2025-12-02 08:27:58.511444945 +0000 UTC m=+0.146185659 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller)
Dec 02 08:27:58 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:27:58 np0005541913.localdomain podman[81077]: 2025-12-02 08:27:58.527032382 +0000 UTC m=+0.163211832 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true)
Dec 02 08:27:58 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:28:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:28:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:28:05 np0005541913.localdomain systemd[1]: tmp-crun.5YoRS7.mount: Deactivated successfully.
Dec 02 08:28:05 np0005541913.localdomain podman[81125]: 2025-12-02 08:28:05.432840712 +0000 UTC m=+0.071307594 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:51:28Z, version=17.1.12, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3)
Dec 02 08:28:05 np0005541913.localdomain podman[81125]: 2025-12-02 08:28:05.439326507 +0000 UTC m=+0.077793379 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, container_name=collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, release=1761123044)
Dec 02 08:28:05 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:28:05 np0005541913.localdomain podman[81126]: 2025-12-02 08:28:05.48154061 +0000 UTC m=+0.117798646 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, container_name=iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:28:05 np0005541913.localdomain podman[81126]: 2025-12-02 08:28:05.515203226 +0000 UTC m=+0.151461282 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:28:05 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:28:15 np0005541913.localdomain sudo[81175]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clswfeyvmfxlxtfshdlkwrjtywnodybu ; /usr/bin/python3
Dec 02 08:28:15 np0005541913.localdomain sudo[81175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 08:28:15 np0005541913.localdomain python3[81177]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 08:28:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:28:17 np0005541913.localdomain systemd[1]: tmp-crun.XVnl76.mount: Deactivated successfully.
Dec 02 08:28:17 np0005541913.localdomain podman[81181]: 2025-12-02 08:28:17.421754899 +0000 UTC m=+0.066664096 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:28:17 np0005541913.localdomain podman[81181]: 2025-12-02 08:28:17.601638503 +0000 UTC m=+0.246547680 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:28:17 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:28:19 np0005541913.localdomain rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 08:28:19 np0005541913.localdomain rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 08:28:23 np0005541913.localdomain sudo[81175]: pam_unix(sudo:session): session closed for user root
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:28:25 np0005541913.localdomain podman[81399]: 2025-12-02 08:28:25.432791262 +0000 UTC m=+0.065417364 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:28:25 np0005541913.localdomain podman[81398]: 2025-12-02 08:28:25.441595037 +0000 UTC m=+0.073247565 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:28:25 np0005541913.localdomain podman[81399]: 2025-12-02 08:28:25.485980005 +0000 UTC m=+0.118606097 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:28:25 np0005541913.localdomain podman[81398]: 2025-12-02 08:28:25.495011195 +0000 UTC m=+0.126663703 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:28:25 np0005541913.localdomain podman[81397]: 2025-12-02 08:28:25.538671385 +0000 UTC m=+0.176395417 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z)
Dec 02 08:28:25 np0005541913.localdomain podman[81396]: 2025-12-02 08:28:25.565701362 +0000 UTC m=+0.202757277 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044)
Dec 02 08:28:25 np0005541913.localdomain podman[81396]: 2025-12-02 08:28:25.570432033 +0000 UTC m=+0.207487898 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:28:25 np0005541913.localdomain podman[81410]: 2025-12-02 08:28:25.488917529 +0000 UTC m=+0.115950979 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true)
Dec 02 08:28:25 np0005541913.localdomain podman[81410]: 2025-12-02 08:28:25.622073526 +0000 UTC m=+0.249106946 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:28:25 np0005541913.localdomain podman[81397]: 2025-12-02 08:28:25.860519459 +0000 UTC m=+0.498243461 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, container_name=nova_migration_target, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:28:25 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:28:26 np0005541913.localdomain systemd[1]: tmp-crun.Jlcl5n.mount: Deactivated successfully.
Dec 02 08:28:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:28:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:28:29 np0005541913.localdomain podman[81515]: 2025-12-02 08:28:29.429411193 +0000 UTC m=+0.072502885 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:28:29 np0005541913.localdomain podman[81516]: 2025-12-02 08:28:29.481392905 +0000 UTC m=+0.120311780 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:28:29 np0005541913.localdomain podman[81515]: 2025-12-02 08:28:29.494109768 +0000 UTC m=+0.137201450 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 02 08:28:29 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:28:29 np0005541913.localdomain podman[81516]: 2025-12-02 08:28:29.505672752 +0000 UTC m=+0.144591607 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, container_name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:28:29 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:28:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:28:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:28:36 np0005541913.localdomain podman[81564]: 2025-12-02 08:28:36.444693144 +0000 UTC m=+0.083166775 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:28:36 np0005541913.localdomain podman[81564]: 2025-12-02 08:28:36.453739915 +0000 UTC m=+0.092213546 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 02 08:28:36 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:28:36 np0005541913.localdomain podman[81563]: 2025-12-02 08:28:36.548531355 +0000 UTC m=+0.189129690 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, name=rhosp17/openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 08:28:36 np0005541913.localdomain podman[81563]: 2025-12-02 08:28:36.563213749 +0000 UTC m=+0.203812124 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:28:36 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:28:46 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:28:46 np0005541913.localdomain recover_tripleo_nova_virtqemud[81603]: 62312
Dec 02 08:28:46 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:28:46 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:28:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:28:48 np0005541913.localdomain systemd[1]: tmp-crun.P9gH4V.mount: Deactivated successfully.
Dec 02 08:28:48 np0005541913.localdomain podman[81604]: 2025-12-02 08:28:48.43077131 +0000 UTC m=+0.079270937 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 02 08:28:48 np0005541913.localdomain podman[81604]: 2025-12-02 08:28:48.648441235 +0000 UTC m=+0.296940892 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:28:48 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:28:50 np0005541913.localdomain sudo[81653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:28:50 np0005541913.localdomain sudo[81653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:28:50 np0005541913.localdomain sudo[81653]: pam_unix(sudo:session): session closed for user root
Dec 02 08:28:51 np0005541913.localdomain sudo[81670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:28:51 np0005541913.localdomain sudo[81670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:28:51 np0005541913.localdomain sudo[81670]: pam_unix(sudo:session): session closed for user root
Dec 02 08:28:52 np0005541913.localdomain sudo[81739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:28:52 np0005541913.localdomain sudo[81739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:28:52 np0005541913.localdomain sudo[81739]: pam_unix(sudo:session): session closed for user root
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:28:56 np0005541913.localdomain podman[81756]: 2025-12-02 08:28:56.458263192 +0000 UTC m=+0.093134090 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:28:56 np0005541913.localdomain podman[81757]: 2025-12-02 08:28:56.438970002 +0000 UTC m=+0.067408715 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, container_name=nova_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:28:56 np0005541913.localdomain podman[81755]: 2025-12-02 08:28:56.504273142 +0000 UTC m=+0.139765485 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:28:56 np0005541913.localdomain podman[81755]: 2025-12-02 08:28:56.515073147 +0000 UTC m=+0.150565480 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:28:56 np0005541913.localdomain podman[81758]: 2025-12-02 08:28:56.554780536 +0000 UTC m=+0.179340121 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, version=17.1.12, io.openshift.expose-services=)
Dec 02 08:28:56 np0005541913.localdomain podman[81757]: 2025-12-02 08:28:56.572549738 +0000 UTC m=+0.200988551 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:28:56 np0005541913.localdomain podman[81758]: 2025-12-02 08:28:56.62806062 +0000 UTC m=+0.252620195 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, release=1761123044, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public)
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:28:56 np0005541913.localdomain podman[81769]: 2025-12-02 08:28:56.701273161 +0000 UTC m=+0.323207179 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 02 08:28:56 np0005541913.localdomain podman[81769]: 2025-12-02 08:28:56.729910049 +0000 UTC m=+0.351844057 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:28:56 np0005541913.localdomain podman[81756]: 2025-12-02 08:28:56.803996604 +0000 UTC m=+0.438867502 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:28:56 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:29:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:29:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:29:00 np0005541913.localdomain systemd[1]: tmp-crun.TwDO0U.mount: Deactivated successfully.
Dec 02 08:29:00 np0005541913.localdomain podman[81874]: 2025-12-02 08:29:00.423967176 +0000 UTC m=+0.066748208 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64)
Dec 02 08:29:00 np0005541913.localdomain podman[81874]: 2025-12-02 08:29:00.479239872 +0000 UTC m=+0.122020824 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, url=https://www.redhat.com, container_name=ovn_controller, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team)
Dec 02 08:29:00 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:29:00 np0005541913.localdomain podman[81873]: 2025-12-02 08:29:00.432124114 +0000 UTC m=+0.075174703 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Dec 02 08:29:00 np0005541913.localdomain podman[81873]: 2025-12-02 08:29:00.566103781 +0000 UTC m=+0.209154410 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:29:00 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:29:02 np0005541913.localdomain anacron[18350]: Job `cron.monthly' started
Dec 02 08:29:02 np0005541913.localdomain anacron[18350]: Job `cron.monthly' terminated
Dec 02 08:29:02 np0005541913.localdomain anacron[18350]: Normal exit (3 jobs run)
Dec 02 08:29:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:29:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:29:08 np0005541913.localdomain systemd[1]: tmp-crun.JU34lu.mount: Deactivated successfully.
Dec 02 08:29:08 np0005541913.localdomain podman[81923]: 2025-12-02 08:29:08.510585704 +0000 UTC m=+0.079276378 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:29:08 np0005541913.localdomain podman[81924]: 2025-12-02 08:29:08.53009843 +0000 UTC m=+0.093685864 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, tcib_managed=true, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, release=1761123044, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Dec 02 08:29:08 np0005541913.localdomain podman[81924]: 2025-12-02 08:29:08.538452492 +0000 UTC m=+0.102039976 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 02 08:29:08 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:29:08 np0005541913.localdomain podman[81923]: 2025-12-02 08:29:08.59343919 +0000 UTC m=+0.162129924 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Dec 02 08:29:08 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:29:09 np0005541913.localdomain systemd[1]: tmp-crun.Iu9Vmp.mount: Deactivated successfully.
Dec 02 08:29:12 np0005541913.localdomain sudo[81974]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umqmbbzbxqjwxsdcxxequcarcswkqavu ; /usr/bin/python3
Dec 02 08:29:12 np0005541913.localdomain sudo[81974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 08:29:12 np0005541913.localdomain python3[81976]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 08:29:15 np0005541913.localdomain rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 08:29:15 np0005541913.localdomain rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 08:29:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:29:19 np0005541913.localdomain systemd[1]: tmp-crun.sWrQXk.mount: Deactivated successfully.
Dec 02 08:29:19 np0005541913.localdomain podman[82164]: 2025-12-02 08:29:19.452070987 +0000 UTC m=+0.094190876 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd)
Dec 02 08:29:19 np0005541913.localdomain podman[82164]: 2025-12-02 08:29:19.627941789 +0000 UTC m=+0.270061658 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr)
Dec 02 08:29:19 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:29:19 np0005541913.localdomain sudo[81974]: pam_unix(sudo:session): session closed for user root
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:29:27 np0005541913.localdomain podman[82196]: 2025-12-02 08:29:27.488650131 +0000 UTC m=+0.120707001 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5)
Dec 02 08:29:27 np0005541913.localdomain podman[82201]: 2025-12-02 08:29:27.441500321 +0000 UTC m=+0.071545109 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, release=1761123044)
Dec 02 08:29:27 np0005541913.localdomain podman[82194]: 2025-12-02 08:29:27.49845773 +0000 UTC m=+0.135852686 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, name=rhosp17/openstack-cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:29:27 np0005541913.localdomain podman[82194]: 2025-12-02 08:29:27.53305959 +0000 UTC m=+0.170454546 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, batch=17.1_20251118.1, name=rhosp17/openstack-cron, version=17.1.12, config_id=tripleo_step4, release=1761123044)
Dec 02 08:29:27 np0005541913.localdomain podman[82195]: 2025-12-02 08:29:27.538501468 +0000 UTC m=+0.173865702 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:29:27 np0005541913.localdomain podman[82196]: 2025-12-02 08:29:27.542653185 +0000 UTC m=+0.174710045 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, version=17.1.12)
Dec 02 08:29:27 np0005541913.localdomain podman[82208]: 2025-12-02 08:29:27.463346747 +0000 UTC m=+0.086267645 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1)
Dec 02 08:29:27 np0005541913.localdomain podman[82201]: 2025-12-02 08:29:27.572546364 +0000 UTC m=+0.202591152 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:29:27 np0005541913.localdomain podman[82208]: 2025-12-02 08:29:27.592288446 +0000 UTC m=+0.215209344 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:29:27 np0005541913.localdomain podman[82195]: 2025-12-02 08:29:27.88499293 +0000 UTC m=+0.520357214 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:29:27 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:29:28 np0005541913.localdomain systemd[1]: tmp-crun.31tALY.mount: Deactivated successfully.
Dec 02 08:29:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:29:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:29:31 np0005541913.localdomain podman[82313]: 2025-12-02 08:29:31.441907558 +0000 UTC m=+0.082177431 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 02 08:29:31 np0005541913.localdomain systemd[1]: tmp-crun.r4syXG.mount: Deactivated successfully.
Dec 02 08:29:31 np0005541913.localdomain podman[82312]: 2025-12-02 08:29:31.496815174 +0000 UTC m=+0.139704223 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 02 08:29:31 np0005541913.localdomain podman[82313]: 2025-12-02 08:29:31.519009489 +0000 UTC m=+0.159279362 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, release=1761123044, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Dec 02 08:29:31 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:29:31 np0005541913.localdomain podman[82312]: 2025-12-02 08:29:31.571016682 +0000 UTC m=+0.213905631 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 02 08:29:31 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:29:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:29:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:29:39 np0005541913.localdomain systemd[1]: tmp-crun.SS5eL9.mount: Deactivated successfully.
Dec 02 08:29:39 np0005541913.localdomain podman[82357]: 2025-12-02 08:29:39.4670641 +0000 UTC m=+0.089263381 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z)
Dec 02 08:29:39 np0005541913.localdomain podman[82357]: 2025-12-02 08:29:39.478999114 +0000 UTC m=+0.101198445 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, container_name=collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, distribution-scope=public)
Dec 02 08:29:39 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:29:39 np0005541913.localdomain podman[82358]: 2025-12-02 08:29:39.574880802 +0000 UTC m=+0.193301816 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T23:44:13Z)
Dec 02 08:29:39 np0005541913.localdomain podman[82358]: 2025-12-02 08:29:39.585134373 +0000 UTC m=+0.203555397 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:29:39 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:29:42 np0005541913.localdomain python3[82409]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Dec 02 08:29:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:29:50 np0005541913.localdomain podman[82410]: 2025-12-02 08:29:50.440884384 +0000 UTC m=+0.077620244 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, container_name=metrics_qdr, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, release=1761123044)
Dec 02 08:29:50 np0005541913.localdomain podman[82410]: 2025-12-02 08:29:50.636198611 +0000 UTC m=+0.272934461 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, 
com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:29:50 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:29:52 np0005541913.localdomain sudo[82484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:29:52 np0005541913.localdomain sudo[82484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:29:52 np0005541913.localdomain sudo[82484]: pam_unix(sudo:session): session closed for user root
Dec 02 08:29:52 np0005541913.localdomain sudo[82499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 08:29:52 np0005541913.localdomain sudo[82499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:29:53 np0005541913.localdomain podman[82587]: 2025-12-02 08:29:53.370395199 +0000 UTC m=+0.085394342 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 08:29:53 np0005541913.localdomain podman[82587]: 2025-12-02 08:29:53.461038175 +0000 UTC m=+0.176037358 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.buildah.version=1.41.4)
Dec 02 08:29:53 np0005541913.localdomain sudo[82499]: pam_unix(sudo:session): session closed for user root
Dec 02 08:29:53 np0005541913.localdomain sudo[82652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:29:53 np0005541913.localdomain sudo[82652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:29:53 np0005541913.localdomain sudo[82652]: pam_unix(sudo:session): session closed for user root
Dec 02 08:29:53 np0005541913.localdomain sudo[82667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:29:53 np0005541913.localdomain sudo[82667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:29:54 np0005541913.localdomain sudo[82667]: pam_unix(sudo:session): session closed for user root
Dec 02 08:29:55 np0005541913.localdomain sudo[82714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:29:55 np0005541913.localdomain sudo[82714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:29:55 np0005541913.localdomain sudo[82714]: pam_unix(sudo:session): session closed for user root
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:29:58 np0005541913.localdomain podman[82730]: 2025-12-02 08:29:58.479081129 +0000 UTC m=+0.108800678 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:36:58Z)
Dec 02 08:29:58 np0005541913.localdomain podman[82729]: 2025-12-02 08:29:58.527548351 +0000 UTC m=+0.157246460 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1)
Dec 02 08:29:58 np0005541913.localdomain podman[82733]: 2025-12-02 08:29:58.58176311 +0000 UTC m=+0.204726238 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true)
Dec 02 08:29:58 np0005541913.localdomain podman[82733]: 2025-12-02 08:29:58.609493485 +0000 UTC m=+0.232456623 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:29:58 np0005541913.localdomain podman[82731]: 2025-12-02 08:29:58.635029775 +0000 UTC m=+0.264100288 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step5, release=1761123044, container_name=nova_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:29:58 np0005541913.localdomain podman[82729]: 2025-12-02 08:29:58.646275481 +0000 UTC m=+0.275973570 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:29:58 np0005541913.localdomain podman[82731]: 2025-12-02 08:29:58.665088939 +0000 UTC m=+0.294159422 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, tcib_managed=true, build-date=2025-11-19T00:36:58Z, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:29:58 np0005541913.localdomain podman[82732]: 2025-12-02 08:29:58.732662067 +0000 UTC m=+0.357230735 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12)
Dec 02 08:29:58 np0005541913.localdomain podman[82732]: 2025-12-02 08:29:58.764125908 +0000 UTC m=+0.388694536 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, vcs-type=git, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute)
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:29:58 np0005541913.localdomain podman[82730]: 2025-12-02 08:29:58.813131513 +0000 UTC m=+0.442851092 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Dec 02 08:29:58 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:30:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:30:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:30:02 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:30:02 np0005541913.localdomain recover_tripleo_nova_virtqemud[82851]: 62312
Dec 02 08:30:02 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:30:02 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:30:02 np0005541913.localdomain podman[82848]: 2025-12-02 08:30:02.455876085 +0000 UTC m=+0.091089497 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ovn_metadata_agent, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public)
Dec 02 08:30:02 np0005541913.localdomain podman[82849]: 2025-12-02 08:30:02.500139761 +0000 UTC m=+0.135394674 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack 
TripleO Team, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 08:30:02 np0005541913.localdomain podman[82848]: 2025-12-02 08:30:02.53507931 +0000 UTC m=+0.170292682 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:30:02 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:30:02 np0005541913.localdomain podman[82849]: 2025-12-02 08:30:02.55157849 +0000 UTC m=+0.186833423 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, release=1761123044, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:30:02 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:30:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:30:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:30:10 np0005541913.localdomain podman[82897]: 2025-12-02 08:30:10.455591312 +0000 UTC m=+0.094309540 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, container_name=collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, url=https://www.redhat.com)
Dec 02 08:30:10 np0005541913.localdomain podman[82897]: 2025-12-02 08:30:10.499055988 +0000 UTC m=+0.137774206 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Dec 02 08:30:10 np0005541913.localdomain systemd[1]: tmp-crun.FrxGOH.mount: Deactivated successfully.
Dec 02 08:30:10 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:30:10 np0005541913.localdomain podman[82898]: 2025-12-02 08:30:10.515482414 +0000 UTC m=+0.154323465 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:30:10 np0005541913.localdomain podman[82898]: 2025-12-02 08:30:10.550254989 +0000 UTC m=+0.189096060 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, distribution-scope=public, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Dec 02 08:30:10 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:30:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:30:21 np0005541913.localdomain podman[82937]: 2025-12-02 08:30:21.450997596 +0000 UTC m=+0.088871922 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=metrics_qdr, architecture=x86_64, release=1761123044, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12)
Dec 02 08:30:21 np0005541913.localdomain podman[82937]: 2025-12-02 08:30:21.650521729 +0000 UTC m=+0.288396065 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, config_id=tripleo_step1, tcib_managed=true, version=17.1.12, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:30:21 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: tmp-crun.SfbreC.mount: Deactivated successfully.
Dec 02 08:30:29 np0005541913.localdomain podman[82970]: 2025-12-02 08:30:29.483232807 +0000 UTC m=+0.111700052 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:30:29 np0005541913.localdomain podman[82968]: 2025-12-02 08:30:29.499153152 +0000 UTC m=+0.138719349 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-cron, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git)
Dec 02 08:30:29 np0005541913.localdomain podman[82976]: 2025-12-02 08:30:29.517736585 +0000 UTC m=+0.141226033 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:30:29 np0005541913.localdomain podman[82968]: 2025-12-02 08:30:29.533240979 +0000 UTC m=+0.172807106 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true)
Dec 02 08:30:29 np0005541913.localdomain podman[82976]: 2025-12-02 08:30:29.540094114 +0000 UTC m=+0.163583492 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc.)
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:30:29 np0005541913.localdomain podman[82969]: 2025-12-02 08:30:29.447757996 +0000 UTC m=+0.084675605 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, release=1761123044, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_migration_target)
Dec 02 08:30:29 np0005541913.localdomain podman[82982]: 2025-12-02 08:30:29.60994043 +0000 UTC m=+0.234364291 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:30:29 np0005541913.localdomain podman[82970]: 2025-12-02 08:30:29.66620173 +0000 UTC m=+0.294668945 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:30:29 np0005541913.localdomain podman[82982]: 2025-12-02 08:30:29.698949223 +0000 UTC m=+0.323373144 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:30:29 np0005541913.localdomain podman[82969]: 2025-12-02 08:30:29.815061786 +0000 UTC m=+0.451979385 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:30:29 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:30:30 np0005541913.localdomain systemd[1]: tmp-crun.bu9tY5.mount: Deactivated successfully.
Dec 02 08:30:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:30:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:30:33 np0005541913.localdomain podman[83089]: 2025-12-02 08:30:33.4445391 +0000 UTC m=+0.084525950 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044)
Dec 02 08:30:33 np0005541913.localdomain podman[83089]: 2025-12-02 08:30:33.48580862 +0000 UTC m=+0.125795440 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z)
Dec 02 08:30:33 np0005541913.localdomain podman[83090]: 2025-12-02 08:30:33.500423971 +0000 UTC m=+0.133592177 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4)
Dec 02 08:30:33 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:30:33 np0005541913.localdomain podman[83090]: 2025-12-02 08:30:33.520905343 +0000 UTC m=+0.154073509 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-type=git)
Dec 02 08:30:33 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:30:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:30:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:30:41 np0005541913.localdomain podman[83135]: 2025-12-02 08:30:41.441357594 +0000 UTC m=+0.080666503 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:30:41 np0005541913.localdomain podman[83135]: 2025-12-02 08:30:41.457395672 +0000 UTC m=+0.096704571 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z)
Dec 02 08:30:41 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:30:41 np0005541913.localdomain podman[83136]: 2025-12-02 08:30:41.546710963 +0000 UTC m=+0.183529748 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12)
Dec 02 08:30:41 np0005541913.localdomain podman[83136]: 2025-12-02 08:30:41.559133949 +0000 UTC m=+0.195952724 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 02 08:30:41 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:30:42 np0005541913.localdomain sshd[80295]: Received disconnect from 38.102.83.114 port 39524:11: disconnected by user
Dec 02 08:30:42 np0005541913.localdomain sshd[80295]: Disconnected from user zuul 38.102.83.114 port 39524
Dec 02 08:30:42 np0005541913.localdomain sshd[80292]: pam_unix(sshd:session): session closed for user zuul
Dec 02 08:30:42 np0005541913.localdomain systemd[1]: session-35.scope: Deactivated successfully.
Dec 02 08:30:42 np0005541913.localdomain systemd[1]: session-35.scope: Consumed 19.340s CPU time.
Dec 02 08:30:42 np0005541913.localdomain systemd-logind[757]: Session 35 logged out. Waiting for processes to exit.
Dec 02 08:30:42 np0005541913.localdomain systemd-logind[757]: Removed session 35.
Dec 02 08:30:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:30:52 np0005541913.localdomain podman[83218]: 2025-12-02 08:30:52.436286895 +0000 UTC m=+0.078531158 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com)
Dec 02 08:30:52 np0005541913.localdomain podman[83218]: 2025-12-02 08:30:52.61701388 +0000 UTC m=+0.259258193 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true)
Dec 02 08:30:52 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:30:55 np0005541913.localdomain sudo[83247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:30:55 np0005541913.localdomain sudo[83247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:30:55 np0005541913.localdomain sudo[83247]: pam_unix(sudo:session): session closed for user root
Dec 02 08:30:55 np0005541913.localdomain sudo[83262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:30:55 np0005541913.localdomain sudo[83262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:30:56 np0005541913.localdomain sudo[83262]: pam_unix(sudo:session): session closed for user root
Dec 02 08:30:56 np0005541913.localdomain sudo[83310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:30:56 np0005541913.localdomain sudo[83310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:30:56 np0005541913.localdomain sudo[83310]: pam_unix(sudo:session): session closed for user root
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: tmp-crun.iLJBEV.mount: Deactivated successfully.
Dec 02 08:31:00 np0005541913.localdomain podman[83326]: 2025-12-02 08:31:00.446898277 +0000 UTC m=+0.086128980 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:31:00 np0005541913.localdomain podman[83327]: 2025-12-02 08:31:00.450002096 +0000 UTC m=+0.081548085 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 02 08:31:00 np0005541913.localdomain podman[83325]: 2025-12-02 08:31:00.496692173 +0000 UTC m=+0.135528897 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, version=17.1.12, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, container_name=logrotate_crond, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Dec 02 08:31:00 np0005541913.localdomain podman[83325]: 2025-12-02 08:31:00.50206966 +0000 UTC m=+0.140906394 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, vcs-type=git, name=rhosp17/openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:31:00 np0005541913.localdomain podman[83338]: 2025-12-02 08:31:00.509954841 +0000 UTC m=+0.136549764 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:31:00 np0005541913.localdomain podman[83338]: 2025-12-02 08:31:00.530241527 +0000 UTC m=+0.156836500 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:31:00 np0005541913.localdomain podman[83327]: 2025-12-02 08:31:00.543368981 +0000 UTC m=+0.174914960 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, release=1761123044, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12)
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:31:00 np0005541913.localdomain podman[83339]: 2025-12-02 08:31:00.613292659 +0000 UTC m=+0.236001453 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:12:45Z)
Dec 02 08:31:00 np0005541913.localdomain podman[83339]: 2025-12-02 08:31:00.633360209 +0000 UTC m=+0.256068943 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:31:00 np0005541913.localdomain podman[83326]: 2025-12-02 08:31:00.815052049 +0000 UTC m=+0.454282782 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true)
Dec 02 08:31:00 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:31:01 np0005541913.localdomain systemd[1]: tmp-crun.qkG2g0.mount: Deactivated successfully.
Dec 02 08:31:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:31:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:31:04 np0005541913.localdomain systemd[1]: tmp-crun.sX7kJT.mount: Deactivated successfully.
Dec 02 08:31:04 np0005541913.localdomain podman[83446]: 2025-12-02 08:31:04.445750806 +0000 UTC m=+0.085947247 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team)
Dec 02 08:31:04 np0005541913.localdomain systemd[1]: tmp-crun.5LTvCC.mount: Deactivated successfully.
Dec 02 08:31:04 np0005541913.localdomain podman[83447]: 2025-12-02 08:31:04.501863533 +0000 UTC m=+0.137704532 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, version=17.1.12, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:31:04 np0005541913.localdomain podman[83446]: 2025-12-02 08:31:04.52026209 +0000 UTC m=+0.160458521 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, distribution-scope=public, url=https://www.redhat.com)
Dec 02 08:31:04 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:31:04 np0005541913.localdomain podman[83447]: 2025-12-02 08:31:04.555306642 +0000 UTC m=+0.191147601 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, release=1761123044, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Dec 02 08:31:04 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:31:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:31:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:31:12 np0005541913.localdomain systemd[1]: tmp-crun.djyhon.mount: Deactivated successfully.
Dec 02 08:31:12 np0005541913.localdomain podman[83494]: 2025-12-02 08:31:12.455800176 +0000 UTC m=+0.091558160 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T23:44:13Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid)
Dec 02 08:31:12 np0005541913.localdomain podman[83493]: 2025-12-02 08:31:12.458822443 +0000 UTC m=+0.098308111 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, config_id=tripleo_step3, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, architecture=x86_64)
Dec 02 08:31:12 np0005541913.localdomain podman[83493]: 2025-12-02 08:31:12.47092805 +0000 UTC m=+0.110413658 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step3)
Dec 02 08:31:12 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:31:12 np0005541913.localdomain podman[83494]: 2025-12-02 08:31:12.489905473 +0000 UTC m=+0.125663467 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid)
Dec 02 08:31:12 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:31:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:31:23 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:31:23 np0005541913.localdomain recover_tripleo_nova_virtqemud[83534]: 62312
Dec 02 08:31:23 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:31:23 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:31:23 np0005541913.localdomain podman[83532]: 2025-12-02 08:31:23.439636018 +0000 UTC m=+0.080756917 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 02 08:31:23 np0005541913.localdomain podman[83532]: 2025-12-02 08:31:23.662822877 +0000 UTC m=+0.303943796 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 02 08:31:23 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:31:31 np0005541913.localdomain podman[83565]: 2025-12-02 08:31:31.487967382 +0000 UTC m=+0.122234332 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, 
release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:31:31 np0005541913.localdomain podman[83567]: 2025-12-02 08:31:31.539516 +0000 UTC m=+0.169017369 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z)
Dec 02 08:31:31 np0005541913.localdomain podman[83567]: 2025-12-02 08:31:31.558016256 +0000 UTC m=+0.187517585 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:31:31 np0005541913.localdomain podman[83566]: 2025-12-02 08:31:31.457256732 +0000 UTC m=+0.093196917 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044)
Dec 02 08:31:31 np0005541913.localdomain podman[83566]: 2025-12-02 08:31:31.640390486 +0000 UTC m=+0.276330661 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, config_id=tripleo_step5, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container)
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:31:31 np0005541913.localdomain podman[83573]: 2025-12-02 08:31:31.685429547 +0000 UTC m=+0.311383040 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:31:31 np0005541913.localdomain podman[83573]: 2025-12-02 08:31:31.710930635 +0000 UTC m=+0.336884128 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12)
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:31:31 np0005541913.localdomain podman[83564]: 2025-12-02 08:31:31.511656669 +0000 UTC m=+0.148609702 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=logrotate_crond, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Dec 02 08:31:31 np0005541913.localdomain podman[83564]: 2025-12-02 08:31:31.799075363 +0000 UTC m=+0.436028396 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., container_name=logrotate_crond, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=)
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:31:31 np0005541913.localdomain podman[83565]: 2025-12-02 08:31:31.831056367 +0000 UTC m=+0.465323337 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 02 08:31:31 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:31:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:31:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:31:35 np0005541913.localdomain systemd[1]: tmp-crun.KPcC3E.mount: Deactivated successfully.
Dec 02 08:31:35 np0005541913.localdomain podman[83684]: 2025-12-02 08:31:35.42657447 +0000 UTC m=+0.072183384 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Dec 02 08:31:35 np0005541913.localdomain systemd[1]: tmp-crun.ypykxX.mount: Deactivated successfully.
Dec 02 08:31:35 np0005541913.localdomain podman[83685]: 2025-12-02 08:31:35.450668827 +0000 UTC m=+0.089543517 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:31:35 np0005541913.localdomain podman[83685]: 2025-12-02 08:31:35.469995056 +0000 UTC m=+0.108869746 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ovn-controller, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:31:35 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:31:35 np0005541913.localdomain podman[83684]: 2025-12-02 08:31:35.496982533 +0000 UTC m=+0.142591507 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z)
Dec 02 08:31:35 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:31:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:31:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:31:43 np0005541913.localdomain systemd[1]: tmp-crun.YyFUux.mount: Deactivated successfully.
Dec 02 08:31:43 np0005541913.localdomain podman[83730]: 2025-12-02 08:31:43.448238811 +0000 UTC m=+0.086842984 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 02 08:31:43 np0005541913.localdomain podman[83729]: 2025-12-02 08:31:43.418072848 +0000 UTC m=+0.063354923 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Dec 02 08:31:43 np0005541913.localdomain podman[83730]: 2025-12-02 08:31:43.480896984 +0000 UTC m=+0.119501127 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044)
Dec 02 08:31:43 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:31:43 np0005541913.localdomain podman[83729]: 2025-12-02 08:31:43.50195666 +0000 UTC m=+0.147238705 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 08:31:43 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:31:53 np0005541913.localdomain sudo[84189]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpar957dh7/privsep.sock
Dec 02 08:31:53 np0005541913.localdomain systemd-logind[757]: Existing logind session ID 29 used by new audit session, ignoring.
Dec 02 08:31:53 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 02 08:31:53 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 02 08:31:53 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 02 08:31:53 np0005541913.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Queued start job for default target Main User Target.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Created slice User Application Slice.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Reached target Paths.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Reached target Timers.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Starting D-Bus User Message Bus Socket...
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Starting Create User's Volatile Files and Directories...
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Listening on D-Bus User Message Bus Socket.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Reached target Sockets.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Finished Create User's Volatile Files and Directories.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Reached target Basic System.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Reached target Main User Target.
Dec 02 08:31:53 np0005541913.localdomain systemd[84191]: Startup finished in 149ms.
Dec 02 08:31:53 np0005541913.localdomain systemd[1]: Started User Manager for UID 0.
Dec 02 08:31:53 np0005541913.localdomain systemd[1]: Started Session c11 of User root.
Dec 02 08:31:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:31:53 np0005541913.localdomain sudo[84189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Dec 02 08:31:53 np0005541913.localdomain podman[84207]: 2025-12-02 08:31:53.764983689 +0000 UTC m=+0.063513857 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:31:53 np0005541913.localdomain podman[84207]: 2025-12-02 08:31:53.973931679 +0000 UTC m=+0.272461817 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, 
batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.)
Dec 02 08:31:53 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:31:54 np0005541913.localdomain sudo[84189]: pam_unix(sudo:session): session closed for user root
Dec 02 08:31:54 np0005541913.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 02 08:31:54 np0005541913.localdomain kernel: device tap4a318f6a-b3 entered promiscuous mode
Dec 02 08:31:54 np0005541913.localdomain NetworkManager[5965]: <info>  [1764664314.7696] manager: (tap4a318f6a-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/13)
Dec 02 08:31:54 np0005541913.localdomain systemd-udevd[84255]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 08:31:54 np0005541913.localdomain NetworkManager[5965]: <info>  [1764664314.7871] device (tap4a318f6a-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 02 08:31:54 np0005541913.localdomain NetworkManager[5965]: <info>  [1764664314.7880] device (tap4a318f6a-b3): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 02 08:31:54 np0005541913.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 02 08:31:54 np0005541913.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 02 08:31:54 np0005541913.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 02 08:31:54 np0005541913.localdomain systemd-machined[84262]: New machine qemu-1-instance-00000002.
Dec 02 08:31:54 np0005541913.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Dec 02 08:31:54 np0005541913.localdomain NetworkManager[5965]: <info>  [1764664314.9986] manager: (tap595e1c9b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/14)
Dec 02 08:31:55 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap595e1c9b-71: link becomes ready
Dec 02 08:31:55 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap595e1c9b-70: link becomes ready
Dec 02 08:31:55 np0005541913.localdomain NetworkManager[5965]: <info>  [1764664315.0409] device (tap595e1c9b-70): carrier: link connected
Dec 02 08:31:55 np0005541913.localdomain kernel: device tap595e1c9b-70 entered promiscuous mode
Dec 02 08:31:56 np0005541913.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 02 08:31:56 np0005541913.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 02 08:31:56 np0005541913.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 02 08:31:56 np0005541913.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 02 08:31:57 np0005541913.localdomain sudo[84373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:31:57 np0005541913.localdomain sudo[84373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:31:57 np0005541913.localdomain sudo[84373]: pam_unix(sudo:session): session closed for user root
Dec 02 08:31:57 np0005541913.localdomain sudo[84388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:31:57 np0005541913.localdomain sudo[84388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:31:57 np0005541913.localdomain sudo[84408]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf ip netns exec ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8 haproxy -f /var/lib/neutron/ovn-metadata-proxy/595e1c9b-709c-41d2-9212-0b18b13291a8.conf
Dec 02 08:31:57 np0005541913.localdomain sudo[84408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Dec 02 08:31:57 np0005541913.localdomain podman[84443]: 2025-12-02 08:31:57.49145496 +0000 UTC m=+0.029820745 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 02 08:31:57 np0005541913.localdomain podman[84443]: 2025-12-02 08:31:57.541297662 +0000 UTC m=+0.079663397 container create 7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:31:57 np0005541913.localdomain systemd[1]: Started libpod-conmon-7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0.scope.
Dec 02 08:31:57 np0005541913.localdomain systemd[1]: tmp-crun.d71t3a.mount: Deactivated successfully.
Dec 02 08:31:57 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:31:57 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f61ced3f88a0be87d665800e8e7cb17559a616ee2c3a746c87a603ddb5549d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 08:31:57 np0005541913.localdomain podman[84443]: 2025-12-02 08:31:57.609419003 +0000 UTC m=+0.147784738 container init 7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 02 08:31:57 np0005541913.localdomain podman[84443]: 2025-12-02 08:31:57.616208769 +0000 UTC m=+0.154574494 container start 7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, architecture=x86_64, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Dec 02 08:31:57 np0005541913.localdomain sudo[84408]: pam_unix(sudo:session): session closed for user root
Dec 02 08:31:57 np0005541913.localdomain sudo[84388]: pam_unix(sudo:session): session closed for user root
Dec 02 08:31:57 np0005541913.localdomain setroubleshoot[84359]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l c62ace7d-fc71-492d-8738-6cc52b8f8f8f
Dec 02 08:31:57 np0005541913.localdomain setroubleshoot[84359]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.
                                                                
                                                                *****  Plugin qemu_file_image (98.8 confidence) suggests   *******************
                                                                
                                                                If max_map_count is a virtualization target
                                                                Then you need to change the label on max_map_count'
                                                                Do
                                                                # semanage fcontext -a -t virt_image_t 'max_map_count'
                                                                # restorecon -v 'max_map_count'
                                                                
                                                                *****  Plugin catchall (2.13 confidence) suggests   **************************
                                                                
                                                                If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
                                                                Then you should report this as a bug.
                                                                You can generate a local policy module to allow this access.
                                                                Do
                                                                allow this access for now by executing:
                                                                # ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
                                                                # semodule -X 300 -i my-qemukvm.pp
                                                                
Dec 02 08:31:58 np0005541913.localdomain sudo[84483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:31:58 np0005541913.localdomain sudo[84483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:31:58 np0005541913.localdomain sudo[84483]: pam_unix(sudo:session): session closed for user root
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:32:02 np0005541913.localdomain podman[84499]: 2025-12-02 08:32:02.460847755 +0000 UTC m=+0.096385844 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:32:02 np0005541913.localdomain podman[84499]: 2025-12-02 08:32:02.469215704 +0000 UTC m=+0.104753793 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, version=17.1.12, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z)
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:32:02 np0005541913.localdomain podman[84514]: 2025-12-02 08:32:02.51409526 +0000 UTC m=+0.137317893 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:32:02 np0005541913.localdomain podman[84501]: 2025-12-02 08:32:02.566900903 +0000 UTC m=+0.197083446 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Dec 02 08:32:02 np0005541913.localdomain podman[84501]: 2025-12-02 08:32:02.606735282 +0000 UTC m=+0.236917825 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:32:02 np0005541913.localdomain podman[84502]: 2025-12-02 08:32:02.623034187 +0000 UTC m=+0.249978371 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 02 08:32:02 np0005541913.localdomain podman[84500]: 2025-12-02 08:32:02.669103697 +0000 UTC m=+0.304519593 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=nova_migration_target, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:32:02 np0005541913.localdomain podman[84502]: 2025-12-02 08:32:02.689021801 +0000 UTC m=+0.315965955 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com)
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:32:02 np0005541913.localdomain podman[84514]: 2025-12-02 08:32:02.744908608 +0000 UTC m=+0.368131231 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:32:02 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:32:03 np0005541913.localdomain podman[84500]: 2025-12-02 08:32:03.05604534 +0000 UTC m=+0.691461246 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4)
Dec 02 08:32:03 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:32:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:32:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:32:06 np0005541913.localdomain systemd[1]: tmp-crun.XjWpvF.mount: Deactivated successfully.
Dec 02 08:32:06 np0005541913.localdomain podman[84619]: 2025-12-02 08:32:06.443663831 +0000 UTC m=+0.077587971 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:32:06 np0005541913.localdomain systemd[1]: tmp-crun.wslz9a.mount: Deactivated successfully.
Dec 02 08:32:06 np0005541913.localdomain podman[84620]: 2025-12-02 08:32:06.504415292 +0000 UTC m=+0.135522905 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:32:06 np0005541913.localdomain podman[84619]: 2025-12-02 08:32:06.513030537 +0000 UTC m=+0.146954617 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 02 08:32:06 np0005541913.localdomain podman[84620]: 2025-12-02 08:32:06.524901081 +0000 UTC m=+0.156008664 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:32:06 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:32:06 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:32:07 np0005541913.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 02 08:32:07 np0005541913.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.012s CPU time.
Dec 02 08:32:07 np0005541913.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 02 08:32:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:32:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:32:14 np0005541913.localdomain podman[84667]: 2025-12-02 08:32:14.44107409 +0000 UTC m=+0.084285144 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, version=17.1.12, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 02 08:32:14 np0005541913.localdomain podman[84667]: 2025-12-02 08:32:14.453921081 +0000 UTC m=+0.097132125 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, 
com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:32:14 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:32:14 np0005541913.localdomain podman[84668]: 2025-12-02 08:32:14.537951497 +0000 UTC m=+0.178661022 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, release=1761123044, name=rhosp17/openstack-iscsid)
Dec 02 08:32:14 np0005541913.localdomain podman[84668]: 2025-12-02 08:32:14.573008106 +0000 UTC m=+0.213717671 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 02 08:32:14 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:32:15 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34900 [02/Dec/2025:08:32:14.483] listener listener/metadata 0/0/0/1171/1171 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Dec 02 08:32:15 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34908 [02/Dec/2025:08:32:15.733] listener listener/metadata 0/0/0/9/9 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Dec 02 08:32:15 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34918 [02/Dec/2025:08:32:15.780] listener listener/metadata 0/0/0/11/11 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Dec 02 08:32:15 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34924 [02/Dec/2025:08:32:15.826] listener listener/metadata 0/0/0/11/11 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Dec 02 08:32:15 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34940 [02/Dec/2025:08:32:15.876] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Dec 02 08:32:15 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34944 [02/Dec/2025:08:32:15.927] listener listener/metadata 0/0/0/12/12 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Dec 02 08:32:15 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34952 [02/Dec/2025:08:32:15.975] listener listener/metadata 0/0/0/12/12 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34954 [02/Dec/2025:08:32:16.030] listener listener/metadata 0/0/0/10/10 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34956 [02/Dec/2025:08:32:16.080] listener listener/metadata 0/0/0/8/8 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34962 [02/Dec/2025:08:32:16.127] listener listener/metadata 0/0/0/10/10 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34978 [02/Dec/2025:08:32:16.176] listener listener/metadata 0/0/0/12/12 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34986 [02/Dec/2025:08:32:16.216] listener listener/metadata 0/0/0/11/11 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:34990 [02/Dec/2025:08:32:16.296] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:35000 [02/Dec/2025:08:32:16.363] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:35002 [02/Dec/2025:08:32:16.427] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Dec 02 08:32:16 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[84474]: 192.168.0.102:35004 [02/Dec/2025:08:32:16.492] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Dec 02 08:32:20 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 02 08:32:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:32:24 np0005541913.localdomain podman[84704]: 2025-12-02 08:32:24.450854011 +0000 UTC m=+0.094543755 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 02 08:32:24 np0005541913.localdomain podman[84704]: 2025-12-02 08:32:24.648137721 +0000 UTC m=+0.291827475 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, 
build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:32:24 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:32:24 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:32:24 np0005541913.localdomain recover_tripleo_nova_virtqemud[84734]: 62312
Dec 02 08:32:24 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:32:24 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: tmp-crun.tcRnxu.mount: Deactivated successfully.
Dec 02 08:32:33 np0005541913.localdomain podman[84743]: 2025-12-02 08:32:33.477800616 +0000 UTC m=+0.101762472 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible)
Dec 02 08:32:33 np0005541913.localdomain podman[84735]: 2025-12-02 08:32:33.436802266 +0000 UTC m=+0.074641661 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, distribution-scope=public, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 02 08:32:33 np0005541913.localdomain podman[84735]: 2025-12-02 08:32:33.519995519 +0000 UTC m=+0.157834914 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public)
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:32:33 np0005541913.localdomain podman[84736]: 2025-12-02 08:32:33.535794131 +0000 UTC m=+0.168713211 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, container_name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:32:33 np0005541913.localdomain podman[84743]: 2025-12-02 08:32:33.56136664 +0000 UTC m=+0.185328466 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.12, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:32:33 np0005541913.localdomain podman[84737]: 2025-12-02 08:32:33.638357874 +0000 UTC m=+0.270540724 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:32:33 np0005541913.localdomain podman[84737]: 2025-12-02 08:32:33.66495754 +0000 UTC m=+0.297140400 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-type=git, version=17.1.12, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5)
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:32:33 np0005541913.localdomain podman[84755]: 2025-12-02 08:32:33.677982757 +0000 UTC m=+0.297451100 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12)
Dec 02 08:32:33 np0005541913.localdomain podman[84755]: 2025-12-02 08:32:33.701230742 +0000 UTC m=+0.320699125 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:32:33 np0005541913.localdomain podman[84736]: 2025-12-02 08:32:33.898299587 +0000 UTC m=+0.531218627 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true)
Dec 02 08:32:33 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:32:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:32:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:32:37 np0005541913.localdomain systemd[1]: tmp-crun.ODducB.mount: Deactivated successfully.
Dec 02 08:32:37 np0005541913.localdomain podman[84858]: 2025-12-02 08:32:37.4521149 +0000 UTC m=+0.090655948 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 02 08:32:37 np0005541913.localdomain systemd[1]: tmp-crun.L0A4wW.mount: Deactivated successfully.
Dec 02 08:32:37 np0005541913.localdomain podman[84859]: 2025-12-02 08:32:37.503307248 +0000 UTC m=+0.139017369 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com)
Dec 02 08:32:37 np0005541913.localdomain podman[84858]: 2025-12-02 08:32:37.522935415 +0000 UTC m=+0.161476513 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, version=17.1.12, container_name=ovn_metadata_agent, batch=17.1_20251118.1)
Dec 02 08:32:37 np0005541913.localdomain podman[84859]: 2025-12-02 08:32:37.530899713 +0000 UTC m=+0.166609794 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public)
Dec 02 08:32:37 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:32:37 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:32:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:32:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:32:45 np0005541913.localdomain podman[84906]: 2025-12-02 08:32:45.44364078 +0000 UTC m=+0.077836828 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, container_name=iscsid, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Dec 02 08:32:45 np0005541913.localdomain podman[84906]: 2025-12-02 08:32:45.479034526 +0000 UTC m=+0.113230554 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., container_name=iscsid, url=https://www.redhat.com)
Dec 02 08:32:45 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:32:45 np0005541913.localdomain podman[84905]: 2025-12-02 08:32:45.491793805 +0000 UTC m=+0.127959068 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 02 08:32:45 np0005541913.localdomain podman[84905]: 2025-12-02 08:32:45.530176305 +0000 UTC m=+0.166341628 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:32:45 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:32:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 08:32:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 08:32:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:32:55 np0005541913.localdomain systemd[1]: tmp-crun.0UiMFA.mount: Deactivated successfully.
Dec 02 08:32:55 np0005541913.localdomain podman[84990]: 2025-12-02 08:32:55.417760326 +0000 UTC m=+0.062044207 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1)
Dec 02 08:32:55 np0005541913.localdomain podman[84990]: 2025-12-02 08:32:55.572043942 +0000 UTC m=+0.216327823 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, architecture=x86_64, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:46Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Dec 02 08:32:55 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:32:58 np0005541913.localdomain sudo[85019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:32:58 np0005541913.localdomain sudo[85019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:32:58 np0005541913.localdomain sudo[85019]: pam_unix(sudo:session): session closed for user root
Dec 02 08:32:58 np0005541913.localdomain sudo[85034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:32:58 np0005541913.localdomain sudo[85034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:32:59 np0005541913.localdomain sudo[85034]: pam_unix(sudo:session): session closed for user root
Dec 02 08:32:59 np0005541913.localdomain sudo[85080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:32:59 np0005541913.localdomain sudo[85080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:32:59 np0005541913.localdomain sudo[85080]: pam_unix(sudo:session): session closed for user root
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:33:04 np0005541913.localdomain podman[85099]: 2025-12-02 08:33:04.457757977 +0000 UTC m=+0.085396154 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: tmp-crun.HjzqDL.mount: Deactivated successfully.
Dec 02 08:33:04 np0005541913.localdomain podman[85096]: 2025-12-02 08:33:04.507200538 +0000 UTC m=+0.138232419 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 02 08:33:04 np0005541913.localdomain podman[85098]: 2025-12-02 08:33:04.557820521 +0000 UTC m=+0.184977735 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4)
Dec 02 08:33:04 np0005541913.localdomain podman[85098]: 2025-12-02 08:33:04.613976836 +0000 UTC m=+0.241134010 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com)
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:33:04 np0005541913.localdomain podman[85099]: 2025-12-02 08:33:04.630365594 +0000 UTC m=+0.258003841 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:33:04 np0005541913.localdomain podman[85095]: 2025-12-02 08:33:04.617088541 +0000 UTC m=+0.247246587 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 02 08:33:04 np0005541913.localdomain podman[85095]: 2025-12-02 08:33:04.702990928 +0000 UTC m=+0.333148934 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=logrotate_crond)
Dec 02 08:33:04 np0005541913.localdomain podman[85097]: 2025-12-02 08:33:04.711516911 +0000 UTC m=+0.338979874 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step5, distribution-scope=public)
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:33:04 np0005541913.localdomain podman[85097]: 2025-12-02 08:33:04.76932491 +0000 UTC m=+0.396787853 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step5, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:33:04 np0005541913.localdomain podman[85096]: 2025-12-02 08:33:04.889092944 +0000 UTC m=+0.520124795 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z)
Dec 02 08:33:04 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:33:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:33:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:33:08 np0005541913.localdomain podman[85218]: 2025-12-02 08:33:08.457337803 +0000 UTC m=+0.091016169 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Dec 02 08:33:08 np0005541913.localdomain systemd[1]: tmp-crun.Y5q4x2.mount: Deactivated successfully.
Dec 02 08:33:08 np0005541913.localdomain podman[85219]: 2025-12-02 08:33:08.518268497 +0000 UTC m=+0.148481798 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible)
Dec 02 08:33:08 np0005541913.localdomain podman[85218]: 2025-12-02 08:33:08.534185382 +0000 UTC m=+0.167863728 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1)
Dec 02 08:33:08 np0005541913.localdomain podman[85219]: 2025-12-02 08:33:08.542998773 +0000 UTC m=+0.173212084 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, 
config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:33:08 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:33:08 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:33:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:33:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:33:16 np0005541913.localdomain podman[85269]: 2025-12-02 08:33:16.460663435 +0000 UTC m=+0.090095634 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 02 08:33:16 np0005541913.localdomain podman[85269]: 2025-12-02 08:33:16.500120363 +0000 UTC m=+0.129552552 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, version=17.1.12)
Dec 02 08:33:16 np0005541913.localdomain systemd[1]: tmp-crun.dftgzn.mount: Deactivated successfully.
Dec 02 08:33:16 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:33:16 np0005541913.localdomain podman[85268]: 2025-12-02 08:33:16.520381816 +0000 UTC m=+0.152262901 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container)
Dec 02 08:33:16 np0005541913.localdomain podman[85268]: 2025-12-02 08:33:16.53298551 +0000 UTC m=+0.164866665 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container)
Dec 02 08:33:16 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:33:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:33:26 np0005541913.localdomain podman[85306]: 2025-12-02 08:33:26.454345847 +0000 UTC m=+0.093219498 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 02 08:33:26 np0005541913.localdomain podman[85306]: 2025-12-02 08:33:26.651108643 +0000 UTC m=+0.289982294 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true)
Dec 02 08:33:26 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:33:35 np0005541913.localdomain podman[85336]: 2025-12-02 08:33:35.44611011 +0000 UTC m=+0.088747446 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 02 08:33:35 np0005541913.localdomain podman[85336]: 2025-12-02 08:33:35.454937451 +0000 UTC m=+0.097574787 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: tmp-crun.XGUMgP.mount: Deactivated successfully.
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:33:35 np0005541913.localdomain podman[85338]: 2025-12-02 08:33:35.493468034 +0000 UTC m=+0.126186469 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-type=git, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute)
Dec 02 08:33:35 np0005541913.localdomain podman[85345]: 2025-12-02 08:33:35.467179105 +0000 UTC m=+0.097024831 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:33:35 np0005541913.localdomain podman[85344]: 2025-12-02 08:33:35.522428645 +0000 UTC m=+0.154226395 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public)
Dec 02 08:33:35 np0005541913.localdomain podman[85345]: 2025-12-02 08:33:35.547824269 +0000 UTC m=+0.177669995 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:33:35 np0005541913.localdomain podman[85344]: 2025-12-02 08:33:35.572883304 +0000 UTC m=+0.204680974 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, version=17.1.12, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute)
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:33:35 np0005541913.localdomain podman[85338]: 2025-12-02 08:33:35.62322195 +0000 UTC m=+0.255940395 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:33:35 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:33:35 np0005541913.localdomain podman[85337]: 2025-12-02 08:33:35.704912042 +0000 UTC m=+0.343359143 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:36:58Z, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 
nova-compute, container_name=nova_migration_target, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 02 08:33:36 np0005541913.localdomain podman[85337]: 2025-12-02 08:33:36.128925209 +0000 UTC m=+0.767372290 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, 
distribution-scope=public, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:33:36 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:33:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:33:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:33:39 np0005541913.localdomain systemd[1]: tmp-crun.DbO9s4.mount: Deactivated successfully.
Dec 02 08:33:39 np0005541913.localdomain podman[85450]: 2025-12-02 08:33:39.466637047 +0000 UTC m=+0.106247025 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 02 08:33:39 np0005541913.localdomain podman[85450]: 2025-12-02 08:33:39.496991706 +0000 UTC m=+0.136601654 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 02 08:33:39 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:33:39 np0005541913.localdomain podman[85449]: 2025-12-02 08:33:39.542744646 +0000 UTC m=+0.186095526 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, architecture=x86_64)
Dec 02 08:33:39 np0005541913.localdomain podman[85449]: 2025-12-02 08:33:39.585061662 +0000 UTC m=+0.228412592 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:33:39 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:33:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:33:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:33:47 np0005541913.localdomain podman[85497]: 2025-12-02 08:33:47.450247728 +0000 UTC m=+0.086725451 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:33:47 np0005541913.localdomain podman[85497]: 2025-12-02 08:33:47.46059297 +0000 UTC m=+0.097070753 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=)
Dec 02 08:33:47 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:33:47 np0005541913.localdomain systemd[1]: tmp-crun.YhGCvL.mount: Deactivated successfully.
Dec 02 08:33:47 np0005541913.localdomain podman[85498]: 2025-12-02 08:33:47.555532574 +0000 UTC m=+0.188118211 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:33:47 np0005541913.localdomain podman[85498]: 2025-12-02 08:33:47.564683775 +0000 UTC m=+0.197269392 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1)
Dec 02 08:33:47 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:33:56 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:33:56 np0005541913.localdomain recover_tripleo_nova_virtqemud[85582]: 62312
Dec 02 08:33:56 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:33:56 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:33:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:33:57 np0005541913.localdomain podman[85583]: 2025-12-02 08:33:57.429921473 +0000 UTC m=+0.066240930 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, 
build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=)
Dec 02 08:33:57 np0005541913.localdomain podman[85583]: 2025-12-02 08:33:57.616186964 +0000 UTC m=+0.252506431 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc.)
Dec 02 08:33:57 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:34:00 np0005541913.localdomain sudo[85612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:34:00 np0005541913.localdomain sudo[85612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:34:00 np0005541913.localdomain sudo[85612]: pam_unix(sudo:session): session closed for user root
Dec 02 08:34:00 np0005541913.localdomain sudo[85627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:34:00 np0005541913.localdomain sudo[85627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:34:00 np0005541913.localdomain sudo[85627]: pam_unix(sudo:session): session closed for user root
Dec 02 08:34:01 np0005541913.localdomain sudo[85673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:34:01 np0005541913.localdomain sudo[85673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:34:01 np0005541913.localdomain sudo[85673]: pam_unix(sudo:session): session closed for user root
Dec 02 08:34:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:34:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:34:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:34:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:34:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:34:06 np0005541913.localdomain podman[85689]: 2025-12-02 08:34:06.470660815 +0000 UTC m=+0.103420667 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1761123044, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team)
Dec 02 08:34:06 np0005541913.localdomain podman[85689]: 2025-12-02 08:34:06.505129937 +0000 UTC m=+0.137889799 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, distribution-scope=public, container_name=logrotate_crond, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:34:06 np0005541913.localdomain podman[85691]: 2025-12-02 08:34:06.517478884 +0000 UTC m=+0.145250560 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4)
Dec 02 08:34:06 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:34:06 np0005541913.localdomain podman[85691]: 2025-12-02 08:34:06.56529676 +0000 UTC m=+0.193068466 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Dec 02 08:34:06 np0005541913.localdomain podman[85692]: 2025-12-02 08:34:06.565363022 +0000 UTC m=+0.187687700 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:34:06 np0005541913.localdomain podman[85692]: 2025-12-02 08:34:06.615273466 +0000 UTC m=+0.237598114 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute)
Dec 02 08:34:06 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:34:06 np0005541913.localdomain podman[85690]: 2025-12-02 08:34:06.633922046 +0000 UTC m=+0.265075185 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:34:07 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:34:07 np0005541913.localdomain podman[85690]: 2025-12-02 08:34:07.040140437 +0000 UTC m=+0.671293576 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, 
name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:34:07 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:34:07 np0005541913.localdomain podman[85704]: 2025-12-02 08:34:07.070159937 +0000 UTC m=+0.689760990 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, vcs-type=git, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:34:07 np0005541913.localdomain podman[85704]: 2025-12-02 08:34:07.088493768 +0000 UTC m=+0.708094881 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12)
Dec 02 08:34:07 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:34:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:34:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:34:10 np0005541913.localdomain podman[85809]: 2025-12-02 08:34:10.452731329 +0000 UTC m=+0.092864879 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:34:10 np0005541913.localdomain podman[85810]: 2025-12-02 08:34:10.509046368 +0000 UTC m=+0.148054077 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Dec 02 08:34:10 np0005541913.localdomain podman[85810]: 2025-12-02 08:34:10.530443343 +0000 UTC m=+0.169451022 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, distribution-scope=public, release=1761123044, version=17.1.12, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:34:10 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:34:10 np0005541913.localdomain podman[85809]: 2025-12-02 08:34:10.582600137 +0000 UTC m=+0.222733687 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:34:10 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:34:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:34:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:34:18 np0005541913.localdomain podman[85857]: 2025-12-02 08:34:18.498922762 +0000 UTC m=+0.130064045 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:34:18 np0005541913.localdomain podman[85857]: 2025-12-02 08:34:18.50871304 +0000 UTC m=+0.139854303 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid)
Dec 02 08:34:18 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:34:18 np0005541913.localdomain podman[85856]: 2025-12-02 08:34:18.607925311 +0000 UTC m=+0.239603098 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 08:34:18 np0005541913.localdomain podman[85856]: 2025-12-02 08:34:18.618203352 +0000 UTC m=+0.249881159 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.expose-services=)
Dec 02 08:34:18 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:34:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:34:28 np0005541913.localdomain systemd[1]: tmp-crun.rg8n1J.mount: Deactivated successfully.
Dec 02 08:34:28 np0005541913.localdomain podman[85896]: 2025-12-02 08:34:28.465747279 +0000 UTC m=+0.109948776 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 02 08:34:28 np0005541913.localdomain podman[85896]: 2025-12-02 08:34:28.662513375 +0000 UTC m=+0.306714852 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:34:28 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:34:37 np0005541913.localdomain podman[85925]: 2025-12-02 08:34:37.534691669 +0000 UTC m=+0.167468587 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=logrotate_crond, name=rhosp17/openstack-cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:34:37 np0005541913.localdomain podman[85925]: 2025-12-02 08:34:37.56509011 +0000 UTC m=+0.197867038 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=logrotate_crond, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:34:37 np0005541913.localdomain podman[85927]: 2025-12-02 08:34:37.61888102 +0000 UTC m=+0.244680797 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step5, version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:34:37 np0005541913.localdomain podman[85926]: 2025-12-02 08:34:37.671140528 +0000 UTC m=+0.302652682 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 02 08:34:37 np0005541913.localdomain podman[85927]: 2025-12-02 08:34:37.691661698 +0000 UTC m=+0.317461415 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, container_name=nova_compute, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc.)
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:34:37 np0005541913.localdomain podman[85939]: 2025-12-02 08:34:37.781834613 +0000 UTC m=+0.402512810 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public)
Dec 02 08:34:37 np0005541913.localdomain podman[85939]: 2025-12-02 08:34:37.806896067 +0000 UTC m=+0.427574334 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:34:37 np0005541913.localdomain podman[85928]: 2025-12-02 08:34:37.822251807 +0000 UTC m=+0.445355081 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:34:37 np0005541913.localdomain podman[85928]: 2025-12-02 08:34:37.874118244 +0000 UTC m=+0.497221508 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Dec 02 08:34:37 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:34:38 np0005541913.localdomain podman[85926]: 2025-12-02 08:34:38.038238499 +0000 UTC m=+0.669750613 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:34:38 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:34:38 np0005541913.localdomain systemd[1]: tmp-crun.hbPxII.mount: Deactivated successfully.
Dec 02 08:34:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:34:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:34:41 np0005541913.localdomain systemd[1]: tmp-crun.PzMkJN.mount: Deactivated successfully.
Dec 02 08:34:41 np0005541913.localdomain podman[86043]: 2025-12-02 08:34:41.474766387 +0000 UTC m=+0.083759340 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 02 08:34:41 np0005541913.localdomain podman[86043]: 2025-12-02 08:34:41.490934109 +0000 UTC m=+0.099927022 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, 
Inc., config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:34:41 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:34:41 np0005541913.localdomain podman[86042]: 2025-12-02 08:34:41.577502723 +0000 UTC m=+0.190684781 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:34:41 np0005541913.localdomain podman[86042]: 2025-12-02 08:34:41.615360618 +0000 UTC m=+0.228542616 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:34:41 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:34:42 np0005541913.localdomain systemd[1]: tmp-crun.9QhACE.mount: Deactivated successfully.
Dec 02 08:34:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:34:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:34:49 np0005541913.localdomain systemd[1]: tmp-crun.bkOuTZ.mount: Deactivated successfully.
Dec 02 08:34:49 np0005541913.localdomain systemd[1]: tmp-crun.AMeEHK.mount: Deactivated successfully.
Dec 02 08:34:49 np0005541913.localdomain podman[86094]: 2025-12-02 08:34:49.496293385 +0000 UTC m=+0.134943878 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T23:44:13Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com)
Dec 02 08:34:49 np0005541913.localdomain podman[86093]: 2025-12-02 08:34:49.461932357 +0000 UTC m=+0.104769974 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:51:28Z, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3)
Dec 02 08:34:49 np0005541913.localdomain podman[86094]: 2025-12-02 08:34:49.530098469 +0000 UTC m=+0.168748972 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible)
Dec 02 08:34:49 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:34:49 np0005541913.localdomain podman[86093]: 2025-12-02 08:34:49.543547837 +0000 UTC m=+0.186385454 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., container_name=collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:34:49 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:34:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:34:59 np0005541913.localdomain systemd[1]: tmp-crun.r9eXfC.mount: Deactivated successfully.
Dec 02 08:34:59 np0005541913.localdomain podman[86179]: 2025-12-02 08:34:59.435531722 +0000 UTC m=+0.080354126 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Dec 02 08:34:59 np0005541913.localdomain podman[86179]: 2025-12-02 08:34:59.637187793 +0000 UTC m=+0.282010237 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, distribution-scope=public)
Dec 02 08:34:59 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:35:01 np0005541913.localdomain sudo[86208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:35:01 np0005541913.localdomain sudo[86208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:35:01 np0005541913.localdomain sudo[86208]: pam_unix(sudo:session): session closed for user root
Dec 02 08:35:01 np0005541913.localdomain sudo[86223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:35:01 np0005541913.localdomain sudo[86223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:35:02 np0005541913.localdomain sudo[86223]: pam_unix(sudo:session): session closed for user root
Dec 02 08:35:03 np0005541913.localdomain sudo[86270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:35:03 np0005541913.localdomain sudo[86270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:35:03 np0005541913.localdomain sudo[86270]: pam_unix(sudo:session): session closed for user root
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:35:08 np0005541913.localdomain podman[86286]: 2025-12-02 08:35:08.458182337 +0000 UTC m=+0.088118069 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: tmp-crun.0eowMB.mount: Deactivated successfully.
Dec 02 08:35:08 np0005541913.localdomain podman[86298]: 2025-12-02 08:35:08.510013322 +0000 UTC m=+0.133473577 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:35:08 np0005541913.localdomain podman[86298]: 2025-12-02 08:35:08.542909302 +0000 UTC m=+0.166369577 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc.)
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:35:08 np0005541913.localdomain podman[86287]: 2025-12-02 08:35:08.561009967 +0000 UTC m=+0.190151078 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step5, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12)
Dec 02 08:35:08 np0005541913.localdomain podman[86285]: 2025-12-02 08:35:08.609052989 +0000 UTC m=+0.241472489 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vcs-type=git, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, 
distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron)
Dec 02 08:35:08 np0005541913.localdomain podman[86285]: 2025-12-02 08:35:08.618772864 +0000 UTC m=+0.251192284 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:49:32Z)
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:35:08 np0005541913.localdomain podman[86288]: 2025-12-02 08:35:08.661911804 +0000 UTC m=+0.286982854 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc.)
Dec 02 08:35:08 np0005541913.localdomain podman[86287]: 2025-12-02 08:35:08.662918901 +0000 UTC m=+0.292060012 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5)
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:35:08 np0005541913.localdomain podman[86288]: 2025-12-02 08:35:08.745101576 +0000 UTC m=+0.370172617 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute)
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:35:08 np0005541913.localdomain podman[86286]: 2025-12-02 08:35:08.874076721 +0000 UTC m=+0.504012423 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, 
release=1761123044, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Dec 02 08:35:08 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:35:09 np0005541913.localdomain systemd[1]: tmp-crun.E6E5zX.mount: Deactivated successfully.
Dec 02 08:35:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:35:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:35:12 np0005541913.localdomain systemd[1]: tmp-crun.9iJfQN.mount: Deactivated successfully.
Dec 02 08:35:12 np0005541913.localdomain podman[86405]: 2025-12-02 08:35:12.46173889 +0000 UTC m=+0.092249852 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:35:12 np0005541913.localdomain podman[86405]: 2025-12-02 08:35:12.488029878 +0000 UTC m=+0.118540890 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_controller, tcib_managed=true, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, release=1761123044, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:35:12 np0005541913.localdomain systemd[1]: tmp-crun.dkyVEY.mount: Deactivated successfully.
Dec 02 08:35:12 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:35:12 np0005541913.localdomain podman[86404]: 2025-12-02 08:35:12.503556822 +0000 UTC m=+0.138573817 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Dec 02 08:35:12 np0005541913.localdomain podman[86404]: 2025-12-02 08:35:12.545905779 +0000 UTC m=+0.180922764 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, tcib_managed=true, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=)
Dec 02 08:35:12 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:35:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:35:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:35:20 np0005541913.localdomain systemd[1]: tmp-crun.nHRQ1X.mount: Deactivated successfully.
Dec 02 08:35:20 np0005541913.localdomain podman[86451]: 2025-12-02 08:35:20.466712587 +0000 UTC m=+0.090127575 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 02 08:35:20 np0005541913.localdomain podman[86451]: 2025-12-02 08:35:20.507125721 +0000 UTC m=+0.130540729 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Dec 02 08:35:20 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:35:20 np0005541913.localdomain podman[86452]: 2025-12-02 08:35:20.51806512 +0000 UTC m=+0.136679586 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 02 08:35:20 np0005541913.localdomain podman[86452]: 2025-12-02 08:35:20.606132486 +0000 UTC m=+0.224746932 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, version=17.1.12, tcib_managed=true)
Dec 02 08:35:20 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:35:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:35:30 np0005541913.localdomain podman[86490]: 2025-12-02 08:35:30.437048109 +0000 UTC m=+0.081073827 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z)
Dec 02 08:35:30 np0005541913.localdomain podman[86490]: 2025-12-02 08:35:30.656071054 +0000 UTC m=+0.300096792 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 02 08:35:30 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: tmp-crun.eo1gh5.mount: Deactivated successfully.
Dec 02 08:35:39 np0005541913.localdomain podman[86520]: 2025-12-02 08:35:39.487501784 +0000 UTC m=+0.123527977 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Dec 02 08:35:39 np0005541913.localdomain podman[86519]: 2025-12-02 08:35:39.431228476 +0000 UTC m=+0.073524160 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container)
Dec 02 08:35:39 np0005541913.localdomain podman[86528]: 2025-12-02 08:35:39.463440677 +0000 UTC m=+0.091468201 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:35:39 np0005541913.localdomain podman[86519]: 2025-12-02 08:35:39.532892105 +0000 UTC m=+0.175187759 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:35:39 np0005541913.localdomain podman[86533]: 2025-12-02 08:35:39.574241354 +0000 UTC m=+0.194337041 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:35:39 np0005541913.localdomain podman[86528]: 2025-12-02 08:35:39.59786444 +0000 UTC m=+0.225891984 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, 
Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute)
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:35:39 np0005541913.localdomain podman[86526]: 2025-12-02 08:35:39.547984577 +0000 UTC m=+0.178673724 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:35:39 np0005541913.localdomain podman[86533]: 2025-12-02 08:35:39.656856672 +0000 UTC m=+0.276952379 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, 
name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:35:39 np0005541913.localdomain podman[86526]: 2025-12-02 08:35:39.677530897 +0000 UTC m=+0.308220054 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:35:39 np0005541913.localdomain podman[86520]: 2025-12-02 08:35:39.851953733 +0000 UTC m=+0.487979986 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.12, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 02 08:35:39 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:35:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:35:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:35:43 np0005541913.localdomain podman[86642]: 2025-12-02 08:35:43.428382933 +0000 UTC m=+0.064172255 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, architecture=x86_64)
Dec 02 08:35:43 np0005541913.localdomain podman[86642]: 2025-12-02 08:35:43.456986985 +0000 UTC m=+0.092776317 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, release=1761123044, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 02 08:35:43 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:35:43 np0005541913.localdomain systemd[1]: tmp-crun.qYgwrx.mount: Deactivated successfully.
Dec 02 08:35:43 np0005541913.localdomain podman[86641]: 2025-12-02 08:35:43.548894926 +0000 UTC m=+0.183909556 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:35:43 np0005541913.localdomain podman[86641]: 2025-12-02 08:35:43.58707882 +0000 UTC m=+0.222093500 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_id=tripleo_step4, version=17.1.12, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 02 08:35:43 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:35:51 np0005541913.localdomain recover_tripleo_nova_virtqemud[86697]: 62312
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: tmp-crun.9YrZLb.mount: Deactivated successfully.
Dec 02 08:35:51 np0005541913.localdomain podman[86690]: 2025-12-02 08:35:51.438707997 +0000 UTC m=+0.083313658 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: tmp-crun.qlfYd8.mount: Deactivated successfully.
Dec 02 08:35:51 np0005541913.localdomain podman[86689]: 2025-12-02 08:35:51.473742334 +0000 UTC m=+0.122049687 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:35:51 np0005541913.localdomain podman[86689]: 2025-12-02 08:35:51.482075942 +0000 UTC m=+0.130383295 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
container_name=collectd, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd)
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:35:51 np0005541913.localdomain podman[86690]: 2025-12-02 08:35:51.498081169 +0000 UTC m=+0.142686810 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, tcib_managed=true, maintainer=OpenStack TripleO 
Team, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible)
Dec 02 08:35:51 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:36:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:36:01 np0005541913.localdomain systemd[1]: tmp-crun.XMMfST.mount: Deactivated successfully.
Dec 02 08:36:01 np0005541913.localdomain podman[86775]: 2025-12-02 08:36:01.446546063 +0000 UTC m=+0.079651857 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, version=17.1.12, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Dec 02 08:36:01 np0005541913.localdomain podman[86775]: 2025-12-02 08:36:01.64588971 +0000 UTC m=+0.278995454 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64)
Dec 02 08:36:01 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:36:03 np0005541913.localdomain sudo[86804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:36:03 np0005541913.localdomain sudo[86804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:36:03 np0005541913.localdomain sudo[86804]: pam_unix(sudo:session): session closed for user root
Dec 02 08:36:03 np0005541913.localdomain sudo[86819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:36:03 np0005541913.localdomain sudo[86819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:36:04 np0005541913.localdomain sudo[86819]: pam_unix(sudo:session): session closed for user root
Dec 02 08:36:04 np0005541913.localdomain sudo[86865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:36:04 np0005541913.localdomain sudo[86865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:36:04 np0005541913.localdomain sudo[86865]: pam_unix(sudo:session): session closed for user root
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:36:10 np0005541913.localdomain podman[86880]: 2025-12-02 08:36:10.436749051 +0000 UTC m=+0.073859269 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, release=1761123044, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: tmp-crun.umNOmG.mount: Deactivated successfully.
Dec 02 08:36:10 np0005541913.localdomain podman[86894]: 2025-12-02 08:36:10.508344698 +0000 UTC m=+0.131861324 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 02 08:36:10 np0005541913.localdomain podman[86882]: 2025-12-02 08:36:10.464965922 +0000 UTC m=+0.093252089 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 02 08:36:10 np0005541913.localdomain podman[86894]: 2025-12-02 08:36:10.529904919 +0000 UTC m=+0.153421555 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com)
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:36:10 np0005541913.localdomain podman[86881]: 2025-12-02 08:36:10.490746316 +0000 UTC m=+0.126095267 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:36:10 np0005541913.localdomain podman[86882]: 2025-12-02 08:36:10.546969069 +0000 UTC m=+0.175255206 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:36:10 np0005541913.localdomain podman[86888]: 2025-12-02 08:36:10.587576771 +0000 UTC m=+0.216145186 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, 
com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true)
Dec 02 08:36:10 np0005541913.localdomain podman[86888]: 2025-12-02 08:36:10.612067128 +0000 UTC m=+0.240635533 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:36:10 np0005541913.localdomain podman[86880]: 2025-12-02 08:36:10.623215929 +0000 UTC m=+0.260326157 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:36:10 np0005541913.localdomain podman[86881]: 2025-12-02 08:36:10.857127407 +0000 UTC m=+0.492476318 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:36:10 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:36:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:36:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:36:14 np0005541913.localdomain podman[87002]: 2025-12-02 08:36:14.408249633 +0000 UTC m=+0.052455662 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, build-date=2025-11-19T00:14:25Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 02 08:36:14 np0005541913.localdomain podman[87002]: 2025-12-02 08:36:14.454963099 +0000 UTC m=+0.099169148 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 02 08:36:14 np0005541913.localdomain systemd[1]: tmp-crun.T9HKY7.mount: Deactivated successfully.
Dec 02 08:36:14 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:36:14 np0005541913.localdomain podman[87003]: 2025-12-02 08:36:14.471947097 +0000 UTC m=+0.114821662 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1)
Dec 02 08:36:14 np0005541913.localdomain podman[87003]: 2025-12-02 08:36:14.491984081 +0000 UTC m=+0.134858676 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:36:14 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:36:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:36:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:36:22 np0005541913.localdomain podman[87050]: 2025-12-02 08:36:22.426128074 +0000 UTC m=+0.068089145 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:36:22 np0005541913.localdomain podman[87050]: 2025-12-02 08:36:22.464046009 +0000 UTC m=+0.106007000 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd)
Dec 02 08:36:22 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:36:22 np0005541913.localdomain podman[87051]: 2025-12-02 08:36:22.531297202 +0000 UTC m=+0.168887944 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:36:22 np0005541913.localdomain podman[87051]: 2025-12-02 08:36:22.541882628 +0000 UTC m=+0.179473370 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, release=1761123044, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:36:22 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:36:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:36:32 np0005541913.localdomain podman[87088]: 2025-12-02 08:36:32.435482733 +0000 UTC m=+0.080254182 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=metrics_qdr, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:36:32 np0005541913.localdomain podman[87088]: 2025-12-02 08:36:32.630923654 +0000 UTC m=+0.275695033 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:36:32 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:36:41 np0005541913.localdomain podman[87118]: 2025-12-02 08:36:41.450773984 +0000 UTC m=+0.085861022 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: tmp-crun.T3lGnH.mount: Deactivated successfully.
Dec 02 08:36:41 np0005541913.localdomain podman[87121]: 2025-12-02 08:36:41.512451597 +0000 UTC m=+0.142051986 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 02 08:36:41 np0005541913.localdomain podman[87119]: 2025-12-02 08:36:41.548065874 +0000 UTC m=+0.181779498 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute)
Dec 02 08:36:41 np0005541913.localdomain podman[87121]: 2025-12-02 08:36:41.562001035 +0000 UTC m=+0.191601424 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., 
tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:36:41 np0005541913.localdomain podman[87117]: 2025-12-02 08:36:41.602636717 +0000 UTC m=+0.238918995 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.component=openstack-cron-container, version=17.1.12, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:36:41 np0005541913.localdomain podman[87119]: 2025-12-02 08:36:41.616944429 +0000 UTC m=+0.250658063 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z)
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:36:41 np0005541913.localdomain podman[87120]: 2025-12-02 08:36:41.665006678 +0000 UTC m=+0.295263455 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Dec 02 08:36:41 np0005541913.localdomain podman[87117]: 2025-12-02 08:36:41.692463929 +0000 UTC m=+0.328746287 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=logrotate_crond, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container)
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:36:41 np0005541913.localdomain podman[87120]: 2025-12-02 08:36:41.71711956 +0000 UTC m=+0.347376337 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:36:41 np0005541913.localdomain podman[87118]: 2025-12-02 08:36:41.770372131 +0000 UTC m=+0.405459199 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4)
Dec 02 08:36:41 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:36:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:36:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:36:45 np0005541913.localdomain systemd[1]: tmp-crun.kZjPd7.mount: Deactivated successfully.
Dec 02 08:36:45 np0005541913.localdomain podman[87237]: 2025-12-02 08:36:45.425644108 +0000 UTC m=+0.067104591 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:36:45 np0005541913.localdomain podman[87237]: 2025-12-02 08:36:45.447948669 +0000 UTC m=+0.089409142 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ovn_controller, release=1761123044)
Dec 02 08:36:45 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:36:45 np0005541913.localdomain podman[87236]: 2025-12-02 08:36:45.531483113 +0000 UTC m=+0.174601537 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:36:45 np0005541913.localdomain podman[87236]: 2025-12-02 08:36:45.587006911 +0000 UTC m=+0.230125295 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:36:45 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:36:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:36:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:36:53 np0005541913.localdomain systemd[1]: tmp-crun.7R4Tdc.mount: Deactivated successfully.
Dec 02 08:36:53 np0005541913.localdomain podman[87308]: 2025-12-02 08:36:53.447400726 +0000 UTC m=+0.091714510 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, container_name=collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com)
Dec 02 08:36:53 np0005541913.localdomain podman[87308]: 2025-12-02 08:36:53.481430733 +0000 UTC m=+0.125744517 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, container_name=collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, release=1761123044)
Dec 02 08:36:53 np0005541913.localdomain podman[87309]: 2025-12-02 08:36:53.493290581 +0000 UTC m=+0.134024025 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, distribution-scope=public)
Dec 02 08:36:53 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:36:53 np0005541913.localdomain podman[87309]: 2025-12-02 08:36:53.502446092 +0000 UTC m=+0.143179526 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:36:53 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:36:54 np0005541913.localdomain systemd[1]: tmp-crun.GQBwYt.mount: Deactivated successfully.
Dec 02 08:37:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:37:03 np0005541913.localdomain systemd[84191]: Created slice User Background Tasks Slice.
Dec 02 08:37:03 np0005541913.localdomain podman[87368]: 2025-12-02 08:37:03.437749687 +0000 UTC m=+0.077893483 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 02 08:37:03 np0005541913.localdomain systemd[84191]: Starting Cleanup of User's Temporary Files and Directories...
Dec 02 08:37:03 np0005541913.localdomain systemd[84191]: Finished Cleanup of User's Temporary Files and Directories.
Dec 02 08:37:03 np0005541913.localdomain podman[87368]: 2025-12-02 08:37:03.662032633 +0000 UTC m=+0.302176379 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:37:03 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:37:05 np0005541913.localdomain sudo[87397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:37:05 np0005541913.localdomain sudo[87397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:37:05 np0005541913.localdomain sudo[87397]: pam_unix(sudo:session): session closed for user root
Dec 02 08:37:05 np0005541913.localdomain sudo[87412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:37:05 np0005541913.localdomain sudo[87412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:37:05 np0005541913.localdomain sudo[87412]: pam_unix(sudo:session): session closed for user root
Dec 02 08:37:08 np0005541913.localdomain sudo[87459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:37:08 np0005541913.localdomain sudo[87459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:37:08 np0005541913.localdomain sudo[87459]: pam_unix(sudo:session): session closed for user root
Dec 02 08:37:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:37:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 4776 writes, 21K keys, 4776 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4776 writes, 569 syncs, 8.39 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 341 writes, 1463 keys, 341 commit groups, 1.0 writes per commit group, ingest: 1.82 MB, 0.00 MB/s
                                                          Interval WAL: 341 writes, 122 syncs, 2.80 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:37:12 np0005541913.localdomain podman[87474]: 2025-12-02 08:37:12.467119204 +0000 UTC m=+0.102565934 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=)
Dec 02 08:37:12 np0005541913.localdomain podman[87474]: 2025-12-02 08:37:12.476962792 +0000 UTC m=+0.112409552 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=logrotate_crond, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, 
release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container)
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:37:12 np0005541913.localdomain podman[87485]: 2025-12-02 08:37:12.51503942 +0000 UTC m=+0.137518193 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: tmp-crun.ll5dEb.mount: Deactivated successfully.
Dec 02 08:37:12 np0005541913.localdomain podman[87485]: 2025-12-02 08:37:12.565248745 +0000 UTC m=+0.187727508 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:37:12 np0005541913.localdomain podman[87476]: 2025-12-02 08:37:12.5650647 +0000 UTC m=+0.195760700 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step5, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:37:12 np0005541913.localdomain podman[87475]: 2025-12-02 08:37:12.622090065 +0000 UTC m=+0.255285268 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Dec 02 08:37:12 np0005541913.localdomain podman[87477]: 2025-12-02 08:37:12.681993314 +0000 UTC m=+0.306471197 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team)
Dec 02 08:37:12 np0005541913.localdomain podman[87476]: 2025-12-02 08:37:12.700325345 +0000 UTC m=+0.331021415 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:37:12 np0005541913.localdomain podman[87477]: 2025-12-02 08:37:12.737514722 +0000 UTC m=+0.361992595 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:37:12 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:37:12 np0005541913.localdomain podman[87475]: 2025-12-02 08:37:12.994146673 +0000 UTC m=+0.627341916 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, version=17.1.12, konflux.additional-tags=17.1.12 
17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Dec 02 08:37:13 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:37:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:37:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.2 total, 600.0 interval
                                                          Cumulative writes: 5722 writes, 25K keys, 5722 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5722 writes, 780 syncs, 7.34 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 546 writes, 2239 keys, 546 commit groups, 1.0 writes per commit group, ingest: 2.75 MB, 0.00 MB/s
                                                          Interval WAL: 546 writes, 172 syncs, 3.17 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:37:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:37:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:37:16 np0005541913.localdomain systemd[1]: tmp-crun.lxwVjc.mount: Deactivated successfully.
Dec 02 08:37:16 np0005541913.localdomain podman[87596]: 2025-12-02 08:37:16.459947958 +0000 UTC m=+0.098937001 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true)
Dec 02 08:37:16 np0005541913.localdomain podman[87596]: 2025-12-02 08:37:16.503253959 +0000 UTC m=+0.142242982 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public)
Dec 02 08:37:16 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:37:16 np0005541913.localdomain systemd[1]: tmp-crun.WWD2yb.mount: Deactivated successfully.
Dec 02 08:37:16 np0005541913.localdomain podman[87597]: 2025-12-02 08:37:16.55651127 +0000 UTC m=+0.196807776 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc.)
Dec 02 08:37:16 np0005541913.localdomain podman[87597]: 2025-12-02 08:37:16.578696948 +0000 UTC m=+0.218993454 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 08:37:16 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:37:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:37:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:37:24 np0005541913.localdomain podman[87644]: 2025-12-02 08:37:24.450791989 +0000 UTC m=+0.087507703 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z)
Dec 02 08:37:24 np0005541913.localdomain podman[87644]: 2025-12-02 08:37:24.489157385 +0000 UTC m=+0.125873109 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd)
Dec 02 08:37:24 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:37:24 np0005541913.localdomain podman[87645]: 2025-12-02 08:37:24.506632895 +0000 UTC m=+0.138960659 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:37:24 np0005541913.localdomain podman[87645]: 2025-12-02 08:37:24.518048553 +0000 UTC m=+0.150376357 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true)
Dec 02 08:37:24 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:37:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:37:34 np0005541913.localdomain systemd[1]: tmp-crun.aTFu4L.mount: Deactivated successfully.
Dec 02 08:37:34 np0005541913.localdomain podman[87684]: 2025-12-02 08:37:34.455146381 +0000 UTC m=+0.097478595 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git)
Dec 02 08:37:34 np0005541913.localdomain podman[87684]: 2025-12-02 08:37:34.65727012 +0000 UTC m=+0.299602284 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-type=git, 
url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, container_name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:37:34 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:37:36 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:37:36 np0005541913.localdomain recover_tripleo_nova_virtqemud[87714]: 62312
Dec 02 08:37:36 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:37:36 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:37:43 np0005541913.localdomain podman[87716]: 2025-12-02 08:37:43.458794359 +0000 UTC m=+0.093515665 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.expose-services=)
Dec 02 08:37:43 np0005541913.localdomain podman[87727]: 2025-12-02 08:37:43.486363993 +0000 UTC m=+0.107202329 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, 
container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible)
Dec 02 08:37:43 np0005541913.localdomain podman[87715]: 2025-12-02 08:37:43.509844854 +0000 UTC m=+0.146326715 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:37:43 np0005541913.localdomain podman[87715]: 2025-12-02 08:37:43.518114572 +0000 UTC m=+0.154596433 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc.)
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:37:43 np0005541913.localdomain podman[87727]: 2025-12-02 08:37:43.573454386 +0000 UTC m=+0.194292722 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, release=1761123044, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z)
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:37:43 np0005541913.localdomain podman[87717]: 2025-12-02 08:37:43.6526485 +0000 UTC m=+0.285344746 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, release=1761123044, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Dec 02 08:37:43 np0005541913.localdomain podman[87717]: 2025-12-02 08:37:43.70388926 +0000 UTC m=+0.336585526 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:37:43 np0005541913.localdomain podman[87718]: 2025-12-02 08:37:43.714746223 +0000 UTC m=+0.340481543 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:37:43 np0005541913.localdomain podman[87718]: 2025-12-02 08:37:43.744514582 +0000 UTC m=+0.370249822 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:37:43 np0005541913.localdomain podman[87716]: 2025-12-02 08:37:43.816074825 +0000 UTC m=+0.450796121 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12)
Dec 02 08:37:43 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:37:44 np0005541913.localdomain systemd[1]: tmp-crun.WcFyCn.mount: Deactivated successfully.
Dec 02 08:37:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:37:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:37:47 np0005541913.localdomain systemd[1]: tmp-crun.6COnvA.mount: Deactivated successfully.
Dec 02 08:37:47 np0005541913.localdomain podman[87836]: 2025-12-02 08:37:47.447746697 +0000 UTC m=+0.086757376 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:37:47 np0005541913.localdomain podman[87837]: 2025-12-02 08:37:47.499231382 +0000 UTC m=+0.136470096 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044)
Dec 02 08:37:47 np0005541913.localdomain podman[87836]: 2025-12-02 08:37:47.508014393 +0000 UTC m=+0.147025012 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 02 08:37:47 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:37:47 np0005541913.localdomain podman[87837]: 2025-12-02 08:37:47.525224528 +0000 UTC m=+0.162463202 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:37:47 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:37:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:37:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:37:55 np0005541913.localdomain systemd[1]: tmp-crun.wyXobx.mount: Deactivated successfully.
Dec 02 08:37:55 np0005541913.localdomain podman[87932]: 2025-12-02 08:37:55.453587373 +0000 UTC m=+0.099781193 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, version=17.1.12, vcs-type=git, com.redhat.component=openstack-collectd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, release=1761123044, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:37:55 np0005541913.localdomain podman[87932]: 2025-12-02 08:37:55.491032155 +0000 UTC m=+0.137226005 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:37:55 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:37:55 np0005541913.localdomain podman[87933]: 2025-12-02 08:37:55.540221204 +0000 UTC m=+0.179076189 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12)
Dec 02 08:37:55 np0005541913.localdomain podman[87933]: 2025-12-02 08:37:55.577075722 +0000 UTC m=+0.215930677 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:37:55 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:38:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:38:05 np0005541913.localdomain systemd[1]: tmp-crun.7TSWqh.mount: Deactivated successfully.
Dec 02 08:38:05 np0005541913.localdomain podman[87970]: 2025-12-02 08:38:05.435786856 +0000 UTC m=+0.079630305 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 02 08:38:05 np0005541913.localdomain podman[87970]: 2025-12-02 08:38:05.600437862 +0000 UTC m=+0.244281261 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step1, container_name=metrics_qdr, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible)
Dec 02 08:38:05 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:38:09 np0005541913.localdomain sudo[87999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:38:09 np0005541913.localdomain sudo[87999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:38:09 np0005541913.localdomain sudo[87999]: pam_unix(sudo:session): session closed for user root
Dec 02 08:38:09 np0005541913.localdomain sudo[88014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 08:38:09 np0005541913.localdomain sudo[88014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:38:09 np0005541913.localdomain sudo[88014]: pam_unix(sudo:session): session closed for user root
Dec 02 08:38:09 np0005541913.localdomain sudo[88049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:38:09 np0005541913.localdomain sudo[88049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:38:09 np0005541913.localdomain sudo[88049]: pam_unix(sudo:session): session closed for user root
Dec 02 08:38:09 np0005541913.localdomain sudo[88064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:38:09 np0005541913.localdomain sudo[88064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:38:10 np0005541913.localdomain sudo[88064]: pam_unix(sudo:session): session closed for user root
Dec 02 08:38:11 np0005541913.localdomain sudo[88112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:38:11 np0005541913.localdomain sudo[88112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:38:11 np0005541913.localdomain sudo[88112]: pam_unix(sudo:session): session closed for user root
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: tmp-crun.6dGbAt.mount: Deactivated successfully.
Dec 02 08:38:14 np0005541913.localdomain podman[88128]: 2025-12-02 08:38:14.462749084 +0000 UTC m=+0.093408412 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp17/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Dec 02 08:38:14 np0005541913.localdomain podman[88129]: 2025-12-02 08:38:14.522093519 +0000 UTC m=+0.150173792 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git)
Dec 02 08:38:14 np0005541913.localdomain podman[88132]: 2025-12-02 08:38:14.566882287 +0000 UTC m=+0.186917737 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=)
Dec 02 08:38:14 np0005541913.localdomain podman[88130]: 2025-12-02 08:38:14.609848448 +0000 UTC m=+0.234979517 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:38:14 np0005541913.localdomain podman[88132]: 2025-12-02 08:38:14.621985194 +0000 UTC m=+0.242020634 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:38:14 np0005541913.localdomain podman[88130]: 2025-12-02 08:38:14.647490616 +0000 UTC m=+0.272621705 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:38:14 np0005541913.localdomain podman[88129]: 2025-12-02 08:38:14.686399095 +0000 UTC m=+0.314479408 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:38:14 np0005541913.localdomain podman[88127]: 2025-12-02 08:38:14.726091224 +0000 UTC m=+0.356827434 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, architecture=x86_64)
Dec 02 08:38:14 np0005541913.localdomain podman[88127]: 2025-12-02 08:38:14.734118077 +0000 UTC m=+0.364854247 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:38:14 np0005541913.localdomain podman[88128]: 2025-12-02 08:38:14.808167422 +0000 UTC m=+0.438826780 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Dec 02 08:38:14 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:38:15 np0005541913.localdomain systemd[1]: tmp-crun.8wy6R5.mount: Deactivated successfully.
Dec 02 08:38:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:38:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:38:18 np0005541913.localdomain systemd[1]: tmp-crun.L3XbbZ.mount: Deactivated successfully.
Dec 02 08:38:18 np0005541913.localdomain podman[88246]: 2025-12-02 08:38:18.448515532 +0000 UTC m=+0.091629048 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Dec 02 08:38:18 np0005541913.localdomain podman[88247]: 2025-12-02 08:38:18.494741426 +0000 UTC m=+0.134732003 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 02 08:38:18 np0005541913.localdomain podman[88247]: 2025-12-02 08:38:18.514296378 +0000 UTC m=+0.154286955 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, 
build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 08:38:18 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:38:18 np0005541913.localdomain podman[88246]: 2025-12-02 08:38:18.565258041 +0000 UTC m=+0.208371497 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:38:18 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:38:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:38:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:38:26 np0005541913.localdomain podman[88293]: 2025-12-02 08:38:26.42382113 +0000 UTC m=+0.068325310 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Dec 02 08:38:26 np0005541913.localdomain podman[88293]: 2025-12-02 08:38:26.43133913 +0000 UTC m=+0.075843280 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-collectd)
Dec 02 08:38:26 np0005541913.localdomain systemd[1]: tmp-crun.522NAq.mount: Deactivated successfully.
Dec 02 08:38:26 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:38:26 np0005541913.localdomain podman[88294]: 2025-12-02 08:38:26.449278511 +0000 UTC m=+0.087907604 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, architecture=x86_64, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:38:26 np0005541913.localdomain podman[88294]: 2025-12-02 08:38:26.483704588 +0000 UTC m=+0.122333671 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, architecture=x86_64, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, batch=17.1_20251118.1)
Dec 02 08:38:26 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:38:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:38:36 np0005541913.localdomain systemd[1]: tmp-crun.SXZVSp.mount: Deactivated successfully.
Dec 02 08:38:36 np0005541913.localdomain podman[88333]: 2025-12-02 08:38:36.461870331 +0000 UTC m=+0.100985093 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd)
Dec 02 08:38:36 np0005541913.localdomain podman[88333]: 2025-12-02 08:38:36.698052418 +0000 UTC m=+0.337167180 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:38:36 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:38:45 np0005541913.localdomain recover_tripleo_nova_virtqemud[88393]: 62312
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: tmp-crun.7GHLol.mount: Deactivated successfully.
Dec 02 08:38:45 np0005541913.localdomain podman[88364]: 2025-12-02 08:38:45.473194234 +0000 UTC m=+0.094857429 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Dec 02 08:38:45 np0005541913.localdomain podman[88364]: 2025-12-02 08:38:45.493416503 +0000 UTC m=+0.115079708 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-nova-compute)
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: tmp-crun.muiped.mount: Deactivated successfully.
Dec 02 08:38:45 np0005541913.localdomain podman[88371]: 2025-12-02 08:38:45.588565059 +0000 UTC m=+0.201881514 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:38:45 np0005541913.localdomain podman[88363]: 2025-12-02 08:38:45.63071136 +0000 UTC m=+0.249656447 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public)
Dec 02 08:38:45 np0005541913.localdomain podman[88362]: 2025-12-02 08:38:45.678457742 +0000 UTC m=+0.304935698 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, 
managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 02 08:38:45 np0005541913.localdomain podman[88362]: 2025-12-02 08:38:45.687851699 +0000 UTC m=+0.314329645 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:38:45 np0005541913.localdomain podman[88365]: 2025-12-02 08:38:45.733331994 +0000 UTC m=+0.348479404 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Dec 02 08:38:45 np0005541913.localdomain podman[88371]: 2025-12-02 08:38:45.760119959 +0000 UTC m=+0.373436414 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z)
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:38:45 np0005541913.localdomain podman[88365]: 2025-12-02 08:38:45.813355809 +0000 UTC m=+0.428503219 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, url=https://www.redhat.com)
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:38:45 np0005541913.localdomain podman[88363]: 2025-12-02 08:38:45.989959745 +0000 UTC m=+0.608904802 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Dec 02 08:38:45 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:38:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:38:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:38:49 np0005541913.localdomain systemd[1]: tmp-crun.HlW69U.mount: Deactivated successfully.
Dec 02 08:38:49 np0005541913.localdomain podman[88479]: 2025-12-02 08:38:49.451876303 +0000 UTC m=+0.085293058 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, container_name=ovn_metadata_agent, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 02 08:38:49 np0005541913.localdomain podman[88479]: 2025-12-02 08:38:49.503420051 +0000 UTC m=+0.136836776 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible)
Dec 02 08:38:49 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:38:49 np0005541913.localdomain podman[88480]: 2025-12-02 08:38:49.509211447 +0000 UTC m=+0.142070138 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:38:49 np0005541913.localdomain podman[88480]: 2025-12-02 08:38:49.592123714 +0000 UTC m=+0.224982325 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, distribution-scope=public, vcs-type=git, build-date=2025-11-18T23:34:05Z, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:38:49 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:38:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:38:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:38:57 np0005541913.localdomain podman[88572]: 2025-12-02 08:38:57.458292226 +0000 UTC m=+0.098418689 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, config_id=tripleo_step3, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, container_name=collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 02 08:38:57 np0005541913.localdomain podman[88572]: 2025-12-02 08:38:57.467263283 +0000 UTC m=+0.107389726 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 02 08:38:57 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:38:57 np0005541913.localdomain podman[88573]: 2025-12-02 08:38:57.540297701 +0000 UTC m=+0.177422668 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true)
Dec 02 08:38:57 np0005541913.localdomain podman[88573]: 2025-12-02 08:38:57.54822794 +0000 UTC m=+0.185352897 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-type=git, version=17.1.12, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:38:57 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:39:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:39:07 np0005541913.localdomain podman[88611]: 2025-12-02 08:39:07.438717018 +0000 UTC m=+0.078208430 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z)
Dec 02 08:39:07 np0005541913.localdomain podman[88611]: 2025-12-02 08:39:07.634071016 +0000 UTC m=+0.273562468 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 08:39:07 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:39:11 np0005541913.localdomain sudo[88640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:39:11 np0005541913.localdomain sudo[88640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:39:11 np0005541913.localdomain sudo[88640]: pam_unix(sudo:session): session closed for user root
Dec 02 08:39:11 np0005541913.localdomain sudo[88655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:39:11 np0005541913.localdomain sudo[88655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:39:11 np0005541913.localdomain sudo[88655]: pam_unix(sudo:session): session closed for user root
Dec 02 08:39:12 np0005541913.localdomain sudo[88702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:39:12 np0005541913.localdomain sudo[88702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:39:12 np0005541913.localdomain sudo[88702]: pam_unix(sudo:session): session closed for user root
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: tmp-crun.60I4Ae.mount: Deactivated successfully.
Dec 02 08:39:16 np0005541913.localdomain podman[88719]: 2025-12-02 08:39:16.458561586 +0000 UTC m=+0.094101950 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=nova_compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:39:16 np0005541913.localdomain podman[88720]: 2025-12-02 08:39:16.519923261 +0000 UTC m=+0.150500780 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, release=1761123044, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:39:16 np0005541913.localdomain podman[88717]: 2025-12-02 08:39:16.564435582 +0000 UTC m=+0.202694565 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, io.openshift.expose-services=, config_id=tripleo_step4, 
vcs-type=git, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 02 08:39:16 np0005541913.localdomain podman[88718]: 2025-12-02 08:39:16.57352367 +0000 UTC m=+0.211885386 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:39:16 np0005541913.localdomain podman[88720]: 2025-12-02 08:39:16.579957022 +0000 UTC m=+0.210534471 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute)
Dec 02 08:39:16 np0005541913.localdomain podman[88719]: 2025-12-02 08:39:16.587442241 +0000 UTC m=+0.222982635 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044)
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:39:16 np0005541913.localdomain podman[88717]: 2025-12-02 08:39:16.60092035 +0000 UTC m=+0.239179283 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:39:16 np0005541913.localdomain podman[88728]: 2025-12-02 08:39:16.658763656 +0000 UTC m=+0.285947980 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:39:16 np0005541913.localdomain podman[88728]: 2025-12-02 08:39:16.6879126 +0000 UTC m=+0.315096944 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:39:16 np0005541913.localdomain podman[88718]: 2025-12-02 08:39:16.951947588 +0000 UTC m=+0.590309304 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Dec 02 08:39:16 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:39:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:39:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:39:20 np0005541913.localdomain podman[88840]: 2025-12-02 08:39:20.436194368 +0000 UTC m=+0.070371822 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git)
Dec 02 08:39:20 np0005541913.localdomain podman[88840]: 2025-12-02 08:39:20.484205507 +0000 UTC m=+0.118382951 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:39:20 np0005541913.localdomain podman[88839]: 2025-12-02 08:39:20.486603498 +0000 UTC m=+0.123774347 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, distribution-scope=public, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:39:20 np0005541913.localdomain podman[88839]: 2025-12-02 08:39:20.529143528 +0000 UTC m=+0.166314357 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true)
Dec 02 08:39:20 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:39:20 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:39:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:39:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:39:28 np0005541913.localdomain podman[88886]: 2025-12-02 08:39:28.490444047 +0000 UTC m=+0.132234560 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:39:28 np0005541913.localdomain podman[88886]: 2025-12-02 08:39:28.503348521 +0000 UTC m=+0.145139034 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, container_name=collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd)
Dec 02 08:39:28 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:39:28 np0005541913.localdomain podman[88887]: 2025-12-02 08:39:28.464151794 +0000 UTC m=+0.102145722 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4)
Dec 02 08:39:28 np0005541913.localdomain podman[88887]: 2025-12-02 08:39:28.545125623 +0000 UTC m=+0.183119511 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044)
Dec 02 08:39:28 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:39:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:39:38 np0005541913.localdomain systemd[1]: tmp-crun.9R5Ec8.mount: Deactivated successfully.
Dec 02 08:39:38 np0005541913.localdomain podman[88926]: 2025-12-02 08:39:38.444712328 +0000 UTC m=+0.085120064 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step1, io.buildah.version=1.41.4, architecture=x86_64, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 02 08:39:38 np0005541913.localdomain podman[88926]: 2025-12-02 08:39:38.637136463 +0000 UTC m=+0.277544249 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:39:38 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: tmp-crun.jVgdzt.mount: Deactivated successfully.
Dec 02 08:39:47 np0005541913.localdomain podman[88956]: 2025-12-02 08:39:47.526128883 +0000 UTC m=+0.157087476 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Dec 02 08:39:47 np0005541913.localdomain podman[88964]: 2025-12-02 08:39:47.575630689 +0000 UTC m=+0.195724578 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 02 08:39:47 np0005541913.localdomain podman[88957]: 2025-12-02 08:39:47.477353265 +0000 UTC m=+0.107532449 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc.)
Dec 02 08:39:47 np0005541913.localdomain podman[88964]: 2025-12-02 08:39:47.606317682 +0000 UTC m=+0.226411591 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:39:47 np0005541913.localdomain podman[88958]: 2025-12-02 08:39:47.632504851 +0000 UTC m=+0.258304644 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, 
tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:39:47 np0005541913.localdomain podman[88957]: 2025-12-02 08:39:47.664339312 +0000 UTC m=+0.294518476 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:39:47 np0005541913.localdomain podman[88958]: 2025-12-02 08:39:47.708094864 +0000 UTC m=+0.333894737 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:11:48Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:39:47 np0005541913.localdomain podman[88955]: 2025-12-02 08:39:47.709837769 +0000 UTC m=+0.343378917 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, vcs-type=git, container_name=logrotate_crond, name=rhosp17/openstack-cron, batch=17.1_20251118.1, release=1761123044)
Dec 02 08:39:47 np0005541913.localdomain podman[88955]: 2025-12-02 08:39:47.792047158 +0000 UTC m=+0.425588246 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond)
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:39:47 np0005541913.localdomain podman[88956]: 2025-12-02 08:39:47.926229266 +0000 UTC m=+0.557187789 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:39:47 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:39:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:39:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:39:51 np0005541913.localdomain systemd[1]: tmp-crun.S1y3to.mount: Deactivated successfully.
Dec 02 08:39:51 np0005541913.localdomain podman[89079]: 2025-12-02 08:39:51.459309625 +0000 UTC m=+0.098397528 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 08:39:51 np0005541913.localdomain systemd[1]: tmp-crun.HXR0ia.mount: Deactivated successfully.
Dec 02 08:39:51 np0005541913.localdomain podman[89080]: 2025-12-02 08:39:51.517984673 +0000 UTC m=+0.144520030 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, container_name=ovn_controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 02 08:39:51 np0005541913.localdomain podman[89079]: 2025-12-02 08:39:51.523075321 +0000 UTC m=+0.162163164 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:39:51 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:39:51 np0005541913.localdomain podman[89080]: 2025-12-02 08:39:51.544024888 +0000 UTC m=+0.170560245 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, container_name=ovn_controller, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, release=1761123044, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:39:51 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:39:56 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:39:56 np0005541913.localdomain recover_tripleo_nova_virtqemud[89149]: 62312
Dec 02 08:39:56 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:39:56 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:39:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:39:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:39:59 np0005541913.localdomain systemd[1]: tmp-crun.YGs3Me.mount: Deactivated successfully.
Dec 02 08:39:59 np0005541913.localdomain podman[89151]: 2025-12-02 08:39:59.449326374 +0000 UTC m=+0.091086104 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, 
version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:39:59 np0005541913.localdomain podman[89150]: 2025-12-02 08:39:59.491706481 +0000 UTC m=+0.135539123 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=1761123044, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 02 08:39:59 np0005541913.localdomain podman[89150]: 2025-12-02 08:39:59.500275766 +0000 UTC m=+0.144108438 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.buildah.version=1.41.4, distribution-scope=public)
Dec 02 08:39:59 np0005541913.localdomain podman[89151]: 2025-12-02 08:39:59.51195831 +0000 UTC m=+0.153718090 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Dec 02 08:39:59 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:39:59 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:40:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:40:09 np0005541913.localdomain podman[89188]: 2025-12-02 08:40:09.447279084 +0000 UTC m=+0.090349895 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, config_id=tripleo_step1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:40:09 np0005541913.localdomain podman[89188]: 2025-12-02 08:40:09.68187012 +0000 UTC m=+0.324940932 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Dec 02 08:40:09 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:40:12 np0005541913.localdomain sudo[89218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:40:12 np0005541913.localdomain sudo[89218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:40:12 np0005541913.localdomain sudo[89218]: pam_unix(sudo:session): session closed for user root
Dec 02 08:40:12 np0005541913.localdomain sudo[89233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 08:40:12 np0005541913.localdomain sudo[89233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:40:13 np0005541913.localdomain podman[89321]: 2025-12-02 08:40:13.746909093 +0000 UTC m=+0.068445785 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.41.4, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, version=7, ceph=True, com.redhat.component=rhceph-container)
Dec 02 08:40:13 np0005541913.localdomain podman[89321]: 2025-12-02 08:40:13.825947272 +0000 UTC m=+0.147483934 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 08:40:14 np0005541913.localdomain sudo[89233]: pam_unix(sudo:session): session closed for user root
Dec 02 08:40:14 np0005541913.localdomain sudo[89390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:40:14 np0005541913.localdomain sudo[89390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:40:14 np0005541913.localdomain sudo[89390]: pam_unix(sudo:session): session closed for user root
Dec 02 08:40:14 np0005541913.localdomain sudo[89405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:40:14 np0005541913.localdomain sudo[89405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:40:14 np0005541913.localdomain sudo[89405]: pam_unix(sudo:session): session closed for user root
Dec 02 08:40:15 np0005541913.localdomain sudo[89453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:40:15 np0005541913.localdomain sudo[89453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:40:15 np0005541913.localdomain sudo[89453]: pam_unix(sudo:session): session closed for user root
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:40:18 np0005541913.localdomain podman[89468]: 2025-12-02 08:40:18.465903299 +0000 UTC m=+0.097203427 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true)
Dec 02 08:40:18 np0005541913.localdomain podman[89468]: 2025-12-02 08:40:18.47585581 +0000 UTC m=+0.107155938 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:40:18 np0005541913.localdomain podman[89469]: 2025-12-02 08:40:18.517347435 +0000 UTC m=+0.147200988 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, 
architecture=x86_64, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:40:18 np0005541913.localdomain podman[89482]: 2025-12-02 08:40:18.564379959 +0000 UTC m=+0.183855450 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z)
Dec 02 08:40:18 np0005541913.localdomain podman[89470]: 2025-12-02 08:40:18.617911037 +0000 UTC m=+0.242749903 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=nova_compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:40:18 np0005541913.localdomain podman[89470]: 2025-12-02 08:40:18.63634312 +0000 UTC m=+0.261182006 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:40:18 np0005541913.localdomain podman[89482]: 2025-12-02 08:40:18.690975946 +0000 UTC m=+0.310451447 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, version=17.1.12, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:12:45Z)
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:40:18 np0005541913.localdomain podman[89471]: 2025-12-02 08:40:18.784972272 +0000 UTC m=+0.407320755 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_compute)
Dec 02 08:40:18 np0005541913.localdomain podman[89471]: 2025-12-02 08:40:18.816951898 +0000 UTC m=+0.439300421 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-type=git, config_id=tripleo_step4)
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:40:18 np0005541913.localdomain podman[89469]: 2025-12-02 08:40:18.883175374 +0000 UTC m=+0.513028957 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:40:18 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:40:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:40:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:40:22 np0005541913.localdomain podman[89590]: 2025-12-02 08:40:22.441678095 +0000 UTC m=+0.078769594 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, version=17.1.12, container_name=ovn_controller, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:40:22 np0005541913.localdomain podman[89590]: 2025-12-02 08:40:22.466410077 +0000 UTC m=+0.103501606 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 08:40:22 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:40:22 np0005541913.localdomain systemd[1]: tmp-crun.hLa1rc.mount: Deactivated successfully.
Dec 02 08:40:22 np0005541913.localdomain podman[89589]: 2025-12-02 08:40:22.549463528 +0000 UTC m=+0.189569934 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:40:22 np0005541913.localdomain podman[89589]: 2025-12-02 08:40:22.593663171 +0000 UTC m=+0.233769587 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64)
Dec 02 08:40:22 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:40:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:40:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:40:30 np0005541913.localdomain systemd[1]: tmp-crun.dlUBdX.mount: Deactivated successfully.
Dec 02 08:40:30 np0005541913.localdomain podman[89636]: 2025-12-02 08:40:30.468186033 +0000 UTC m=+0.102273986 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 
iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:40:30 np0005541913.localdomain systemd[1]: tmp-crun.u3fNCp.mount: Deactivated successfully.
Dec 02 08:40:30 np0005541913.localdomain podman[89635]: 2025-12-02 08:40:30.511744609 +0000 UTC m=+0.149980206 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 02 08:40:30 np0005541913.localdomain podman[89635]: 2025-12-02 08:40:30.527025215 +0000 UTC m=+0.165260822 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 02 08:40:30 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:40:30 np0005541913.localdomain podman[89636]: 2025-12-02 08:40:30.55307692 +0000 UTC m=+0.187164873 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3)
Dec 02 08:40:30 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:40:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:40:40 np0005541913.localdomain systemd[1]: tmp-crun.QQXt3W.mount: Deactivated successfully.
Dec 02 08:40:40 np0005541913.localdomain podman[89675]: 2025-12-02 08:40:40.419754967 +0000 UTC m=+0.069014908 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:40:40 np0005541913.localdomain podman[89675]: 2025-12-02 08:40:40.622391599 +0000 UTC m=+0.271651480 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:46Z, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://www.redhat.com, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:40:40 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:40:49 np0005541913.localdomain podman[89705]: 2025-12-02 08:40:49.446946169 +0000 UTC m=+0.083454112 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute)
Dec 02 08:40:49 np0005541913.localdomain podman[89707]: 2025-12-02 08:40:49.466198014 +0000 UTC m=+0.093243290 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12)
Dec 02 08:40:49 np0005541913.localdomain podman[89705]: 2025-12-02 08:40:49.478943994 +0000 UTC m=+0.115451967 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, name=rhosp17/openstack-nova-compute)
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:40:49 np0005541913.localdomain podman[89704]: 2025-12-02 08:40:49.545344166 +0000 UTC m=+0.180489996 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:40:49 np0005541913.localdomain podman[89722]: 2025-12-02 08:40:49.574255494 +0000 UTC m=+0.192221631 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 02 08:40:49 np0005541913.localdomain podman[89703]: 2025-12-02 08:40:49.523725371 +0000 UTC m=+0.161197208 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack 
TripleO Team, url=https://www.redhat.com, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:40:49 np0005541913.localdomain podman[89707]: 2025-12-02 08:40:49.600397331 +0000 UTC m=+0.227442627 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:40:49 np0005541913.localdomain podman[89703]: 2025-12-02 08:40:49.656903095 +0000 UTC m=+0.294374842 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
release=1761123044, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond)
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:40:49 np0005541913.localdomain podman[89722]: 2025-12-02 08:40:49.677715078 +0000 UTC m=+0.295681175 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:40:49 np0005541913.localdomain podman[89704]: 2025-12-02 08:40:49.903968495 +0000 UTC m=+0.539114335 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, container_name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:40:49 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:40:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:40:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:40:53 np0005541913.localdomain podman[89823]: 2025-12-02 08:40:53.432633572 +0000 UTC m=+0.079059221 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:40:53 np0005541913.localdomain systemd[1]: tmp-crun.xeDSHm.mount: Deactivated successfully.
Dec 02 08:40:53 np0005541913.localdomain podman[89824]: 2025-12-02 08:40:53.489782311 +0000 UTC m=+0.131330307 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 02 08:40:53 np0005541913.localdomain podman[89823]: 2025-12-02 08:40:53.513222071 +0000 UTC m=+0.159647700 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1)
Dec 02 08:40:53 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:40:53 np0005541913.localdomain podman[89824]: 2025-12-02 08:40:53.567058946 +0000 UTC m=+0.208607002 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, 
vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:40:53 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:41:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:41:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:41:01 np0005541913.localdomain podman[89896]: 2025-12-02 08:41:01.428125231 +0000 UTC m=+0.069042280 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, container_name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 02 08:41:01 np0005541913.localdomain podman[89896]: 2025-12-02 08:41:01.437065666 +0000 UTC m=+0.077982655 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, container_name=iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z)
Dec 02 08:41:01 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:41:01 np0005541913.localdomain systemd[1]: tmp-crun.ilDDyu.mount: Deactivated successfully.
Dec 02 08:41:01 np0005541913.localdomain podman[89895]: 2025-12-02 08:41:01.479867283 +0000 UTC m=+0.120388302 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, vcs-type=git, config_id=tripleo_step3)
Dec 02 08:41:01 np0005541913.localdomain podman[89895]: 2025-12-02 08:41:01.515994723 +0000 UTC m=+0.156515682 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, build-date=2025-11-18T22:51:28Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_id=tripleo_step3, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Dec 02 08:41:01 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:41:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:41:11 np0005541913.localdomain systemd[1]: tmp-crun.LyOxQA.mount: Deactivated successfully.
Dec 02 08:41:11 np0005541913.localdomain podman[89935]: 2025-12-02 08:41:11.42908054 +0000 UTC m=+0.076267162 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, version=17.1.12)
Dec 02 08:41:11 np0005541913.localdomain podman[89935]: 2025-12-02 08:41:11.64079673 +0000 UTC m=+0.287983292 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr)
Dec 02 08:41:11 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:41:15 np0005541913.localdomain sudo[89964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:41:15 np0005541913.localdomain sudo[89964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:41:15 np0005541913.localdomain sudo[89964]: pam_unix(sudo:session): session closed for user root
Dec 02 08:41:15 np0005541913.localdomain sudo[89979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:41:15 np0005541913.localdomain sudo[89979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:41:16 np0005541913.localdomain sudo[89979]: pam_unix(sudo:session): session closed for user root
Dec 02 08:41:17 np0005541913.localdomain sudo[90025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:41:17 np0005541913.localdomain sudo[90025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:41:17 np0005541913.localdomain sudo[90025]: pam_unix(sudo:session): session closed for user root
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:41:20 np0005541913.localdomain podman[90042]: 2025-12-02 08:41:20.434833701 +0000 UTC m=+0.069206683 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:41:20 np0005541913.localdomain podman[90042]: 2025-12-02 08:41:20.466406337 +0000 UTC m=+0.100779369 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step5, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc.)
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: tmp-crun.3IlJQ9.mount: Deactivated successfully.
Dec 02 08:41:20 np0005541913.localdomain podman[90044]: 2025-12-02 08:41:20.508896996 +0000 UTC m=+0.136178550 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12)
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:41:20 np0005541913.localdomain podman[90044]: 2025-12-02 08:41:20.563970212 +0000 UTC m=+0.191251706 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:41:20 np0005541913.localdomain podman[90043]: 2025-12-02 08:41:20.519890933 +0000 UTC m=+0.148489139 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:41:20 np0005541913.localdomain podman[90043]: 2025-12-02 08:41:20.603287902 +0000 UTC m=+0.231886098 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
container_name=ceilometer_agent_compute, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:41:20 np0005541913.localdomain podman[90041]: 2025-12-02 08:41:20.654787359 +0000 UTC m=+0.288038753 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 
17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:41:20 np0005541913.localdomain podman[90040]: 2025-12-02 08:41:20.710701817 +0000 UTC m=+0.342332590 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12)
Dec 02 08:41:20 np0005541913.localdomain podman[90040]: 2025-12-02 08:41:20.748078137 +0000 UTC m=+0.379708910 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
name=rhosp17/openstack-cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:41:20 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:41:21 np0005541913.localdomain podman[90041]: 2025-12-02 08:41:21.033092033 +0000 UTC m=+0.666343487 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:41:21 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:41:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:41:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:41:24 np0005541913.localdomain systemd[1]: tmp-crun.s2HKr9.mount: Deactivated successfully.
Dec 02 08:41:24 np0005541913.localdomain podman[90163]: 2025-12-02 08:41:24.437869911 +0000 UTC m=+0.079139583 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:34:05Z)
Dec 02 08:41:24 np0005541913.localdomain podman[90163]: 2025-12-02 08:41:24.489150963 +0000 UTC m=+0.130420645 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:41:24 np0005541913.localdomain systemd[1]: tmp-crun.yXvBOg.mount: Deactivated successfully.
Dec 02 08:41:24 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:41:24 np0005541913.localdomain podman[90162]: 2025-12-02 08:41:24.49066313 +0000 UTC m=+0.131604004 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:41:24 np0005541913.localdomain podman[90162]: 2025-12-02 08:41:24.571064775 +0000 UTC m=+0.212005659 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:41:24 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:41:26 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:41:26 np0005541913.localdomain recover_tripleo_nova_virtqemud[90209]: 62312
Dec 02 08:41:26 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:41:26 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:41:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:41:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:41:32 np0005541913.localdomain podman[90211]: 2025-12-02 08:41:32.442977281 +0000 UTC m=+0.082241952 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, 
com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, release=1761123044, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=)
Dec 02 08:41:32 np0005541913.localdomain podman[90210]: 2025-12-02 08:41:32.496687153 +0000 UTC m=+0.137630486 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, version=17.1.12, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:41:32 np0005541913.localdomain podman[90210]: 2025-12-02 08:41:32.508137071 +0000 UTC m=+0.149080464 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, vcs-type=git, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:41:32 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:41:32 np0005541913.localdomain podman[90211]: 2025-12-02 08:41:32.531168671 +0000 UTC m=+0.170433332 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vcs-type=git, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid)
Dec 02 08:41:32 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:41:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:41:42 np0005541913.localdomain podman[90249]: 2025-12-02 08:41:42.441463508 +0000 UTC m=+0.083389691 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible)
Dec 02 08:41:42 np0005541913.localdomain podman[90249]: 2025-12-02 08:41:42.63816511 +0000 UTC m=+0.280091253 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1)
Dec 02 08:41:42 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:41:51 np0005541913.localdomain podman[90278]: 2025-12-02 08:41:51.465599201 +0000 UTC m=+0.098081640 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, release=1761123044, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, container_name=logrotate_crond, name=rhosp17/openstack-cron, vendor=Red Hat, Inc.)
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: tmp-crun.ux37CB.mount: Deactivated successfully.
Dec 02 08:41:51 np0005541913.localdomain podman[90278]: 2025-12-02 08:41:51.503034654 +0000 UTC m=+0.135517073 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:41:51 np0005541913.localdomain podman[90279]: 2025-12-02 08:41:51.522561066 +0000 UTC m=+0.151046664 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z)
Dec 02 08:41:51 np0005541913.localdomain podman[90284]: 2025-12-02 08:41:51.57480022 +0000 UTC m=+0.194835125 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., 
name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:41:51 np0005541913.localdomain podman[90284]: 2025-12-02 08:41:51.61092346 +0000 UTC m=+0.230958445 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, 
com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com)
Dec 02 08:41:51 np0005541913.localdomain podman[90280]: 2025-12-02 08:41:51.621290611 +0000 UTC m=+0.245921742 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:41:51 np0005541913.localdomain podman[90291]: 2025-12-02 08:41:51.49416206 +0000 UTC m=+0.110676067 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12)
Dec 02 08:41:51 np0005541913.localdomain podman[90280]: 2025-12-02 08:41:51.659018191 +0000 UTC m=+0.283649332 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute)
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:41:51 np0005541913.localdomain podman[90291]: 2025-12-02 08:41:51.731500636 +0000 UTC m=+0.348014633 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, 
name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true)
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:41:51 np0005541913.localdomain podman[90279]: 2025-12-02 08:41:51.894987992 +0000 UTC m=+0.523473530 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=nova_migration_target, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public)
Dec 02 08:41:51 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:41:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:41:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:41:55 np0005541913.localdomain systemd[1]: tmp-crun.MUxLT0.mount: Deactivated successfully.
Dec 02 08:41:55 np0005541913.localdomain podman[90397]: 2025-12-02 08:41:55.436818833 +0000 UTC m=+0.079435441 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:41:55 np0005541913.localdomain podman[90398]: 2025-12-02 08:41:55.414694576 +0000 UTC m=+0.056944964 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_controller, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z)
Dec 02 08:41:55 np0005541913.localdomain podman[90397]: 2025-12-02 08:41:55.479175149 +0000 UTC m=+0.121791807 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, tcib_managed=true, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc.)
Dec 02 08:41:55 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:41:55 np0005541913.localdomain podman[90398]: 2025-12-02 08:41:55.499434719 +0000 UTC m=+0.141685097 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:41:55 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:42:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:42:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:42:03 np0005541913.localdomain podman[90468]: 2025-12-02 08:42:03.452108129 +0000 UTC m=+0.091406084 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container)
Dec 02 08:42:03 np0005541913.localdomain podman[90469]: 2025-12-02 08:42:03.500410506 +0000 UTC m=+0.136298338 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, version=17.1.12, architecture=x86_64, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Dec 02 08:42:03 np0005541913.localdomain podman[90468]: 2025-12-02 08:42:03.518161687 +0000 UTC m=+0.157459582 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Dec 02 08:42:03 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:42:03 np0005541913.localdomain podman[90469]: 2025-12-02 08:42:03.540022317 +0000 UTC m=+0.175910099 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid)
Dec 02 08:42:03 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:42:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:42:13 np0005541913.localdomain podman[90507]: 2025-12-02 08:42:13.432884483 +0000 UTC m=+0.078289729 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, 
com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:42:13 np0005541913.localdomain podman[90507]: 2025-12-02 08:42:13.644555319 +0000 UTC m=+0.289960565 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:42:13 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:42:17 np0005541913.localdomain sudo[90536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:42:17 np0005541913.localdomain sudo[90536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:42:17 np0005541913.localdomain sudo[90536]: pam_unix(sudo:session): session closed for user root
Dec 02 08:42:17 np0005541913.localdomain sudo[90551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:42:17 np0005541913.localdomain sudo[90551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:42:17 np0005541913.localdomain sudo[90551]: pam_unix(sudo:session): session closed for user root
Dec 02 08:42:18 np0005541913.localdomain sudo[90598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:42:18 np0005541913.localdomain sudo[90598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:42:18 np0005541913.localdomain sudo[90598]: pam_unix(sudo:session): session closed for user root
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: tmp-crun.m7Hn4B.mount: Deactivated successfully.
Dec 02 08:42:22 np0005541913.localdomain podman[90614]: 2025-12-02 08:42:22.507139634 +0000 UTC m=+0.141057927 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:36:58Z, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:42:22 np0005541913.localdomain podman[90616]: 2025-12-02 08:42:22.566687074 +0000 UTC m=+0.195317514 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:42:22 np0005541913.localdomain podman[90617]: 2025-12-02 08:42:22.616258356 +0000 UTC m=+0.240845677 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 02 08:42:22 np0005541913.localdomain podman[90616]: 2025-12-02 08:42:22.622984507 +0000 UTC m=+0.251614947 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:42:22 np0005541913.localdomain podman[90617]: 2025-12-02 08:42:22.658961621 +0000 UTC m=+0.283548932 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:12:45Z, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:42:22 np0005541913.localdomain podman[90615]: 2025-12-02 08:42:22.671436459 +0000 UTC m=+0.304310574 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 02 08:42:22 np0005541913.localdomain podman[90613]: 2025-12-02 08:42:22.483672939 +0000 UTC m=+0.118074545 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-cron, 
managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 02 08:42:22 np0005541913.localdomain podman[90615]: 2025-12-02 08:42:22.700927656 +0000 UTC m=+0.333801751 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, container_name=nova_compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:42:22 np0005541913.localdomain podman[90613]: 2025-12-02 08:42:22.719050977 +0000 UTC m=+0.353452573 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, 
tcib_managed=true, architecture=x86_64, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:42:22 np0005541913.localdomain podman[90614]: 2025-12-02 08:42:22.864026328 +0000 UTC m=+0.497944631 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Dec 02 08:42:22 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:42:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:42:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:42:26 np0005541913.localdomain podman[90736]: 2025-12-02 08:42:26.450709096 +0000 UTC m=+0.089672786 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:42:26 np0005541913.localdomain podman[90736]: 2025-12-02 08:42:26.499514016 +0000 UTC m=+0.138477706 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 02 08:42:26 np0005541913.localdomain podman[90737]: 2025-12-02 08:42:26.502333253 +0000 UTC m=+0.137759108 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 
17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1)
Dec 02 08:42:26 np0005541913.localdomain podman[90737]: 2025-12-02 08:42:26.526041104 +0000 UTC m=+0.161466989 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4)
Dec 02 08:42:26 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:42:26 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:42:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:42:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:42:34 np0005541913.localdomain podman[90785]: 2025-12-02 08:42:34.45572224 +0000 UTC m=+0.091213778 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:42:34 np0005541913.localdomain podman[90785]: 2025-12-02 08:42:34.46386198 +0000 UTC m=+0.099353538 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z)
Dec 02 08:42:34 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:42:34 np0005541913.localdomain podman[90786]: 2025-12-02 08:42:34.503883373 +0000 UTC m=+0.135781954 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public)
Dec 02 08:42:34 np0005541913.localdomain podman[90786]: 2025-12-02 08:42:34.517266895 +0000 UTC m=+0.149165526 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., version=17.1.12)
Dec 02 08:42:34 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:42:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:42:44 np0005541913.localdomain systemd[1]: tmp-crun.gnxLQH.mount: Deactivated successfully.
Dec 02 08:42:44 np0005541913.localdomain podman[90824]: 2025-12-02 08:42:44.442171806 +0000 UTC m=+0.086912133 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, config_id=tripleo_step1, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 08:42:44 np0005541913.localdomain podman[90824]: 2025-12-02 08:42:44.612633137 +0000 UTC m=+0.257373374 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 02 08:42:44 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: tmp-crun.N8SSM3.mount: Deactivated successfully.
Dec 02 08:42:53 np0005541913.localdomain podman[90855]: 2025-12-02 08:42:53.451130518 +0000 UTC m=+0.084284531 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:42:53 np0005541913.localdomain podman[90854]: 2025-12-02 08:42:53.506285381 +0000 UTC m=+0.138862448 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-type=git, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:42:53 np0005541913.localdomain podman[90859]: 2025-12-02 08:42:53.564213767 +0000 UTC m=+0.191307036 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1)
Dec 02 08:42:53 np0005541913.localdomain podman[90859]: 2025-12-02 08:42:53.587642671 +0000 UTC m=+0.214735840 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:42:53 np0005541913.localdomain podman[90853]: 2025-12-02 08:42:53.605930696 +0000 UTC m=+0.242441879 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:42:53 np0005541913.localdomain podman[90853]: 2025-12-02 08:42:53.614846557 +0000 UTC m=+0.251357740 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:42:53 np0005541913.localdomain podman[90855]: 2025-12-02 08:42:53.630980753 +0000 UTC m=+0.264134756 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team)
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:42:53 np0005541913.localdomain podman[90866]: 2025-12-02 08:42:53.479605158 +0000 UTC m=+0.103130621 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:12:45Z)
Dec 02 08:42:53 np0005541913.localdomain podman[90866]: 2025-12-02 08:42:53.713094125 +0000 UTC m=+0.336619588 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:42:53 np0005541913.localdomain podman[90854]: 2025-12-02 08:42:53.898182062 +0000 UTC m=+0.530759229 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 02 08:42:53 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:42:54 np0005541913.localdomain systemd[1]: tmp-crun.9gVOb6.mount: Deactivated successfully.
Dec 02 08:42:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:42:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:42:57 np0005541913.localdomain podman[90995]: 2025-12-02 08:42:57.451818876 +0000 UTC m=+0.087034825 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com)
Dec 02 08:42:57 np0005541913.localdomain systemd[1]: tmp-crun.dl2QoQ.mount: Deactivated successfully.
Dec 02 08:42:57 np0005541913.localdomain podman[90995]: 2025-12-02 08:42:57.499332791 +0000 UTC m=+0.134548690 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12)
Dec 02 08:42:57 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:42:57 np0005541913.localdomain podman[90994]: 2025-12-02 08:42:57.503595407 +0000 UTC m=+0.141431197 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:42:57 np0005541913.localdomain podman[90994]: 2025-12-02 08:42:57.589170612 +0000 UTC m=+0.227006382 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:42:57 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:43:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:43:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:43:05 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:43:05 np0005541913.localdomain recover_tripleo_nova_virtqemud[91055]: 62312
Dec 02 08:43:05 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:43:05 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:43:05 np0005541913.localdomain podman[91043]: 2025-12-02 08:43:05.443773748 +0000 UTC m=+0.085623638 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:43:05 np0005541913.localdomain podman[91043]: 2025-12-02 08:43:05.483123352 +0000 UTC m=+0.124973192 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc.)
Dec 02 08:43:05 np0005541913.localdomain podman[91042]: 2025-12-02 08:43:05.49042622 +0000 UTC m=+0.130728777 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, 
description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, release=1761123044, tcib_managed=true, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:43:05 np0005541913.localdomain podman[91042]: 2025-12-02 08:43:05.499991349 +0000 UTC m=+0.140293896 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3)
Dec 02 08:43:05 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:43:05 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:43:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:43:15 np0005541913.localdomain systemd[1]: tmp-crun.XQHn80.mount: Deactivated successfully.
Dec 02 08:43:15 np0005541913.localdomain podman[91083]: 2025-12-02 08:43:15.476520157 +0000 UTC m=+0.120633125 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, version=17.1.12, 
tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr)
Dec 02 08:43:15 np0005541913.localdomain podman[91083]: 2025-12-02 08:43:15.700548577 +0000 UTC m=+0.344661545 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:46Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:43:15 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:43:18 np0005541913.localdomain sudo[91112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:43:18 np0005541913.localdomain sudo[91112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:43:18 np0005541913.localdomain sudo[91112]: pam_unix(sudo:session): session closed for user root
Dec 02 08:43:18 np0005541913.localdomain sudo[91127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:43:18 np0005541913.localdomain sudo[91127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:43:19 np0005541913.localdomain sudo[91127]: pam_unix(sudo:session): session closed for user root
Dec 02 08:43:20 np0005541913.localdomain sudo[91174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:43:20 np0005541913.localdomain sudo[91174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:43:20 np0005541913.localdomain sudo[91174]: pam_unix(sudo:session): session closed for user root
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: tmp-crun.gVxmFR.mount: Deactivated successfully.
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: tmp-crun.ZjE8mv.mount: Deactivated successfully.
Dec 02 08:43:24 np0005541913.localdomain podman[91203]: 2025-12-02 08:43:24.491316608 +0000 UTC m=+0.109810181 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:43:24 np0005541913.localdomain podman[91190]: 2025-12-02 08:43:24.508576225 +0000 UTC m=+0.141841477 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, 
build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Dec 02 08:43:24 np0005541913.localdomain podman[91203]: 2025-12-02 08:43:24.520979631 +0000 UTC m=+0.139473214 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, 
name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:43:24 np0005541913.localdomain podman[91197]: 2025-12-02 08:43:24.570683236 +0000 UTC m=+0.194970736 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:43:24 np0005541913.localdomain podman[91191]: 2025-12-02 08:43:24.474703449 +0000 UTC m=+0.103438219 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z)
Dec 02 08:43:24 np0005541913.localdomain podman[91191]: 2025-12-02 08:43:24.606056153 +0000 UTC m=+0.234790883 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:43:24 np0005541913.localdomain podman[91197]: 2025-12-02 08:43:24.629083075 +0000 UTC m=+0.253370585 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:11:48Z, tcib_managed=true, release=1761123044, url=https://www.redhat.com, architecture=x86_64, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public)
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:43:24 np0005541913.localdomain podman[91189]: 2025-12-02 08:43:24.680366453 +0000 UTC m=+0.315252159 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, release=1761123044, container_name=logrotate_crond, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 08:43:24 np0005541913.localdomain podman[91189]: 2025-12-02 08:43:24.692954363 +0000 UTC m=+0.327840099 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:43:24 np0005541913.localdomain podman[91190]: 2025-12-02 08:43:24.870460546 +0000 UTC m=+0.503725848 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, container_name=nova_migration_target)
Dec 02 08:43:24 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:43:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:43:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:43:28 np0005541913.localdomain podman[91307]: 2025-12-02 08:43:28.447331418 +0000 UTC m=+0.089268955 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4)
Dec 02 08:43:28 np0005541913.localdomain podman[91307]: 2025-12-02 08:43:28.486008705 +0000 UTC m=+0.127946282 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:43:28 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:43:28 np0005541913.localdomain podman[91308]: 2025-12-02 08:43:28.507061855 +0000 UTC m=+0.143657587 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:43:28 np0005541913.localdomain podman[91308]: 2025-12-02 08:43:28.560088339 +0000 UTC m=+0.196684051 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:43:28 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:43:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:43:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:43:36 np0005541913.localdomain systemd[1]: tmp-crun.TyTrZW.mount: Deactivated successfully.
Dec 02 08:43:36 np0005541913.localdomain podman[91354]: 2025-12-02 08:43:36.444410648 +0000 UTC m=+0.085109273 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:43:36 np0005541913.localdomain podman[91354]: 2025-12-02 08:43:36.455243571 +0000 UTC m=+0.095942216 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:43:36 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:43:36 np0005541913.localdomain systemd[1]: tmp-crun.nT9o1B.mount: Deactivated successfully.
Dec 02 08:43:36 np0005541913.localdomain podman[91353]: 2025-12-02 08:43:36.563396097 +0000 UTC m=+0.203155097 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
vendor=Red Hat, Inc., container_name=collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4)
Dec 02 08:43:36 np0005541913.localdomain podman[91353]: 2025-12-02 08:43:36.602054463 +0000 UTC m=+0.241813433 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 02 08:43:36 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:43:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:43:46 np0005541913.localdomain podman[91393]: 2025-12-02 08:43:46.44435111 +0000 UTC m=+0.087893420 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
architecture=x86_64, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:43:46 np0005541913.localdomain podman[91393]: 2025-12-02 08:43:46.640881196 +0000 UTC m=+0.284423486 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Dec 02 08:43:46 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: tmp-crun.P3D6Ew.mount: Deactivated successfully.
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: tmp-crun.2L0PG6.mount: Deactivated successfully.
Dec 02 08:43:55 np0005541913.localdomain podman[91426]: 2025-12-02 08:43:55.472715948 +0000 UTC m=+0.100691335 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:43:55 np0005541913.localdomain podman[91426]: 2025-12-02 08:43:55.501951949 +0000 UTC m=+0.129927386 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:43:55 np0005541913.localdomain podman[91425]: 2025-12-02 08:43:55.51825989 +0000 UTC m=+0.149246888 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Dec 02 08:43:55 np0005541913.localdomain podman[91425]: 2025-12-02 08:43:55.548387935 +0000 UTC m=+0.179374883 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:43:55 np0005541913.localdomain podman[91424]: 2025-12-02 08:43:55.447778803 +0000 UTC m=+0.086332796 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com)
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:43:55 np0005541913.localdomain podman[91423]: 2025-12-02 08:43:55.568223932 +0000 UTC m=+0.205829870 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, tcib_managed=true, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.openshift.expose-services=, container_name=logrotate_crond, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:43:55 np0005541913.localdomain podman[91438]: 2025-12-02 08:43:55.622474039 +0000 UTC m=+0.243145349 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:43:55 np0005541913.localdomain podman[91438]: 2025-12-02 08:43:55.681009513 +0000 UTC m=+0.301680783 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi)
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:43:55 np0005541913.localdomain podman[91423]: 2025-12-02 08:43:55.704082557 +0000 UTC m=+0.341688445 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, release=1761123044, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:43:55 np0005541913.localdomain podman[91424]: 2025-12-02 08:43:55.832058679 +0000 UTC m=+0.470612742 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target)
Dec 02 08:43:55 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:43:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:43:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:43:59 np0005541913.localdomain podman[91539]: 2025-12-02 08:43:59.44326629 +0000 UTC m=+0.085966977 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:43:59 np0005541913.localdomain podman[91540]: 2025-12-02 08:43:59.505435452 +0000 UTC m=+0.141976682 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:43:59 np0005541913.localdomain podman[91539]: 2025-12-02 08:43:59.516676976 +0000 UTC m=+0.159377643 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 08:43:59 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:43:59 np0005541913.localdomain podman[91540]: 2025-12-02 08:43:59.55967043 +0000 UTC m=+0.196211600 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12)
Dec 02 08:43:59 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:44:06 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:44:06 np0005541913.localdomain recover_tripleo_nova_virtqemud[91588]: 62312
Dec 02 08:44:06 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:44:06 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:44:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:44:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:44:07 np0005541913.localdomain systemd[1]: tmp-crun.wnDMCr.mount: Deactivated successfully.
Dec 02 08:44:07 np0005541913.localdomain podman[91590]: 2025-12-02 08:44:07.455634103 +0000 UTC m=+0.090131949 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Dec 02 08:44:07 np0005541913.localdomain systemd[1]: tmp-crun.PA4WNY.mount: Deactivated successfully.
Dec 02 08:44:07 np0005541913.localdomain podman[91589]: 2025-12-02 08:44:07.500688132 +0000 UTC m=+0.137914602 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container)
Dec 02 08:44:07 np0005541913.localdomain podman[91589]: 2025-12-02 08:44:07.532869213 +0000 UTC m=+0.170095683 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:44:07 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:44:07 np0005541913.localdomain podman[91590]: 2025-12-02 08:44:07.552932476 +0000 UTC m=+0.187430282 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, container_name=iscsid)
Dec 02 08:44:07 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:44:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:44:17 np0005541913.localdomain podman[91629]: 2025-12-02 08:44:17.4650008 +0000 UTC m=+0.089630445 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:44:17 np0005541913.localdomain podman[91629]: 2025-12-02 08:44:17.673947633 +0000 UTC m=+0.298577298 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, container_name=metrics_qdr, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:44:17 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:44:20 np0005541913.localdomain sudo[91658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:44:20 np0005541913.localdomain sudo[91658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:44:20 np0005541913.localdomain sudo[91658]: pam_unix(sudo:session): session closed for user root
Dec 02 08:44:20 np0005541913.localdomain sudo[91673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:44:20 np0005541913.localdomain sudo[91673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:44:21 np0005541913.localdomain sudo[91673]: pam_unix(sudo:session): session closed for user root
Dec 02 08:44:21 np0005541913.localdomain sudo[91720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:44:21 np0005541913.localdomain sudo[91720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:44:21 np0005541913.localdomain sudo[91720]: pam_unix(sudo:session): session closed for user root
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:44:26 np0005541913.localdomain podman[91738]: 2025-12-02 08:44:26.457429486 +0000 UTC m=+0.084011794 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:44:26 np0005541913.localdomain podman[91738]: 2025-12-02 08:44:26.509960027 +0000 UTC m=+0.136542335 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:44:26 np0005541913.localdomain podman[91737]: 2025-12-02 08:44:26.510789339 +0000 UTC m=+0.136336459 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:44:26 np0005541913.localdomain podman[91736]: 2025-12-02 08:44:26.56368356 +0000 UTC m=+0.192434977 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, 
release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:44:26 np0005541913.localdomain podman[91735]: 2025-12-02 08:44:26.607476105 +0000 UTC m=+0.235942154 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:44:26 np0005541913.localdomain podman[91735]: 2025-12-02 08:44:26.617927997 +0000 UTC m=+0.246394046 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, 
release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:44:26 np0005541913.localdomain podman[91739]: 2025-12-02 08:44:26.663374778 +0000 UTC m=+0.283662686 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:44:26 np0005541913.localdomain podman[91739]: 2025-12-02 08:44:26.688015093 +0000 UTC m=+0.308303041 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:44:26 np0005541913.localdomain podman[91737]: 2025-12-02 08:44:26.699297849 +0000 UTC m=+0.324844989 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4)
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:44:26 np0005541913.localdomain podman[91736]: 2025-12-02 08:44:26.947085352 +0000 UTC m=+0.575836739 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:44:26 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:44:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:44:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:44:30 np0005541913.localdomain podman[91850]: 2025-12-02 08:44:30.445434101 +0000 UTC m=+0.083533141 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public)
Dec 02 08:44:30 np0005541913.localdomain podman[91850]: 2025-12-02 08:44:30.494115018 +0000 UTC m=+0.132214028 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Dec 02 08:44:30 np0005541913.localdomain podman[91851]: 2025-12-02 08:44:30.507476369 +0000 UTC m=+0.141966352 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4)
Dec 02 08:44:30 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:44:30 np0005541913.localdomain podman[91851]: 2025-12-02 08:44:30.536056522 +0000 UTC m=+0.170546515 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4)
Dec 02 08:44:30 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:44:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:44:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:44:38 np0005541913.localdomain systemd[1]: tmp-crun.X0aibn.mount: Deactivated successfully.
Dec 02 08:44:38 np0005541913.localdomain podman[91898]: 2025-12-02 08:44:38.443906159 +0000 UTC m=+0.086230514 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd)
Dec 02 08:44:38 np0005541913.localdomain podman[91898]: 2025-12-02 08:44:38.449776488 +0000 UTC m=+0.092100853 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=collectd, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:44:38 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:44:38 np0005541913.localdomain podman[91899]: 2025-12-02 08:44:38.534211812 +0000 UTC m=+0.176794424 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 02 08:44:38 np0005541913.localdomain podman[91899]: 2025-12-02 08:44:38.546951417 +0000 UTC m=+0.189534059 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container)
Dec 02 08:44:38 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:44:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:44:48 np0005541913.localdomain systemd[1]: tmp-crun.wifcyg.mount: Deactivated successfully.
Dec 02 08:44:48 np0005541913.localdomain podman[91936]: 2025-12-02 08:44:48.444122128 +0000 UTC m=+0.087304817 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 02 08:44:48 np0005541913.localdomain podman[91936]: 2025-12-02 08:44:48.621765329 +0000 UTC m=+0.264947948 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:44:48 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: tmp-crun.cqvwyB.mount: Deactivated successfully.
Dec 02 08:44:57 np0005541913.localdomain podman[91973]: 2025-12-02 08:44:57.475063103 +0000 UTC m=+0.099922074 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.12)
Dec 02 08:44:57 np0005541913.localdomain podman[91967]: 2025-12-02 08:44:57.445830549 +0000 UTC m=+0.079032694 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:44:57 np0005541913.localdomain podman[91973]: 2025-12-02 08:44:57.505054309 +0000 UTC m=+0.129913270 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, 
container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:44:57 np0005541913.localdomain podman[91967]: 2025-12-02 08:44:57.527762249 +0000 UTC m=+0.160964424 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, config_id=tripleo_step5)
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:44:57 np0005541913.localdomain podman[91978]: 2025-12-02 08:44:57.569840579 +0000 UTC m=+0.192781919 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:44:57 np0005541913.localdomain podman[91978]: 2025-12-02 08:44:57.596972668 +0000 UTC m=+0.219913928 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:44:57 np0005541913.localdomain podman[91965]: 2025-12-02 08:44:57.61679273 +0000 UTC m=+0.256658814 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-cron, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 08:44:57 np0005541913.localdomain podman[91966]: 2025-12-02 08:44:57.571029442 +0000 UTC m=+0.205199544 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 
nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4, architecture=x86_64, container_name=nova_migration_target, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:44:57 np0005541913.localdomain podman[91965]: 2025-12-02 08:44:57.629017398 +0000 UTC m=+0.268883502 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, 
batch=17.1_20251118.1, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container)
Dec 02 08:44:57 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:44:57 np0005541913.localdomain podman[91966]: 2025-12-02 08:44:57.935151712 +0000 UTC m=+0.569321864 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:44:58 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:45:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:45:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:45:01 np0005541913.localdomain systemd[1]: tmp-crun.1weSe6.mount: Deactivated successfully.
Dec 02 08:45:01 np0005541913.localdomain podman[92086]: 2025-12-02 08:45:01.445534028 +0000 UTC m=+0.082763095 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, architecture=x86_64, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=)
Dec 02 08:45:01 np0005541913.localdomain podman[92085]: 2025-12-02 08:45:01.499746924 +0000 UTC m=+0.139403736 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:45:01 np0005541913.localdomain podman[92086]: 2025-12-02 08:45:01.550829445 +0000 UTC m=+0.188058532 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team)
Dec 02 08:45:01 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:45:01 np0005541913.localdomain podman[92085]: 2025-12-02 08:45:01.569929149 +0000 UTC m=+0.209585971 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 02 08:45:01 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:45:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:45:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:45:09 np0005541913.localdomain podman[92135]: 2025-12-02 08:45:09.428916622 +0000 UTC m=+0.070165655 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, container_name=iscsid, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:45:09 np0005541913.localdomain podman[92135]: 2025-12-02 08:45:09.467337524 +0000 UTC m=+0.108586497 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:45:09 np0005541913.localdomain systemd[1]: tmp-crun.16VdUl.mount: Deactivated successfully.
Dec 02 08:45:09 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:45:09 np0005541913.localdomain podman[92134]: 2025-12-02 08:45:09.492679465 +0000 UTC m=+0.136654552 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:45:09 np0005541913.localdomain podman[92134]: 2025-12-02 08:45:09.501275996 +0000 UTC m=+0.145251123 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=collectd, name=rhosp17/openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container)
Dec 02 08:45:09 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:45:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:45:19 np0005541913.localdomain systemd[1]: tmp-crun.rALoFN.mount: Deactivated successfully.
Dec 02 08:45:19 np0005541913.localdomain podman[92174]: 2025-12-02 08:45:19.456491651 +0000 UTC m=+0.098197989 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true)
Dec 02 08:45:19 np0005541913.localdomain podman[92174]: 2025-12-02 08:45:19.658381723 +0000 UTC m=+0.300088081 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc.)
Dec 02 08:45:19 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:45:22 np0005541913.localdomain sudo[92203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:45:22 np0005541913.localdomain sudo[92203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:45:22 np0005541913.localdomain sudo[92203]: pam_unix(sudo:session): session closed for user root
Dec 02 08:45:22 np0005541913.localdomain sudo[92218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:45:22 np0005541913.localdomain sudo[92218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:45:22 np0005541913.localdomain sudo[92218]: pam_unix(sudo:session): session closed for user root
Dec 02 08:45:26 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:45:26 np0005541913.localdomain recover_tripleo_nova_virtqemud[92265]: 62312
Dec 02 08:45:26 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:45:26 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:45:27 np0005541913.localdomain sudo[92266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:45:27 np0005541913.localdomain sudo[92266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:45:27 np0005541913.localdomain sudo[92266]: pam_unix(sudo:session): session closed for user root
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:45:28 np0005541913.localdomain podman[92284]: 2025-12-02 08:45:28.458279764 +0000 UTC m=+0.081743787 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: tmp-crun.wt97bW.mount: Deactivated successfully.
Dec 02 08:45:28 np0005541913.localdomain podman[92283]: 2025-12-02 08:45:28.525997544 +0000 UTC m=+0.151478151 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:45:28 np0005541913.localdomain podman[92284]: 2025-12-02 08:45:28.537008638 +0000 UTC m=+0.160472631 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 
17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public)
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:45:28 np0005541913.localdomain podman[92283]: 2025-12-02 08:45:28.556954315 +0000 UTC m=+0.182434992 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step5, release=1761123044, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12)
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:45:28 np0005541913.localdomain podman[92290]: 2025-12-02 08:45:28.618931309 +0000 UTC m=+0.241087416 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:45:28 np0005541913.localdomain podman[92281]: 2025-12-02 08:45:28.667845903 +0000 UTC m=+0.301929261 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:45:28 np0005541913.localdomain podman[92281]: 2025-12-02 08:45:28.675299583 +0000 UTC m=+0.309382941 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:45:28 np0005541913.localdomain podman[92282]: 2025-12-02 08:45:28.717579449 +0000 UTC m=+0.346187580 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:45:28 np0005541913.localdomain podman[92290]: 2025-12-02 08:45:28.725772259 +0000 UTC m=+0.347928366 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:45:28 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:45:29 np0005541913.localdomain podman[92282]: 2025-12-02 08:45:29.02814136 +0000 UTC m=+0.656749551 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, 
Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:45:29 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:45:32 np0005541913.localdomain sshd[92403]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 08:45:32 np0005541913.localdomain sshd[92403]: error: kex_exchange_identification: Connection closed by remote host
Dec 02 08:45:32 np0005541913.localdomain sshd[92403]: Connection closed by 217.170.199.90 port 48552
Dec 02 08:45:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:45:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:45:32 np0005541913.localdomain podman[92405]: 2025-12-02 08:45:32.453886923 +0000 UTC m=+0.083962746 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:45:32 np0005541913.localdomain podman[92404]: 2025-12-02 08:45:32.512201959 +0000 UTC m=+0.142115088 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64)
Dec 02 08:45:32 np0005541913.localdomain systemd[1]: tmp-crun.w7UlE6.mount: Deactivated successfully.
Dec 02 08:45:32 np0005541913.localdomain podman[92405]: 2025-12-02 08:45:32.503070034 +0000 UTC m=+0.133145807 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, container_name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:45:32 np0005541913.localdomain podman[92404]: 2025-12-02 08:45:32.554109514 +0000 UTC m=+0.184022663 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4)
Dec 02 08:45:32 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:45:32 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:45:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:45:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:45:40 np0005541913.localdomain systemd[1]: tmp-crun.dyuYyu.mount: Deactivated successfully.
Dec 02 08:45:40 np0005541913.localdomain podman[92452]: 2025-12-02 08:45:40.453211649 +0000 UTC m=+0.090077390 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc.)
Dec 02 08:45:40 np0005541913.localdomain podman[92452]: 2025-12-02 08:45:40.469049434 +0000 UTC m=+0.105915225 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z)
Dec 02 08:45:40 np0005541913.localdomain podman[92451]: 2025-12-02 08:45:40.519652544 +0000 UTC m=+0.159393283 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1)
Dec 02 08:45:40 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:45:40 np0005541913.localdomain podman[92451]: 2025-12-02 08:45:40.579883271 +0000 UTC m=+0.219624050 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-collectd, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=collectd)
Dec 02 08:45:40 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:45:41 np0005541913.localdomain systemd[1]: tmp-crun.hylmaH.mount: Deactivated successfully.
Dec 02 08:45:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:45:50 np0005541913.localdomain podman[92488]: 2025-12-02 08:45:50.438688533 +0000 UTC m=+0.080819212 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, release=1761123044, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:45:50 np0005541913.localdomain podman[92488]: 2025-12-02 08:45:50.640576265 +0000 UTC m=+0.282706914 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr)
Dec 02 08:45:50 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:45:59 np0005541913.localdomain podman[92517]: 2025-12-02 08:45:59.458417883 +0000 UTC m=+0.091536529 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: tmp-crun.Y7lH0V.mount: Deactivated successfully.
Dec 02 08:45:59 np0005541913.localdomain podman[92518]: 2025-12-02 08:45:59.559510789 +0000 UTC m=+0.190483238 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:45:59 np0005541913.localdomain podman[92519]: 2025-12-02 08:45:59.53011272 +0000 UTC m=+0.159125236 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=)
Dec 02 08:45:59 np0005541913.localdomain podman[92518]: 2025-12-02 08:45:59.586877314 +0000 UTC m=+0.217849763 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, config_id=tripleo_step5, vendor=Red Hat, Inc., distribution-scope=public)
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:45:59 np0005541913.localdomain podman[92519]: 2025-12-02 08:45:59.612866452 +0000 UTC m=+0.241878948 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:45:59 np0005541913.localdomain podman[92516]: 2025-12-02 08:45:59.666991696 +0000 UTC m=+0.303227675 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=logrotate_crond, io.openshift.expose-services=, 
tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_id=tripleo_step4, release=1761123044)
Dec 02 08:45:59 np0005541913.localdomain podman[92516]: 2025-12-02 08:45:59.678488875 +0000 UTC m=+0.314724844 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:45:59 np0005541913.localdomain podman[92522]: 2025-12-02 08:45:59.721915751 +0000 UTC m=+0.347183047 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:45:59 np0005541913.localdomain podman[92522]: 2025-12-02 08:45:59.75202628 +0000 UTC m=+0.377293616 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:45:59 np0005541913.localdomain podman[92517]: 2025-12-02 08:45:59.819078161 +0000 UTC m=+0.452196807 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Dec 02 08:45:59 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:46:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:46:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:46:03 np0005541913.localdomain podman[92639]: 2025-12-02 08:46:03.442567755 +0000 UTC m=+0.076689291 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 02 08:46:03 np0005541913.localdomain podman[92639]: 2025-12-02 08:46:03.484157881 +0000 UTC m=+0.118279407 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:46:03 np0005541913.localdomain podman[92640]: 2025-12-02 08:46:03.49600722 +0000 UTC m=+0.125879662 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git)
Dec 02 08:46:03 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:46:03 np0005541913.localdomain podman[92640]: 2025-12-02 08:46:03.543825745 +0000 UTC m=+0.173698197 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4)
Dec 02 08:46:03 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:46:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:46:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:46:11 np0005541913.localdomain podman[92685]: 2025-12-02 08:46:11.430554217 +0000 UTC m=+0.070859114 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack 
Platform 17.1 collectd, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:46:11 np0005541913.localdomain podman[92685]: 2025-12-02 08:46:11.438829679 +0000 UTC m=+0.079134596 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, version=17.1.12, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Dec 02 08:46:11 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:46:11 np0005541913.localdomain systemd[1]: tmp-crun.cBfkSW.mount: Deactivated successfully.
Dec 02 08:46:11 np0005541913.localdomain podman[92686]: 2025-12-02 08:46:11.501849961 +0000 UTC m=+0.138370777 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public)
Dec 02 08:46:11 np0005541913.localdomain podman[92686]: 2025-12-02 08:46:11.538081024 +0000 UTC m=+0.174601850 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, version=17.1.12, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 02 08:46:11 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:46:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:46:21 np0005541913.localdomain podman[92722]: 2025-12-02 08:46:21.425845122 +0000 UTC m=+0.073026462 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1)
Dec 02 08:46:21 np0005541913.localdomain podman[92722]: 2025-12-02 08:46:21.614053627 +0000 UTC m=+0.261234967 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true)
Dec 02 08:46:21 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:46:27 np0005541913.localdomain sudo[92752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:46:27 np0005541913.localdomain sudo[92752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:46:27 np0005541913.localdomain sudo[92752]: pam_unix(sudo:session): session closed for user root
Dec 02 08:46:27 np0005541913.localdomain sudo[92767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:46:27 np0005541913.localdomain sudo[92767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:46:28 np0005541913.localdomain sudo[92767]: pam_unix(sudo:session): session closed for user root
Dec 02 08:46:28 np0005541913.localdomain sudo[92814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:46:28 np0005541913.localdomain sudo[92814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:46:28 np0005541913.localdomain sudo[92814]: pam_unix(sudo:session): session closed for user root
Dec 02 08:46:28 np0005541913.localdomain sudo[92829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 08:46:28 np0005541913.localdomain sudo[92829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:46:28 np0005541913.localdomain sudo[92829]: pam_unix(sudo:session): session closed for user root
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: tmp-crun.hqydS2.mount: Deactivated successfully.
Dec 02 08:46:30 np0005541913.localdomain podman[92863]: 2025-12-02 08:46:30.466977269 +0000 UTC m=+0.102708699 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron)
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: tmp-crun.bE1VCf.mount: Deactivated successfully.
Dec 02 08:46:30 np0005541913.localdomain podman[92866]: 2025-12-02 08:46:30.506911683 +0000 UTC m=+0.135572793 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_compute, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:46:30 np0005541913.localdomain podman[92867]: 2025-12-02 08:46:30.514705391 +0000 UTC m=+0.140607967 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:46:30 np0005541913.localdomain podman[92863]: 2025-12-02 08:46:30.525765108 +0000 UTC m=+0.161496508 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, architecture=x86_64, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044)
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:46:30 np0005541913.localdomain podman[92866]: 2025-12-02 08:46:30.537565616 +0000 UTC m=+0.166226796 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:46:30 np0005541913.localdomain podman[92867]: 2025-12-02 08:46:30.559432193 +0000 UTC m=+0.185334689 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:46:30 np0005541913.localdomain podman[92865]: 2025-12-02 08:46:30.568683902 +0000 UTC m=+0.201848613 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:46:30 np0005541913.localdomain podman[92864]: 2025-12-02 08:46:30.602329806 +0000 UTC m=+0.237399748 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:46:30 np0005541913.localdomain podman[92865]: 2025-12-02 08:46:30.618990763 +0000 UTC m=+0.252155424 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-type=git, container_name=nova_compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, config_id=tripleo_step5)
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:46:30 np0005541913.localdomain podman[92864]: 2025-12-02 08:46:30.976996148 +0000 UTC m=+0.612066140 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_migration_target, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4)
Dec 02 08:46:30 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:46:32 np0005541913.localdomain sudo[92979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:46:32 np0005541913.localdomain sudo[92979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:46:32 np0005541913.localdomain sudo[92979]: pam_unix(sudo:session): session closed for user root
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:46:34 np0005541913.localdomain recover_tripleo_nova_virtqemud[93004]: 62312
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: tmp-crun.cyK3p6.mount: Deactivated successfully.
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: tmp-crun.n63NaA.mount: Deactivated successfully.
Dec 02 08:46:34 np0005541913.localdomain podman[92995]: 2025-12-02 08:46:34.623488989 +0000 UTC m=+0.259424199 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 02 08:46:34 np0005541913.localdomain podman[92994]: 2025-12-02 08:46:34.575813189 +0000 UTC m=+0.214636956 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:46:34 np0005541913.localdomain podman[92994]: 2025-12-02 08:46:34.70913089 +0000 UTC m=+0.347954637 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true)
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:46:34 np0005541913.localdomain podman[92995]: 2025-12-02 08:46:34.729506327 +0000 UTC m=+0.365441507 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public)
Dec 02 08:46:34 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:46:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:46:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:46:42 np0005541913.localdomain podman[93043]: 2025-12-02 08:46:42.447826585 +0000 UTC m=+0.089248008 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, container_name=collectd, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 02 08:46:42 np0005541913.localdomain podman[93043]: 2025-12-02 08:46:42.458883542 +0000 UTC m=+0.100304935 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3)
Dec 02 08:46:42 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:46:42 np0005541913.localdomain systemd[1]: tmp-crun.s8H7Hd.mount: Deactivated successfully.
Dec 02 08:46:42 np0005541913.localdomain podman[93044]: 2025-12-02 08:46:42.552150977 +0000 UTC m=+0.190407025 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:46:42 np0005541913.localdomain podman[93044]: 2025-12-02 08:46:42.566994197 +0000 UTC m=+0.205250215 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, release=1761123044, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:46:42 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:46:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:46:52 np0005541913.localdomain podman[93081]: 2025-12-02 08:46:52.492859696 +0000 UTC m=+0.132010987 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack 
Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team)
Dec 02 08:46:52 np0005541913.localdomain podman[93081]: 2025-12-02 08:46:52.688916372 +0000 UTC m=+0.328067603 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red 
Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step1)
Dec 02 08:46:52 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: tmp-crun.czvx4g.mount: Deactivated successfully.
Dec 02 08:47:01 np0005541913.localdomain podman[93110]: 2025-12-02 08:47:01.457748686 +0000 UTC m=+0.089897826 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, release=1761123044, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:47:01 np0005541913.localdomain podman[93110]: 2025-12-02 08:47:01.498978443 +0000 UTC m=+0.131127533 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:47:01 np0005541913.localdomain podman[93113]: 2025-12-02 08:47:01.470986102 +0000 UTC m=+0.091386156 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute)
Dec 02 08:47:01 np0005541913.localdomain podman[93113]: 2025-12-02 08:47:01.550303942 +0000 UTC m=+0.170704016 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, distribution-scope=public, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64)
Dec 02 08:47:01 np0005541913.localdomain podman[93112]: 2025-12-02 08:47:01.49884284 +0000 UTC m=+0.124769583 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step5, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, io.openshift.expose-services=)
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:47:01 np0005541913.localdomain podman[93119]: 2025-12-02 08:47:01.562075088 +0000 UTC m=+0.183812308 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, architecture=x86_64, release=1761123044, 
config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:47:01 np0005541913.localdomain podman[93111]: 2025-12-02 08:47:01.610649883 +0000 UTC m=+0.237975683 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:47:01 np0005541913.localdomain podman[93112]: 2025-12-02 08:47:01.630444525 +0000 UTC m=+0.256371228 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vendor=Red Hat, Inc., container_name=nova_compute, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:47:01 np0005541913.localdomain podman[93119]: 2025-12-02 08:47:01.687549538 +0000 UTC m=+0.309286768 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 02 08:47:01 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:47:02 np0005541913.localdomain podman[93111]: 2025-12-02 08:47:02.016980717 +0000 UTC m=+0.644306477 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_migration_target, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 02 08:47:02 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:47:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:47:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:47:05 np0005541913.localdomain podman[93232]: 2025-12-02 08:47:05.445868852 +0000 UTC m=+0.082814426 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, 
managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public)
Dec 02 08:47:05 np0005541913.localdomain podman[93232]: 2025-12-02 08:47:05.467217855 +0000 UTC m=+0.104163369 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Dec 02 08:47:05 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Deactivated successfully.
Dec 02 08:47:05 np0005541913.localdomain podman[93231]: 2025-12-02 08:47:05.54446753 +0000 UTC m=+0.184957088 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:47:05 np0005541913.localdomain podman[93231]: 2025-12-02 08:47:05.617362988 +0000 UTC m=+0.257852506 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 02 08:47:05 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 4776 writes, 21K keys, 4776 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4776 writes, 569 syncs, 8.39 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:47:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:47:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:47:13 np0005541913.localdomain systemd[1]: tmp-crun.1urBdZ.mount: Deactivated successfully.
Dec 02 08:47:13 np0005541913.localdomain podman[93280]: 2025-12-02 08:47:13.431524756 +0000 UTC m=+0.078964171 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, release=1761123044, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:47:13 np0005541913.localdomain podman[93280]: 2025-12-02 08:47:13.463781143 +0000 UTC m=+0.111220618 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 02 08:47:13 np0005541913.localdomain systemd[1]: tmp-crun.Kt1rVX.mount: Deactivated successfully.
Dec 02 08:47:13 np0005541913.localdomain podman[93281]: 2025-12-02 08:47:13.482010972 +0000 UTC m=+0.124603627 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, release=1761123044, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Dec 02 08:47:13 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:47:13 np0005541913.localdomain podman[93281]: 2025-12-02 08:47:13.49086423 +0000 UTC m=+0.133456875 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:47:13 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:47:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:47:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.2 total, 600.0 interval
                                                          Cumulative writes: 5722 writes, 25K keys, 5722 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5722 writes, 780 syncs, 7.34 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:47:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:47:23 np0005541913.localdomain podman[93319]: 2025-12-02 08:47:23.433228186 +0000 UTC m=+0.078413737 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1)
Dec 02 08:47:23 np0005541913.localdomain podman[93319]: 2025-12-02 08:47:23.629831426 +0000 UTC m=+0.275016927 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, version=17.1.12, release=1761123044, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Dec 02 08:47:23 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: tmp-crun.6lJxol.mount: Deactivated successfully.
Dec 02 08:47:32 np0005541913.localdomain podman[93349]: 2025-12-02 08:47:32.442851775 +0000 UTC m=+0.075849338 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:47:32 np0005541913.localdomain podman[93356]: 2025-12-02 08:47:32.50482242 +0000 UTC m=+0.129306864 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z)
Dec 02 08:47:32 np0005541913.localdomain podman[93348]: 2025-12-02 08:47:32.464765003 +0000 UTC m=+0.099688998 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, 
distribution-scope=public, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4)
Dec 02 08:47:32 np0005541913.localdomain podman[93349]: 2025-12-02 08:47:32.52681027 +0000 UTC m=+0.159807833 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step5)
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:47:32 np0005541913.localdomain podman[93350]: 2025-12-02 08:47:32.492757755 +0000 UTC m=+0.121440282 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, architecture=x86_64, release=1761123044, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:47:32 np0005541913.localdomain podman[93356]: 2025-12-02 08:47:32.582971199 +0000 UTC m=+0.207455713 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, architecture=x86_64, 
config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z)
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:47:32 np0005541913.localdomain podman[93347]: 2025-12-02 08:47:32.595337291 +0000 UTC m=+0.232684991 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, distribution-scope=public, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Dec 02 08:47:32 np0005541913.localdomain podman[93347]: 2025-12-02 08:47:32.606890441 +0000 UTC m=+0.244238131 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, 
name=rhosp17/openstack-cron, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container)
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:47:32 np0005541913.localdomain podman[93350]: 2025-12-02 08:47:32.626052596 +0000 UTC m=+0.254735133 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, 
name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=)
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:47:32 np0005541913.localdomain sudo[93467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:47:32 np0005541913.localdomain sudo[93467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:47:32 np0005541913.localdomain sudo[93467]: pam_unix(sudo:session): session closed for user root
Dec 02 08:47:32 np0005541913.localdomain sudo[93482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:47:32 np0005541913.localdomain sudo[93482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:47:32 np0005541913.localdomain podman[93348]: 2025-12-02 08:47:32.834911736 +0000 UTC m=+0.469835681 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, distribution-scope=public, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:47:32 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:47:33 np0005541913.localdomain sudo[93482]: pam_unix(sudo:session): session closed for user root
Dec 02 08:47:33 np0005541913.localdomain sudo[93528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:47:33 np0005541913.localdomain sudo[93528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:47:33 np0005541913.localdomain sudo[93528]: pam_unix(sudo:session): session closed for user root
Dec 02 08:47:33 np0005541913.localdomain sudo[93543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- inventory --format=json-pretty --filter-for-batch
Dec 02 08:47:33 np0005541913.localdomain sudo[93543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:47:34 np0005541913.localdomain podman[93599]: 
Dec 02 08:47:34 np0005541913.localdomain podman[93599]: 2025-12-02 08:47:34.22156929 +0000 UTC m=+0.050934080 container create d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_hopper, build-date=2025-11-26T19:44:28Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, ceph=True, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=1763362218)
Dec 02 08:47:34 np0005541913.localdomain systemd[1]: Started libpod-conmon-d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645.scope.
Dec 02 08:47:34 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:47:34 np0005541913.localdomain podman[93599]: 2025-12-02 08:47:34.299845762 +0000 UTC m=+0.129210542 container init d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_hopper, distribution-scope=public, ceph=True, name=rhceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 08:47:34 np0005541913.localdomain podman[93599]: 2025-12-02 08:47:34.201014317 +0000 UTC m=+0.030379127 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 08:47:34 np0005541913.localdomain podman[93599]: 2025-12-02 08:47:34.316131399 +0000 UTC m=+0.145496199 container start d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_hopper, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Dec 02 08:47:34 np0005541913.localdomain podman[93599]: 2025-12-02 08:47:34.316413126 +0000 UTC m=+0.145777966 container attach d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_hopper, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, description=Red Hat Ceph Storage 7)
Dec 02 08:47:34 np0005541913.localdomain eager_hopper[93614]: 167 167
Dec 02 08:47:34 np0005541913.localdomain systemd[1]: libpod-d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645.scope: Deactivated successfully.
Dec 02 08:47:34 np0005541913.localdomain podman[93599]: 2025-12-02 08:47:34.31948659 +0000 UTC m=+0.148851430 container died d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_hopper, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218)
Dec 02 08:47:34 np0005541913.localdomain podman[93621]: 2025-12-02 08:47:34.432840684 +0000 UTC m=+0.101143308 container remove d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_hopper, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=1763362218, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 08:47:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-de2fc7a818f86380606426cf501fef0c61cec998eca7d5052a0b9382390fe4bc-merged.mount: Deactivated successfully.
Dec 02 08:47:34 np0005541913.localdomain systemd[1]: libpod-conmon-d1ad154b99925e7db1c9dc3e9b884fef57f0e9c016d1300c44ea436e75308645.scope: Deactivated successfully.
Dec 02 08:47:34 np0005541913.localdomain podman[93641]: 
Dec 02 08:47:34 np0005541913.localdomain podman[93641]: 2025-12-02 08:47:34.66240124 +0000 UTC m=+0.078442368 container create 21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_jang, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Dec 02 08:47:34 np0005541913.localdomain systemd[1]: Started libpod-conmon-21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034.scope.
Dec 02 08:47:34 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 08:47:34 np0005541913.localdomain podman[93641]: 2025-12-02 08:47:34.630240965 +0000 UTC m=+0.046282123 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 08:47:34 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97fa7eb2d0580ab8ea0f84e0ec946783783d366923a5943919feb94a484a5106/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 08:47:34 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97fa7eb2d0580ab8ea0f84e0ec946783783d366923a5943919feb94a484a5106/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 08:47:34 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97fa7eb2d0580ab8ea0f84e0ec946783783d366923a5943919feb94a484a5106/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 08:47:34 np0005541913.localdomain podman[93641]: 2025-12-02 08:47:34.735851882 +0000 UTC m=+0.151893020 container init 21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_jang, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, version=7, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 02 08:47:34 np0005541913.localdomain podman[93641]: 2025-12-02 08:47:34.747903086 +0000 UTC m=+0.163944214 container start 21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_jang, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-type=git, release=1763362218, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main)
Dec 02 08:47:34 np0005541913.localdomain podman[93641]: 2025-12-02 08:47:34.748157423 +0000 UTC m=+0.164198591 container attach 21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_jang, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=)
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]: [
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:     {
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         "available": false,
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         "ceph_device": false,
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         "lsm_data": {},
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         "lvs": [],
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         "path": "/dev/sr0",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         "rejected_reasons": [
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "Insufficient space (<5GB)",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "Has a FileSystem"
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         ],
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         "sys_api": {
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "actuators": null,
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "device_nodes": "sr0",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "human_readable_size": "482.00 KB",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "id_bus": "ata",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "model": "QEMU DVD-ROM",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "nr_requests": "2",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "partitions": {},
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "path": "/dev/sr0",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "removable": "1",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "rev": "2.5+",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "ro": "0",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "rotational": "1",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "sas_address": "",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "sas_device_handle": "",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "scheduler_mode": "mq-deadline",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "sectors": 0,
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "sectorsize": "2048",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "size": 493568.0,
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "support_discard": "0",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "type": "disk",
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:             "vendor": "QEMU"
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:         }
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]:     }
Dec 02 08:47:35 np0005541913.localdomain blissful_jang[93656]: ]
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: libpod-21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034.scope: Deactivated successfully.
Dec 02 08:47:35 np0005541913.localdomain podman[93641]: 2025-12-02 08:47:35.692043475 +0000 UTC m=+1.108084603 container died 21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_jang, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, name=rhceph, release=1763362218, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container)
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-97fa7eb2d0580ab8ea0f84e0ec946783783d366923a5943919feb94a484a5106-merged.mount: Deactivated successfully.
Dec 02 08:47:35 np0005541913.localdomain podman[95513]: 2025-12-02 08:47:35.782775292 +0000 UTC m=+0.083450933 container remove 21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_jang, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_CLEAN=True, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, io.openshift.expose-services=)
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: libpod-conmon-21e5ea614fce29ed4fe452b65b2ab3f0058dd32793b3da06fa5faf6e9e14f034.scope: Deactivated successfully.
Dec 02 08:47:35 np0005541913.localdomain sudo[93543]: pam_unix(sudo:session): session closed for user root
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: tmp-crun.XaAAmd.mount: Deactivated successfully.
Dec 02 08:47:35 np0005541913.localdomain podman[95522]: 2025-12-02 08:47:35.86275898 +0000 UTC m=+0.142206840 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git)
Dec 02 08:47:35 np0005541913.localdomain podman[95522]: 2025-12-02 08:47:35.883161228 +0000 UTC m=+0.162609128 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, tcib_managed=true)
Dec 02 08:47:35 np0005541913.localdomain podman[95522]: unhealthy
Dec 02 08:47:35 np0005541913.localdomain podman[95519]: 2025-12-02 08:47:35.840716399 +0000 UTC m=+0.120328453 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:47:35 np0005541913.localdomain podman[95519]: 2025-12-02 08:47:35.923114951 +0000 UTC m=+0.202726985 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:47:35 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:47:38 np0005541913.localdomain sudo[95575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:47:38 np0005541913.localdomain sudo[95575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:47:38 np0005541913.localdomain sudo[95575]: pam_unix(sudo:session): session closed for user root
Dec 02 08:47:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:47:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:47:44 np0005541913.localdomain podman[95591]: 2025-12-02 08:47:44.459685086 +0000 UTC m=+0.090297766 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Dec 02 08:47:44 np0005541913.localdomain systemd[1]: tmp-crun.2P7hxy.mount: Deactivated successfully.
Dec 02 08:47:44 np0005541913.localdomain podman[95590]: 2025-12-02 08:47:44.516173643 +0000 UTC m=+0.148146439 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, release=1761123044, config_id=tripleo_step3)
Dec 02 08:47:44 np0005541913.localdomain podman[95591]: 2025-12-02 08:47:44.525573926 +0000 UTC m=+0.156186556 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:47:44 np0005541913.localdomain podman[95590]: 2025-12-02 08:47:44.529075 +0000 UTC m=+0.161047846 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, release=1761123044, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=collectd, com.redhat.component=openstack-collectd-container)
Dec 02 08:47:44 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:47:44 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:47:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:47:54 np0005541913.localdomain systemd[1]: tmp-crun.Vl07kP.mount: Deactivated successfully.
Dec 02 08:47:54 np0005541913.localdomain podman[95629]: 2025-12-02 08:47:54.453244974 +0000 UTC m=+0.097298435 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step1, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team)
Dec 02 08:47:54 np0005541913.localdomain podman[95629]: 2025-12-02 08:47:54.689182881 +0000 UTC m=+0.333236312 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, version=17.1.12, vcs-type=git)
Dec 02 08:47:54 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:48:03 np0005541913.localdomain recover_tripleo_nova_virtqemud[95685]: 62312
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: tmp-crun.ktcNvC.mount: Deactivated successfully.
Dec 02 08:48:03 np0005541913.localdomain podman[95662]: 2025-12-02 08:48:03.453759598 +0000 UTC m=+0.084723746 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 02 08:48:03 np0005541913.localdomain podman[95662]: 2025-12-02 08:48:03.48881646 +0000 UTC m=+0.119780578 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:48:03 np0005541913.localdomain podman[95660]: 2025-12-02 08:48:03.513416311 +0000 UTC m=+0.147540714 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 02 08:48:03 np0005541913.localdomain podman[95660]: 2025-12-02 08:48:03.545047821 +0000 UTC m=+0.179172254 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_compute, version=17.1.12, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, distribution-scope=public)
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:48:03 np0005541913.localdomain podman[95659]: 2025-12-02 08:48:03.594849909 +0000 UTC m=+0.228808177 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.buildah.version=1.41.4, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com)
Dec 02 08:48:03 np0005541913.localdomain podman[95661]: 2025-12-02 08:48:03.553359854 +0000 UTC m=+0.181604129 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4)
Dec 02 08:48:03 np0005541913.localdomain podman[95658]: 2025-12-02 08:48:03.654556172 +0000 UTC m=+0.289880497 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-cron-container, release=1761123044, version=17.1.12, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public)
Dec 02 08:48:03 np0005541913.localdomain podman[95658]: 2025-12-02 08:48:03.688012921 +0000 UTC m=+0.323337276 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public)
Dec 02 08:48:03 np0005541913.localdomain podman[95661]: 2025-12-02 08:48:03.686778228 +0000 UTC m=+0.315022453 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true)
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:48:03 np0005541913.localdomain podman[95659]: 2025-12-02 08:48:03.941655714 +0000 UTC m=+0.575614032 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, 
url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:48:03 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:48:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:48:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:48:06 np0005541913.localdomain systemd[1]: tmp-crun.GYZbCk.mount: Deactivated successfully.
Dec 02 08:48:06 np0005541913.localdomain podman[95778]: 2025-12-02 08:48:06.449727149 +0000 UTC m=+0.089516016 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com)
Dec 02 08:48:06 np0005541913.localdomain podman[95779]: 2025-12-02 08:48:06.480788963 +0000 UTC m=+0.121231227 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 
17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller)
Dec 02 08:48:06 np0005541913.localdomain podman[95779]: 2025-12-02 08:48:06.505975329 +0000 UTC m=+0.146417653 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Dec 02 08:48:06 np0005541913.localdomain podman[95779]: unhealthy
Dec 02 08:48:06 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:48:06 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:48:06 np0005541913.localdomain podman[95778]: 2025-12-02 08:48:06.531064763 +0000 UTC m=+0.170853680 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Dec 02 08:48:06 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Deactivated successfully.
Dec 02 08:48:07 np0005541913.localdomain systemd[1]: tmp-crun.VFUmya.mount: Deactivated successfully.
Dec 02 08:48:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:48:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:48:15 np0005541913.localdomain systemd[1]: tmp-crun.TccHNc.mount: Deactivated successfully.
Dec 02 08:48:15 np0005541913.localdomain podman[95829]: 2025-12-02 08:48:15.441397688 +0000 UTC m=+0.078924861 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, container_name=iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-iscsid, version=17.1.12, batch=17.1_20251118.1)
Dec 02 08:48:15 np0005541913.localdomain podman[95828]: 2025-12-02 08:48:15.458181009 +0000 UTC m=+0.094276574 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, container_name=collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible)
Dec 02 08:48:15 np0005541913.localdomain podman[95829]: 2025-12-02 08:48:15.477752374 +0000 UTC m=+0.115279507 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid)
Dec 02 08:48:15 np0005541913.localdomain podman[95828]: 2025-12-02 08:48:15.478445743 +0000 UTC m=+0.114541308 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:48:15 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:48:15 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:48:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:48:25 np0005541913.localdomain podman[95867]: 2025-12-02 08:48:25.446538685 +0000 UTC m=+0.089825274 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step1, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:48:25 np0005541913.localdomain podman[95867]: 2025-12-02 08:48:25.674056386 +0000 UTC m=+0.317342975 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd)
Dec 02 08:48:25 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:48:34 np0005541913.localdomain podman[95900]: 2025-12-02 08:48:34.456944468 +0000 UTC m=+0.082748594 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: tmp-crun.u6Tgem.mount: Deactivated successfully.
Dec 02 08:48:34 np0005541913.localdomain podman[95902]: 2025-12-02 08:48:34.518867041 +0000 UTC m=+0.140112505 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:48:34 np0005541913.localdomain podman[95900]: 2025-12-02 08:48:34.536374251 +0000 UTC m=+0.162178427 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, container_name=nova_compute, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: tmp-crun.VWwctY.mount: Deactivated successfully.
Dec 02 08:48:34 np0005541913.localdomain podman[95902]: 2025-12-02 08:48:34.579051187 +0000 UTC m=+0.200296691 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:48:34 np0005541913.localdomain podman[95908]: 2025-12-02 08:48:34.579237522 +0000 UTC m=+0.198108682 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044)
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:48:34 np0005541913.localdomain podman[95908]: 2025-12-02 08:48:34.606849583 +0000 UTC m=+0.225720733 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true)
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:48:34 np0005541913.localdomain podman[95898]: 2025-12-02 08:48:34.689677388 +0000 UTC m=+0.321509877 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, tcib_managed=true, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=logrotate_crond, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4)
Dec 02 08:48:34 np0005541913.localdomain podman[95899]: 2025-12-02 08:48:34.728905912 +0000 UTC m=+0.357347239 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:48:34 np0005541913.localdomain podman[95898]: 2025-12-02 08:48:34.747541962 +0000 UTC m=+0.379374430 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team)
Dec 02 08:48:34 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:48:35 np0005541913.localdomain podman[95899]: 2025-12-02 08:48:35.120110629 +0000 UTC m=+0.748551976 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:48:35 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:48:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:48:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:48:37 np0005541913.localdomain podman[96020]: 2025-12-02 08:48:37.440852142 +0000 UTC m=+0.083323409 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:48:37 np0005541913.localdomain podman[96020]: 2025-12-02 08:48:37.490091504 +0000 UTC m=+0.132562751 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:48:37 np0005541913.localdomain podman[96020]: unhealthy
Dec 02 08:48:37 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:48:37 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:48:37 np0005541913.localdomain podman[96021]: 2025-12-02 08:48:37.489879419 +0000 UTC m=+0.130390833 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible)
Dec 02 08:48:37 np0005541913.localdomain podman[96021]: 2025-12-02 08:48:37.570803272 +0000 UTC m=+0.211314676 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Dec 02 08:48:37 np0005541913.localdomain podman[96021]: unhealthy
Dec 02 08:48:37 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:48:37 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:48:39 np0005541913.localdomain sudo[96061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:48:39 np0005541913.localdomain sudo[96061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:48:39 np0005541913.localdomain sudo[96061]: pam_unix(sudo:session): session closed for user root
Dec 02 08:48:39 np0005541913.localdomain sudo[96076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 08:48:39 np0005541913.localdomain sudo[96076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:48:39 np0005541913.localdomain sudo[96076]: pam_unix(sudo:session): session closed for user root
Dec 02 08:48:39 np0005541913.localdomain sudo[96112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:48:39 np0005541913.localdomain sudo[96112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:48:39 np0005541913.localdomain sudo[96112]: pam_unix(sudo:session): session closed for user root
Dec 02 08:48:39 np0005541913.localdomain sudo[96127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:48:39 np0005541913.localdomain sudo[96127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:48:40 np0005541913.localdomain sudo[96127]: pam_unix(sudo:session): session closed for user root
Dec 02 08:48:41 np0005541913.localdomain sudo[96174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:48:41 np0005541913.localdomain sudo[96174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:48:41 np0005541913.localdomain sudo[96174]: pam_unix(sudo:session): session closed for user root
Dec 02 08:48:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:48:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:48:46 np0005541913.localdomain podman[96190]: 2025-12-02 08:48:46.4604208 +0000 UTC m=+0.098158507 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, com.redhat.component=openstack-iscsid-container, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Dec 02 08:48:46 np0005541913.localdomain podman[96190]: 2025-12-02 08:48:46.468798955 +0000 UTC m=+0.106536642 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:48:46 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:48:46 np0005541913.localdomain podman[96189]: 2025-12-02 08:48:46.434636618 +0000 UTC m=+0.075398056 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Dec 02 08:48:46 np0005541913.localdomain podman[96189]: 2025-12-02 08:48:46.519366774 +0000 UTC m=+0.160128182 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Dec 02 08:48:46 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:48:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:48:56 np0005541913.localdomain systemd[1]: tmp-crun.C11yIt.mount: Deactivated successfully.
Dec 02 08:48:56 np0005541913.localdomain podman[96228]: 2025-12-02 08:48:56.450001911 +0000 UTC m=+0.094168870 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12)
Dec 02 08:48:56 np0005541913.localdomain podman[96228]: 2025-12-02 08:48:56.643055996 +0000 UTC m=+0.287222895 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git)
Dec 02 08:48:56 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:49:05 np0005541913.localdomain podman[96258]: 2025-12-02 08:49:05.454303559 +0000 UTC m=+0.090934374 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=logrotate_crond, version=17.1.12)
Dec 02 08:49:05 np0005541913.localdomain podman[96260]: 2025-12-02 08:49:05.508639788 +0000 UTC m=+0.139872277 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Dec 02 08:49:05 np0005541913.localdomain podman[96259]: 2025-12-02 08:49:05.479025793 +0000 UTC m=+0.111947448 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:49:05 np0005541913.localdomain podman[96258]: 2025-12-02 08:49:05.539058736 +0000 UTC m=+0.175689581 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-type=git)
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:49:05 np0005541913.localdomain podman[96260]: 2025-12-02 08:49:05.560884262 +0000 UTC m=+0.192116741 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, version=17.1.12)
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:49:05 np0005541913.localdomain podman[96267]: 2025-12-02 08:49:05.605924582 +0000 UTC m=+0.227595945 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Dec 02 08:49:05 np0005541913.localdomain podman[96267]: 2025-12-02 08:49:05.634953021 +0000 UTC m=+0.256624344 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, container_name=ceilometer_agent_ipmi)
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:49:05 np0005541913.localdomain podman[96261]: 2025-12-02 08:49:05.7335728 +0000 UTC m=+0.360061542 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Dec 02 08:49:05 np0005541913.localdomain podman[96261]: 2025-12-02 08:49:05.761979563 +0000 UTC m=+0.388468235 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:49:05 np0005541913.localdomain podman[96259]: 2025-12-02 08:49:05.834131361 +0000 UTC m=+0.467053046 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1761123044)
Dec 02 08:49:05 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:49:06 np0005541913.localdomain systemd[1]: tmp-crun.zlmAA6.mount: Deactivated successfully.
Dec 02 08:49:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:49:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:49:08 np0005541913.localdomain podman[96376]: 2025-12-02 08:49:08.442441118 +0000 UTC m=+0.078583942 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12)
Dec 02 08:49:08 np0005541913.localdomain podman[96376]: 2025-12-02 08:49:08.487073187 +0000 UTC m=+0.123215941 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 08:49:08 np0005541913.localdomain podman[96376]: unhealthy
Dec 02 08:49:08 np0005541913.localdomain podman[96377]: 2025-12-02 08:49:08.498199395 +0000 UTC m=+0.138174962 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:49:08 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:49:08 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:49:08 np0005541913.localdomain podman[96377]: 2025-12-02 08:49:08.542195538 +0000 UTC m=+0.182171085 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1)
Dec 02 08:49:08 np0005541913.localdomain podman[96377]: unhealthy
Dec 02 08:49:08 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:49:08 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:49:16 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:49:16 np0005541913.localdomain recover_tripleo_nova_virtqemud[96418]: 62312
Dec 02 08:49:16 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:49:16 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:49:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:49:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:49:17 np0005541913.localdomain systemd[1]: tmp-crun.OMnJxl.mount: Deactivated successfully.
Dec 02 08:49:17 np0005541913.localdomain podman[96419]: 2025-12-02 08:49:17.460915397 +0000 UTC m=+0.101794666 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, container_name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:49:17 np0005541913.localdomain podman[96420]: 2025-12-02 08:49:17.499931944 +0000 UTC m=+0.138158741 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Dec 02 08:49:17 np0005541913.localdomain podman[96420]: 2025-12-02 08:49:17.53401907 +0000 UTC m=+0.172245827 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 
iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 02 08:49:17 np0005541913.localdomain podman[96419]: 2025-12-02 08:49:17.54186793 +0000 UTC m=+0.182747229 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Dec 02 08:49:17 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:49:17 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:49:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:49:27 np0005541913.localdomain systemd[1]: tmp-crun.WvKnvh.mount: Deactivated successfully.
Dec 02 08:49:27 np0005541913.localdomain podman[96461]: 2025-12-02 08:49:27.449977933 +0000 UTC m=+0.093153002 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:49:27 np0005541913.localdomain podman[96461]: 2025-12-02 08:49:27.639400771 +0000 UTC m=+0.282575860 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z)
Dec 02 08:49:27 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:49:36 np0005541913.localdomain podman[96488]: 2025-12-02 08:49:36.450912588 +0000 UTC m=+0.087408068 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public)
Dec 02 08:49:36 np0005541913.localdomain podman[96488]: 2025-12-02 08:49:36.485013555 +0000 UTC m=+0.121509025 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team)
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: tmp-crun.eh6ZTa.mount: Deactivated successfully.
Dec 02 08:49:36 np0005541913.localdomain podman[96490]: 2025-12-02 08:49:36.560890883 +0000 UTC m=+0.190998452 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044)
Dec 02 08:49:36 np0005541913.localdomain podman[96490]: 2025-12-02 08:49:36.59205551 +0000 UTC m=+0.222163049 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step5, url=https://www.redhat.com, container_name=nova_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:49:36 np0005541913.localdomain podman[96503]: 2025-12-02 08:49:36.662533522 +0000 UTC m=+0.282689903 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 02 08:49:36 np0005541913.localdomain podman[96489]: 2025-12-02 08:49:36.709076713 +0000 UTC m=+0.342924992 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:49:36 np0005541913.localdomain podman[96491]: 2025-12-02 08:49:36.716937633 +0000 UTC m=+0.344271038 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute)
Dec 02 08:49:36 np0005541913.localdomain podman[96503]: 2025-12-02 08:49:36.739698544 +0000 UTC m=+0.359854925 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, 
managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 02 08:49:36 np0005541913.localdomain podman[96491]: 2025-12-02 08:49:36.749225161 +0000 UTC m=+0.376558596 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, container_name=ceilometer_agent_compute, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute)
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:49:36 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:49:37 np0005541913.localdomain podman[96489]: 2025-12-02 08:49:37.115182009 +0000 UTC m=+0.749030298 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 02 08:49:37 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:49:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:49:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:49:39 np0005541913.localdomain podman[96611]: 2025-12-02 08:49:39.441650017 +0000 UTC m=+0.083726310 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1)
Dec 02 08:49:39 np0005541913.localdomain podman[96611]: 2025-12-02 08:49:39.48010242 +0000 UTC m=+0.122178723 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:49:39 np0005541913.localdomain podman[96611]: unhealthy
Dec 02 08:49:39 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:49:39 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:49:39 np0005541913.localdomain systemd[1]: tmp-crun.I7l1J6.mount: Deactivated successfully.
Dec 02 08:49:39 np0005541913.localdomain podman[96612]: 2025-12-02 08:49:39.500460606 +0000 UTC m=+0.139430026 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T23:34:05Z, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible)
Dec 02 08:49:39 np0005541913.localdomain podman[96612]: 2025-12-02 08:49:39.520988437 +0000 UTC m=+0.159957797 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible)
Dec 02 08:49:39 np0005541913.localdomain podman[96612]: unhealthy
Dec 02 08:49:39 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:49:39 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:49:41 np0005541913.localdomain sudo[96652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:49:41 np0005541913.localdomain sudo[96652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:49:41 np0005541913.localdomain sudo[96652]: pam_unix(sudo:session): session closed for user root
Dec 02 08:49:41 np0005541913.localdomain sudo[96667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:49:41 np0005541913.localdomain sudo[96667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:49:41 np0005541913.localdomain sudo[96667]: pam_unix(sudo:session): session closed for user root
Dec 02 08:49:42 np0005541913.localdomain sudo[96716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:49:42 np0005541913.localdomain sudo[96716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:49:42 np0005541913.localdomain sudo[96716]: pam_unix(sudo:session): session closed for user root
Dec 02 08:49:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:49:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:49:48 np0005541913.localdomain podman[96732]: 2025-12-02 08:49:48.448430381 +0000 UTC m=+0.082098626 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:49:48 np0005541913.localdomain podman[96732]: 2025-12-02 08:49:48.461983605 +0000 UTC m=+0.095651840 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-type=git, container_name=iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, 
description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:49:48 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:49:48 np0005541913.localdomain podman[96731]: 2025-12-02 08:49:48.54255793 +0000 UTC m=+0.179120723 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:49:48 np0005541913.localdomain podman[96731]: 2025-12-02 08:49:48.555041784 +0000 UTC m=+0.191604637 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:49:48 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:49:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:49:58 np0005541913.localdomain systemd[1]: tmp-crun.VfTIKe.mount: Deactivated successfully.
Dec 02 08:49:58 np0005541913.localdomain podman[96767]: 2025-12-02 08:49:58.437818868 +0000 UTC m=+0.076035834 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=metrics_qdr, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Dec 02 08:49:58 np0005541913.localdomain podman[96767]: 2025-12-02 08:49:58.658108225 +0000 UTC m=+0.296325211 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd)
Dec 02 08:49:58 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: tmp-crun.eRJhKh.mount: Deactivated successfully.
Dec 02 08:50:07 np0005541913.localdomain podman[96797]: 2025-12-02 08:50:07.430573004 +0000 UTC m=+0.069643862 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, tcib_managed=true, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:50:07 np0005541913.localdomain podman[96796]: 2025-12-02 08:50:07.433871233 +0000 UTC m=+0.077401440 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible)
Dec 02 08:50:07 np0005541913.localdomain podman[96798]: 2025-12-02 08:50:07.472556852 +0000 UTC m=+0.110142329 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12)
Dec 02 08:50:07 np0005541913.localdomain podman[96805]: 2025-12-02 08:50:07.458684989 +0000 UTC m=+0.087614204 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Dec 02 08:50:07 np0005541913.localdomain podman[96796]: 2025-12-02 08:50:07.514479108 +0000 UTC m=+0.158009335 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1)
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:50:07 np0005541913.localdomain podman[96805]: 2025-12-02 08:50:07.541924365 +0000 UTC m=+0.170853600 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:50:07 np0005541913.localdomain podman[96804]: 2025-12-02 08:50:07.519243646 +0000 UTC m=+0.154908702 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible)
Dec 02 08:50:07 np0005541913.localdomain podman[96798]: 2025-12-02 08:50:07.566145305 +0000 UTC m=+0.203730812 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:50:07 np0005541913.localdomain podman[96804]: 2025-12-02 08:50:07.604380652 +0000 UTC m=+0.240045738 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:50:07 np0005541913.localdomain podman[96797]: 2025-12-02 08:50:07.827118155 +0000 UTC m=+0.466189023 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:50:07 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:50:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:50:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:50:10 np0005541913.localdomain systemd[1]: tmp-crun.Bq4GEO.mount: Deactivated successfully.
Dec 02 08:50:10 np0005541913.localdomain podman[96914]: 2025-12-02 08:50:10.441868805 +0000 UTC m=+0.080662768 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 02 08:50:10 np0005541913.localdomain podman[96914]: 2025-12-02 08:50:10.45699127 +0000 UTC m=+0.095785243 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 08:50:10 np0005541913.localdomain podman[96914]: unhealthy
Dec 02 08:50:10 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:50:10 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:50:10 np0005541913.localdomain podman[96915]: 2025-12-02 08:50:10.538708846 +0000 UTC m=+0.175813073 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, vcs-type=git)
Dec 02 08:50:10 np0005541913.localdomain podman[96915]: 2025-12-02 08:50:10.557001037 +0000 UTC m=+0.194105284 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4)
Dec 02 08:50:10 np0005541913.localdomain podman[96915]: unhealthy
Dec 02 08:50:10 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:50:10 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:50:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:50:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:50:19 np0005541913.localdomain systemd[1]: tmp-crun.Jh65vu.mount: Deactivated successfully.
Dec 02 08:50:19 np0005541913.localdomain podman[96953]: 2025-12-02 08:50:19.444772084 +0000 UTC m=+0.082443916 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z)
Dec 02 08:50:19 np0005541913.localdomain podman[96953]: 2025-12-02 08:50:19.484199733 +0000 UTC m=+0.121871565 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Dec 02 08:50:19 np0005541913.localdomain systemd[1]: tmp-crun.XnZAfR.mount: Deactivated successfully.
Dec 02 08:50:19 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:50:19 np0005541913.localdomain podman[96954]: 2025-12-02 08:50:19.502112744 +0000 UTC m=+0.135097899 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, 
managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:50:19 np0005541913.localdomain podman[96954]: 2025-12-02 08:50:19.516046148 +0000 UTC m=+0.149031303 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:50:19 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:50:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:50:29 np0005541913.localdomain podman[96991]: 2025-12-02 08:50:29.444035468 +0000 UTC m=+0.081209632 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:50:29 np0005541913.localdomain podman[96991]: 2025-12-02 08:50:29.652038354 +0000 UTC m=+0.289212488 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:50:29 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:50:38 np0005541913.localdomain podman[97024]: 2025-12-02 08:50:38.465647711 +0000 UTC m=+0.097801628 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=nova_compute)
Dec 02 08:50:38 np0005541913.localdomain podman[97031]: 2025-12-02 08:50:38.510178666 +0000 UTC m=+0.136831835 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z)
Dec 02 08:50:38 np0005541913.localdomain podman[97024]: 2025-12-02 08:50:38.521310575 +0000 UTC m=+0.153464492 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: tmp-crun.3QYjyQ.mount: Deactivated successfully.
Dec 02 08:50:38 np0005541913.localdomain podman[97031]: 2025-12-02 08:50:38.56504352 +0000 UTC m=+0.191696719 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, 
distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 02 08:50:38 np0005541913.localdomain podman[97022]: 2025-12-02 08:50:38.567284171 +0000 UTC m=+0.204778401 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z)
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:50:38 np0005541913.localdomain podman[97022]: 2025-12-02 08:50:38.648158003 +0000 UTC m=+0.285652303 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=logrotate_crond, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack 
Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-cron-container, release=1761123044)
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:50:38 np0005541913.localdomain podman[97023]: 2025-12-02 08:50:38.661769099 +0000 UTC m=+0.297565544 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, container_name=nova_migration_target, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Dec 02 08:50:38 np0005541913.localdomain podman[97025]: 2025-12-02 08:50:38.716072207 +0000 UTC m=+0.343193349 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, 
com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:50:38 np0005541913.localdomain podman[97025]: 2025-12-02 08:50:38.745010264 +0000 UTC m=+0.372131436 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team)
Dec 02 08:50:38 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:50:39 np0005541913.localdomain podman[97023]: 2025-12-02 08:50:39.029539897 +0000 UTC m=+0.665336362 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:50:39 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:50:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:50:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:50:41 np0005541913.localdomain systemd[1]: tmp-crun.gU6oNv.mount: Deactivated successfully.
Dec 02 08:50:41 np0005541913.localdomain podman[97143]: 2025-12-02 08:50:41.447602953 +0000 UTC m=+0.090501611 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:50:41 np0005541913.localdomain podman[97143]: 2025-12-02 08:50:41.485677856 +0000 UTC m=+0.128576474 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ovn_controller)
Dec 02 08:50:41 np0005541913.localdomain podman[97142]: 2025-12-02 08:50:41.484759871 +0000 UTC m=+0.129544520 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Dec 02 08:50:41 np0005541913.localdomain podman[97143]: unhealthy
Dec 02 08:50:41 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:50:41 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:50:41 np0005541913.localdomain podman[97142]: 2025-12-02 08:50:41.571346047 +0000 UTC m=+0.216130716 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12)
Dec 02 08:50:41 np0005541913.localdomain podman[97142]: unhealthy
Dec 02 08:50:41 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:50:41 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:50:42 np0005541913.localdomain sudo[97182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:50:42 np0005541913.localdomain sudo[97182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:50:42 np0005541913.localdomain sudo[97182]: pam_unix(sudo:session): session closed for user root
Dec 02 08:50:42 np0005541913.localdomain sudo[97197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 08:50:42 np0005541913.localdomain sudo[97197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:50:43 np0005541913.localdomain systemd[1]: tmp-crun.qF8tpw.mount: Deactivated successfully.
Dec 02 08:50:43 np0005541913.localdomain podman[97282]: 2025-12-02 08:50:43.705139619 +0000 UTC m=+0.104926250 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1763362218, RELEASE=main, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4)
Dec 02 08:50:43 np0005541913.localdomain podman[97282]: 2025-12-02 08:50:43.835018017 +0000 UTC m=+0.234804688 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1763362218, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vcs-type=git, name=rhceph)
Dec 02 08:50:44 np0005541913.localdomain sudo[97197]: pam_unix(sudo:session): session closed for user root
Dec 02 08:50:44 np0005541913.localdomain sudo[97347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:50:44 np0005541913.localdomain sudo[97347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:50:44 np0005541913.localdomain sudo[97347]: pam_unix(sudo:session): session closed for user root
Dec 02 08:50:44 np0005541913.localdomain sudo[97362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:50:44 np0005541913.localdomain sudo[97362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:50:44 np0005541913.localdomain sudo[97362]: pam_unix(sudo:session): session closed for user root
Dec 02 08:50:45 np0005541913.localdomain sudo[97409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:50:45 np0005541913.localdomain sudo[97409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:50:45 np0005541913.localdomain sudo[97409]: pam_unix(sudo:session): session closed for user root
Dec 02 08:50:46 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:50:46 np0005541913.localdomain recover_tripleo_nova_virtqemud[97425]: 62312
Dec 02 08:50:46 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:50:46 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:50:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:50:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:50:50 np0005541913.localdomain systemd[1]: tmp-crun.lJE2qB.mount: Deactivated successfully.
Dec 02 08:50:50 np0005541913.localdomain podman[97427]: 2025-12-02 08:50:50.459098865 +0000 UTC m=+0.098230400 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com)
Dec 02 08:50:50 np0005541913.localdomain podman[97427]: 2025-12-02 08:50:50.469956596 +0000 UTC m=+0.109088101 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:50:50 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:50:50 np0005541913.localdomain podman[97426]: 2025-12-02 08:50:50.554522697 +0000 UTC m=+0.194087593 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, container_name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd)
Dec 02 08:50:50 np0005541913.localdomain podman[97426]: 2025-12-02 08:50:50.568598785 +0000 UTC m=+0.208163661 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc.)
Dec 02 08:50:50 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:51:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:51:00 np0005541913.localdomain podman[97464]: 2025-12-02 08:51:00.461272784 +0000 UTC m=+0.100128591 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, tcib_managed=true, config_id=tripleo_step1)
Dec 02 08:51:00 np0005541913.localdomain podman[97464]: 2025-12-02 08:51:00.660121505 +0000 UTC m=+0.298977332 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:51:00 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:51:09 np0005541913.localdomain podman[97494]: 2025-12-02 08:51:09.451187817 +0000 UTC m=+0.082953728 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:51:09 np0005541913.localdomain podman[97496]: 2025-12-02 08:51:09.499832244 +0000 UTC m=+0.127376573 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: tmp-crun.jRrpi3.mount: Deactivated successfully.
Dec 02 08:51:09 np0005541913.localdomain podman[97496]: 2025-12-02 08:51:09.554722798 +0000 UTC m=+0.182267137 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Dec 02 08:51:09 np0005541913.localdomain podman[97495]: 2025-12-02 08:51:09.554119282 +0000 UTC m=+0.182835232 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step5, container_name=nova_compute, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:51:09 np0005541913.localdomain podman[97502]: 2025-12-02 08:51:09.612015517 +0000 UTC m=+0.234518490 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi)
Dec 02 08:51:09 np0005541913.localdomain podman[97495]: 2025-12-02 08:51:09.641101308 +0000 UTC m=+0.269817288 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:51:09 np0005541913.localdomain podman[97493]: 2025-12-02 08:51:09.653634875 +0000 UTC m=+0.289457306 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 02 08:51:09 np0005541913.localdomain podman[97502]: 2025-12-02 08:51:09.665044412 +0000 UTC m=+0.287547395 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z)
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:51:09 np0005541913.localdomain podman[97493]: 2025-12-02 08:51:09.715431475 +0000 UTC m=+0.351253906 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat 
OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, architecture=x86_64)
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:51:09 np0005541913.localdomain podman[97494]: 2025-12-02 08:51:09.805083963 +0000 UTC m=+0.436849904 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible)
Dec 02 08:51:09 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:51:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:51:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:51:12 np0005541913.localdomain podman[97609]: 2025-12-02 08:51:12.446899201 +0000 UTC m=+0.085993471 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, build-date=2025-11-19T00:14:25Z, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 02 08:51:12 np0005541913.localdomain systemd[1]: tmp-crun.evKj3I.mount: Deactivated successfully.
Dec 02 08:51:12 np0005541913.localdomain podman[97610]: 2025-12-02 08:51:12.493631345 +0000 UTC m=+0.130637689 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:51:12 np0005541913.localdomain podman[97609]: 2025-12-02 08:51:12.514552728 +0000 UTC m=+0.153646998 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:51:12 np0005541913.localdomain podman[97609]: unhealthy
Dec 02 08:51:12 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:51:12 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:51:12 np0005541913.localdomain podman[97610]: 2025-12-02 08:51:12.538005967 +0000 UTC m=+0.175012341 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team)
Dec 02 08:51:12 np0005541913.localdomain podman[97610]: unhealthy
Dec 02 08:51:12 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:51:12 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:51:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:51:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:51:21 np0005541913.localdomain systemd[1]: tmp-crun.QmyPiz.mount: Deactivated successfully.
Dec 02 08:51:21 np0005541913.localdomain podman[97650]: 2025-12-02 08:51:21.445081114 +0000 UTC m=+0.084493820 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 02 08:51:21 np0005541913.localdomain podman[97650]: 2025-12-02 08:51:21.48891092 +0000 UTC m=+0.128323606 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3)
Dec 02 08:51:21 np0005541913.localdomain systemd[1]: tmp-crun.vDrH7u.mount: Deactivated successfully.
Dec 02 08:51:21 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:51:21 np0005541913.localdomain podman[97649]: 2025-12-02 08:51:21.495455357 +0000 UTC m=+0.135947463 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, container_name=collectd, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true)
Dec 02 08:51:21 np0005541913.localdomain podman[97649]: 2025-12-02 08:51:21.576500194 +0000 UTC m=+0.216992330 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_id=tripleo_step3, architecture=x86_64, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:51:21 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:51:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:51:31 np0005541913.localdomain systemd[1]: tmp-crun.QnmAJd.mount: Deactivated successfully.
Dec 02 08:51:31 np0005541913.localdomain podman[97689]: 2025-12-02 08:51:31.443440634 +0000 UTC m=+0.088944821 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=metrics_qdr, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:51:31 np0005541913.localdomain podman[97689]: 2025-12-02 08:51:31.639087929 +0000 UTC m=+0.284592166 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:51:31 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: tmp-crun.V3MtIl.mount: Deactivated successfully.
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: tmp-crun.gqWpOW.mount: Deactivated successfully.
Dec 02 08:51:40 np0005541913.localdomain podman[97725]: 2025-12-02 08:51:40.485878244 +0000 UTC m=+0.109746530 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:51:40 np0005541913.localdomain podman[97718]: 2025-12-02 08:51:40.496625552 +0000 UTC m=+0.130708382 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible)
Dec 02 08:51:40 np0005541913.localdomain podman[97720]: 2025-12-02 08:51:40.445993382 +0000 UTC m=+0.077579024 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true)
Dec 02 08:51:40 np0005541913.localdomain podman[97725]: 2025-12-02 08:51:40.536821742 +0000 UTC m=+0.160690038 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, version=17.1.12, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:51:40 np0005541913.localdomain podman[97717]: 2025-12-02 08:51:40.5520553 +0000 UTC m=+0.189712795 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com)
Dec 02 08:51:40 np0005541913.localdomain podman[97719]: 2025-12-02 08:51:40.60974388 +0000 UTC m=+0.240103820 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step5, io.buildah.version=1.41.4, container_name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:51:40 np0005541913.localdomain podman[97720]: 2025-12-02 08:51:40.62763138 +0000 UTC m=+0.259216992 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com)
Dec 02 08:51:40 np0005541913.localdomain podman[97717]: 2025-12-02 08:51:40.63468952 +0000 UTC m=+0.272347035 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, name=rhosp17/openstack-cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:51:40 np0005541913.localdomain podman[97719]: 2025-12-02 08:51:40.63951546 +0000 UTC m=+0.269875400 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:51:40 np0005541913.localdomain podman[97718]: 2025-12-02 08:51:40.841445304 +0000 UTC m=+0.475528154 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, release=1761123044, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:51:40 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:51:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:51:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:51:43 np0005541913.localdomain podman[97839]: 2025-12-02 08:51:43.44377124 +0000 UTC m=+0.083278539 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:51:43 np0005541913.localdomain podman[97839]: 2025-12-02 08:51:43.487079902 +0000 UTC m=+0.126587191 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:51:43 np0005541913.localdomain systemd[1]: tmp-crun.AQ10dS.mount: Deactivated successfully.
Dec 02 08:51:43 np0005541913.localdomain podman[97839]: unhealthy
Dec 02 08:51:43 np0005541913.localdomain podman[97840]: 2025-12-02 08:51:43.497545214 +0000 UTC m=+0.130907838 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com)
Dec 02 08:51:43 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:51:43 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:51:43 np0005541913.localdomain podman[97840]: 2025-12-02 08:51:43.542272295 +0000 UTC m=+0.175634909 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Dec 02 08:51:43 np0005541913.localdomain podman[97840]: unhealthy
Dec 02 08:51:43 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:51:43 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:51:45 np0005541913.localdomain sudo[97880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:51:45 np0005541913.localdomain sudo[97880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:51:45 np0005541913.localdomain sudo[97880]: pam_unix(sudo:session): session closed for user root
Dec 02 08:51:45 np0005541913.localdomain sudo[97895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:51:45 np0005541913.localdomain sudo[97895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:51:46 np0005541913.localdomain sudo[97895]: pam_unix(sudo:session): session closed for user root
Dec 02 08:51:46 np0005541913.localdomain sudo[97942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:51:46 np0005541913.localdomain sudo[97942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:51:46 np0005541913.localdomain sudo[97942]: pam_unix(sudo:session): session closed for user root
Dec 02 08:51:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:51:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:51:52 np0005541913.localdomain podman[97957]: 2025-12-02 08:51:52.481538803 +0000 UTC m=+0.118304869 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:51:52 np0005541913.localdomain podman[97957]: 2025-12-02 08:51:52.498075798 +0000 UTC m=+0.134841864 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-collectd, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:51:52 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:51:52 np0005541913.localdomain podman[97958]: 2025-12-02 08:51:52.546824757 +0000 UTC m=+0.183522460 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Dec 02 08:51:52 np0005541913.localdomain podman[97958]: 2025-12-02 08:51:52.563104864 +0000 UTC m=+0.199802567 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 02 08:51:52 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:52:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:52:02 np0005541913.localdomain systemd[1]: tmp-crun.01JRCM.mount: Deactivated successfully.
Dec 02 08:52:02 np0005541913.localdomain podman[97996]: 2025-12-02 08:52:02.46683891 +0000 UTC m=+0.099697858 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:52:02 np0005541913.localdomain podman[97996]: 2025-12-02 08:52:02.700822205 +0000 UTC m=+0.333681113 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, io.openshift.expose-services=)
Dec 02 08:52:02 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:52:11 np0005541913.localdomain recover_tripleo_nova_virtqemud[98056]: 62312
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: tmp-crun.z6j0V7.mount: Deactivated successfully.
Dec 02 08:52:11 np0005541913.localdomain podman[98026]: 2025-12-02 08:52:11.45371258 +0000 UTC m=+0.085740853 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: tmp-crun.7LYMog.mount: Deactivated successfully.
Dec 02 08:52:11 np0005541913.localdomain podman[98024]: 2025-12-02 08:52:11.469549076 +0000 UTC m=+0.105855874 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, container_name=logrotate_crond, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:52:11 np0005541913.localdomain podman[98026]: 2025-12-02 08:52:11.481014684 +0000 UTC m=+0.113042987 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:52:11 np0005541913.localdomain podman[98024]: 2025-12-02 08:52:11.506162389 +0000 UTC m=+0.142469237 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:52:11 np0005541913.localdomain podman[98030]: 2025-12-02 08:52:11.518251924 +0000 UTC m=+0.144716937 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4)
Dec 02 08:52:11 np0005541913.localdomain podman[98038]: 2025-12-02 08:52:11.570751194 +0000 UTC m=+0.193985521 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public)
Dec 02 08:52:11 np0005541913.localdomain podman[98030]: 2025-12-02 08:52:11.57691427 +0000 UTC m=+0.203379293 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute)
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:52:11 np0005541913.localdomain podman[98025]: 2025-12-02 08:52:11.623267925 +0000 UTC m=+0.257477557 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, url=https://www.redhat.com, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4)
Dec 02 08:52:11 np0005541913.localdomain podman[98038]: 2025-12-02 08:52:11.627973721 +0000 UTC m=+0.251208018 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 02 08:52:11 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:52:12 np0005541913.localdomain podman[98025]: 2025-12-02 08:52:12.036327269 +0000 UTC m=+0.670536921 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z)
Dec 02 08:52:12 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:52:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:52:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:52:14 np0005541913.localdomain systemd[1]: tmp-crun.sT02EG.mount: Deactivated successfully.
Dec 02 08:52:14 np0005541913.localdomain podman[98143]: 2025-12-02 08:52:14.470992771 +0000 UTC m=+0.109038979 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git)
Dec 02 08:52:14 np0005541913.localdomain systemd[1]: tmp-crun.cEBywn.mount: Deactivated successfully.
Dec 02 08:52:14 np0005541913.localdomain podman[98144]: 2025-12-02 08:52:14.520746858 +0000 UTC m=+0.156772982 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, batch=17.1_20251118.1)
Dec 02 08:52:14 np0005541913.localdomain podman[98143]: 2025-12-02 08:52:14.537350944 +0000 UTC m=+0.175397162 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team)
Dec 02 08:52:14 np0005541913.localdomain podman[98143]: unhealthy
Dec 02 08:52:14 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:52:14 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:52:14 np0005541913.localdomain podman[98144]: 2025-12-02 08:52:14.565183401 +0000 UTC m=+0.201209526 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-type=git, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:52:14 np0005541913.localdomain podman[98144]: unhealthy
Dec 02 08:52:14 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:52:14 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:52:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:52:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:52:23 np0005541913.localdomain podman[98183]: 2025-12-02 08:52:23.440576245 +0000 UTC m=+0.080722369 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:52:23 np0005541913.localdomain podman[98183]: 2025-12-02 08:52:23.448364285 +0000 UTC m=+0.088510469 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 02 08:52:23 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:52:23 np0005541913.localdomain podman[98184]: 2025-12-02 08:52:23.492338946 +0000 UTC m=+0.129237653 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:52:23 np0005541913.localdomain podman[98184]: 2025-12-02 08:52:23.502792746 +0000 UTC m=+0.139691393 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:52:23 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:52:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:52:33 np0005541913.localdomain podman[98222]: 2025-12-02 08:52:33.454439438 +0000 UTC m=+0.094274073 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 02 08:52:33 np0005541913.localdomain podman[98222]: 2025-12-02 08:52:33.643043204 +0000 UTC m=+0.282877849 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Dec 02 08:52:33 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:52:42 np0005541913.localdomain podman[98253]: 2025-12-02 08:52:42.469323669 +0000 UTC m=+0.095093725 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step5, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: tmp-crun.AIX1Yo.mount: Deactivated successfully.
Dec 02 08:52:42 np0005541913.localdomain podman[98263]: 2025-12-02 08:52:42.533234915 +0000 UTC m=+0.152066435 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:52:42 np0005541913.localdomain podman[98257]: 2025-12-02 08:52:42.57177031 +0000 UTC m=+0.194387971 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Dec 02 08:52:42 np0005541913.localdomain podman[98253]: 2025-12-02 08:52:42.60375455 +0000 UTC m=+0.229524626 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:52:42 np0005541913.localdomain podman[98252]: 2025-12-02 08:52:42.622788781 +0000 UTC m=+0.252075102 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:52:42 np0005541913.localdomain podman[98251]: 2025-12-02 08:52:42.674440728 +0000 UTC m=+0.306776471 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:32Z)
Dec 02 08:52:42 np0005541913.localdomain podman[98263]: 2025-12-02 08:52:42.691732263 +0000 UTC m=+0.310563813 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:52:42 np0005541913.localdomain podman[98257]: 2025-12-02 08:52:42.701771633 +0000 UTC m=+0.324389204 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64)
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:52:42 np0005541913.localdomain podman[98251]: 2025-12-02 08:52:42.70986075 +0000 UTC m=+0.342196463 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:52:42 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:52:42 np0005541913.localdomain podman[98252]: 2025-12-02 08:52:42.991812052 +0000 UTC m=+0.621098353 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, architecture=x86_64, 
version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 02 08:52:43 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:52:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:52:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:52:45 np0005541913.localdomain podman[98371]: 2025-12-02 08:52:45.446923244 +0000 UTC m=+0.080954426 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:52:45 np0005541913.localdomain systemd[1]: tmp-crun.1UDsWa.mount: Deactivated successfully.
Dec 02 08:52:45 np0005541913.localdomain podman[98371]: 2025-12-02 08:52:45.461840794 +0000 UTC m=+0.095871986 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:52:45 np0005541913.localdomain podman[98371]: unhealthy
Dec 02 08:52:45 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:52:45 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:52:45 np0005541913.localdomain systemd[1]: tmp-crun.3gsea6.mount: Deactivated successfully.
Dec 02 08:52:45 np0005541913.localdomain podman[98370]: 2025-12-02 08:52:45.548730758 +0000 UTC m=+0.186801179 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 08:52:45 np0005541913.localdomain podman[98370]: 2025-12-02 08:52:45.591080515 +0000 UTC m=+0.229150896 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:52:45 np0005541913.localdomain podman[98370]: unhealthy
Dec 02 08:52:45 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:52:45 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:52:47 np0005541913.localdomain sudo[98411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:52:47 np0005541913.localdomain sudo[98411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:52:47 np0005541913.localdomain sudo[98411]: pam_unix(sudo:session): session closed for user root
Dec 02 08:52:47 np0005541913.localdomain sudo[98426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:52:47 np0005541913.localdomain sudo[98426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:52:47 np0005541913.localdomain sudo[98426]: pam_unix(sudo:session): session closed for user root
Dec 02 08:52:48 np0005541913.localdomain sudo[98473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:52:48 np0005541913.localdomain sudo[98473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:52:48 np0005541913.localdomain sudo[98473]: pam_unix(sudo:session): session closed for user root
Dec 02 08:52:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:52:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:52:54 np0005541913.localdomain podman[98489]: 2025-12-02 08:52:54.442270739 +0000 UTC m=+0.081767178 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044)
Dec 02 08:52:54 np0005541913.localdomain podman[98489]: 2025-12-02 08:52:54.453943233 +0000 UTC m=+0.093439672 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, build-date=2025-11-18T23:44:13Z)
Dec 02 08:52:54 np0005541913.localdomain podman[98488]: 2025-12-02 08:52:54.489431655 +0000 UTC m=+0.128391999 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 02 08:52:54 np0005541913.localdomain podman[98488]: 2025-12-02 08:52:54.500289977 +0000 UTC m=+0.139250311 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, url=https://www.redhat.com)
Dec 02 08:52:54 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:52:54 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:53:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:53:04 np0005541913.localdomain podman[98528]: 2025-12-02 08:53:04.435978262 +0000 UTC m=+0.080260007 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64)
Dec 02 08:53:04 np0005541913.localdomain podman[98528]: 2025-12-02 08:53:04.610212491 +0000 UTC m=+0.254494156 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:53:04 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:53:13 np0005541913.localdomain podman[98556]: 2025-12-02 08:53:13.45546477 +0000 UTC m=+0.095717222 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, release=1761123044, container_name=logrotate_crond, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: tmp-crun.7fdXtN.mount: Deactivated successfully.
Dec 02 08:53:13 np0005541913.localdomain podman[98558]: 2025-12-02 08:53:13.51914586 +0000 UTC m=+0.153385850 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 02 08:53:13 np0005541913.localdomain podman[98558]: 2025-12-02 08:53:13.548063897 +0000 UTC m=+0.182303927 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64)
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:53:13 np0005541913.localdomain podman[98559]: 2025-12-02 08:53:13.569848473 +0000 UTC m=+0.199970103 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, 
url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, vcs-type=git)
Dec 02 08:53:13 np0005541913.localdomain podman[98559]: 2025-12-02 08:53:13.62229213 +0000 UTC m=+0.252413770 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:11:48Z, architecture=x86_64, distribution-scope=public, release=1761123044)
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:53:13 np0005541913.localdomain podman[98557]: 2025-12-02 08:53:13.623668958 +0000 UTC m=+0.260971600 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-type=git)
Dec 02 08:53:13 np0005541913.localdomain podman[98556]: 2025-12-02 08:53:13.696205986 +0000 UTC m=+0.336458408 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=logrotate_crond, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:53:13 np0005541913.localdomain podman[98569]: 2025-12-02 08:53:13.676356923 +0000 UTC m=+0.300501023 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64)
Dec 02 08:53:13 np0005541913.localdomain podman[98569]: 2025-12-02 08:53:13.757492171 +0000 UTC m=+0.381636301 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:53:13 np0005541913.localdomain podman[98557]: 2025-12-02 08:53:13.975058705 +0000 UTC m=+0.612361387 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.openshift.expose-services=)
Dec 02 08:53:13 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:53:14 np0005541913.localdomain systemd[1]: tmp-crun.5fV9ZS.mount: Deactivated successfully.
Dec 02 08:53:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:53:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:53:16 np0005541913.localdomain systemd[1]: tmp-crun.PVWsrU.mount: Deactivated successfully.
Dec 02 08:53:16 np0005541913.localdomain podman[98678]: 2025-12-02 08:53:16.451789559 +0000 UTC m=+0.093956734 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, tcib_managed=true, url=https://www.redhat.com)
Dec 02 08:53:16 np0005541913.localdomain podman[98678]: 2025-12-02 08:53:16.491123885 +0000 UTC m=+0.133291040 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, 
build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Dec 02 08:53:16 np0005541913.localdomain podman[98678]: unhealthy
Dec 02 08:53:16 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:53:16 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:53:16 np0005541913.localdomain podman[98677]: 2025-12-02 08:53:16.539085153 +0000 UTC m=+0.183423197 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 02 08:53:16 np0005541913.localdomain podman[98677]: 2025-12-02 08:53:16.560087708 +0000 UTC m=+0.204425832 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, vcs-type=git)
Dec 02 08:53:16 np0005541913.localdomain podman[98677]: unhealthy
Dec 02 08:53:16 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:53:16 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:53:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:53:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:53:25 np0005541913.localdomain systemd[1]: tmp-crun.WZMH7P.mount: Deactivated successfully.
Dec 02 08:53:25 np0005541913.localdomain podman[98717]: 2025-12-02 08:53:25.480718548 +0000 UTC m=+0.120170379 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, 
managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044)
Dec 02 08:53:25 np0005541913.localdomain systemd[1]: tmp-crun.4y1e3G.mount: Deactivated successfully.
Dec 02 08:53:25 np0005541913.localdomain podman[98718]: 2025-12-02 08:53:25.503691945 +0000 UTC m=+0.141027519 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 02 08:53:25 np0005541913.localdomain podman[98717]: 2025-12-02 08:53:25.525350757 +0000 UTC m=+0.164802638 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:53:25 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:53:25 np0005541913.localdomain podman[98718]: 2025-12-02 08:53:25.539276121 +0000 UTC m=+0.176611695 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.expose-services=, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:53:25 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:53:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:53:35 np0005541913.localdomain podman[98757]: 2025-12-02 08:53:35.430793873 +0000 UTC m=+0.074124470 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:53:35 np0005541913.localdomain podman[98757]: 2025-12-02 08:53:35.618795934 +0000 UTC m=+0.262126511 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 
17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1)
Dec 02 08:53:35 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: tmp-crun.OtnRrG.mount: Deactivated successfully.
Dec 02 08:53:44 np0005541913.localdomain podman[98803]: 2025-12-02 08:53:44.489487934 +0000 UTC m=+0.111598882 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_agent_ipmi, vcs-type=git, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:53:44 np0005541913.localdomain podman[98788]: 2025-12-02 08:53:44.457851619 +0000 UTC m=+0.096086888 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.buildah.version=1.41.4, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 02 08:53:44 np0005541913.localdomain podman[98790]: 2025-12-02 08:53:44.521691054 +0000 UTC m=+0.154399655 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:53:44 np0005541913.localdomain podman[98789]: 2025-12-02 08:53:44.563502171 +0000 UTC m=+0.195252936 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, 
container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true)
Dec 02 08:53:44 np0005541913.localdomain podman[98790]: 2025-12-02 08:53:44.573037036 +0000 UTC m=+0.205745627 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:53:44 np0005541913.localdomain podman[98788]: 2025-12-02 08:53:44.595600209 +0000 UTC m=+0.233835488 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=)
Dec 02 08:53:44 np0005541913.localdomain podman[98803]: 2025-12-02 08:53:44.595836295 +0000 UTC m=+0.217947273 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:53:44 np0005541913.localdomain podman[98796]: 2025-12-02 08:53:44.724323037 +0000 UTC m=+0.354750737 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:53:44 np0005541913.localdomain podman[98796]: 2025-12-02 08:53:44.747837935 +0000 UTC m=+0.378265645 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:11:48Z, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:53:44 np0005541913.localdomain podman[98789]: 2025-12-02 08:53:44.944070486 +0000 UTC m=+0.575821291 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:53:44 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:53:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:53:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:53:47 np0005541913.localdomain systemd[1]: tmp-crun.j0tVgQ.mount: Deactivated successfully.
Dec 02 08:53:47 np0005541913.localdomain podman[98905]: 2025-12-02 08:53:47.456507254 +0000 UTC m=+0.093232591 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com)
Dec 02 08:53:47 np0005541913.localdomain podman[98905]: 2025-12-02 08:53:47.505181334 +0000 UTC m=+0.141906691 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.buildah.version=1.41.4)
Dec 02 08:53:47 np0005541913.localdomain podman[98905]: unhealthy
Dec 02 08:53:47 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:53:47 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:53:47 np0005541913.localdomain podman[98906]: 2025-12-02 08:53:47.505904344 +0000 UTC m=+0.139659211 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, vcs-type=git)
Dec 02 08:53:47 np0005541913.localdomain podman[98906]: 2025-12-02 08:53:47.586101456 +0000 UTC m=+0.219856333 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 
17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Dec 02 08:53:47 np0005541913.localdomain podman[98906]: unhealthy
Dec 02 08:53:47 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:53:47 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:53:48 np0005541913.localdomain sudo[98945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:53:48 np0005541913.localdomain sudo[98945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:53:48 np0005541913.localdomain sudo[98945]: pam_unix(sudo:session): session closed for user root
Dec 02 08:53:48 np0005541913.localdomain sudo[98960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:53:48 np0005541913.localdomain sudo[98960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:53:49 np0005541913.localdomain sudo[98960]: pam_unix(sudo:session): session closed for user root
Dec 02 08:53:50 np0005541913.localdomain sudo[99006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:53:50 np0005541913.localdomain sudo[99006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:53:50 np0005541913.localdomain sudo[99006]: pam_unix(sudo:session): session closed for user root
Dec 02 08:53:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:53:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:53:56 np0005541913.localdomain systemd[1]: tmp-crun.pOgWau.mount: Deactivated successfully.
Dec 02 08:53:56 np0005541913.localdomain podman[99021]: 2025-12-02 08:53:56.456892678 +0000 UTC m=+0.098555373 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12)
Dec 02 08:53:56 np0005541913.localdomain systemd[1]: tmp-crun.pGHy9o.mount: Deactivated successfully.
Dec 02 08:53:56 np0005541913.localdomain podman[99022]: 2025-12-02 08:53:56.503314198 +0000 UTC m=+0.142992521 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Dec 02 08:53:56 np0005541913.localdomain podman[99022]: 2025-12-02 08:53:56.511407944 +0000 UTC m=+0.151086297 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 08:53:56 np0005541913.localdomain podman[99021]: 2025-12-02 08:53:56.523089836 +0000 UTC m=+0.164752461 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64)
Dec 02 08:53:56 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:53:56 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:54:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:54:06 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:54:06 np0005541913.localdomain recover_tripleo_nova_virtqemud[99062]: 62312
Dec 02 08:54:06 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:54:06 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:54:06 np0005541913.localdomain podman[99060]: 2025-12-02 08:54:06.434847363 +0000 UTC m=+0.068783688 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 02 08:54:06 np0005541913.localdomain podman[99060]: 2025-12-02 08:54:06.611986975 +0000 UTC m=+0.245923310 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4)
Dec 02 08:54:06 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:54:15 np0005541913.localdomain podman[99091]: 2025-12-02 08:54:15.470972393 +0000 UTC m=+0.102735296 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=logrotate_crond, name=rhosp17/openstack-cron, distribution-scope=public, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:54:15 np0005541913.localdomain podman[99091]: 2025-12-02 08:54:15.507048576 +0000 UTC m=+0.138811429 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: tmp-crun.Z7q7DU.mount: Deactivated successfully.
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:54:15 np0005541913.localdomain podman[99093]: 2025-12-02 08:54:15.533125882 +0000 UTC m=+0.155213386 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:54:15 np0005541913.localdomain podman[99092]: 2025-12-02 08:54:15.574783805 +0000 UTC m=+0.203282360 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vendor=Red Hat, Inc.)
Dec 02 08:54:15 np0005541913.localdomain podman[99097]: 2025-12-02 08:54:15.634687356 +0000 UTC m=+0.254783736 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Dec 02 08:54:15 np0005541913.localdomain podman[99105]: 2025-12-02 08:54:15.676807141 +0000 UTC m=+0.292855394 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Dec 02 08:54:15 np0005541913.localdomain podman[99093]: 2025-12-02 08:54:15.694480443 +0000 UTC m=+0.316567917 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:54:15 np0005541913.localdomain podman[99097]: 2025-12-02 08:54:15.715002721 +0000 UTC m=+0.335099141 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:54:15 np0005541913.localdomain podman[99105]: 2025-12-02 08:54:15.737203354 +0000 UTC m=+0.353251607 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:54:15 np0005541913.localdomain podman[99092]: 2025-12-02 08:54:15.942046695 +0000 UTC m=+0.570545310 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:54:15 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:54:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:54:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:54:18 np0005541913.localdomain podman[99206]: 2025-12-02 08:54:18.429734102 +0000 UTC m=+0.073379591 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=)
Dec 02 08:54:18 np0005541913.localdomain systemd[1]: tmp-crun.eSfgLc.mount: Deactivated successfully.
Dec 02 08:54:18 np0005541913.localdomain podman[99206]: 2025-12-02 08:54:18.439558795 +0000 UTC m=+0.083204274 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible)
Dec 02 08:54:18 np0005541913.localdomain podman[99206]: unhealthy
Dec 02 08:54:18 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:54:18 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:54:18 np0005541913.localdomain podman[99205]: 2025-12-02 08:54:18.484388512 +0000 UTC m=+0.127601640 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, release=1761123044, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12)
Dec 02 08:54:18 np0005541913.localdomain podman[99205]: 2025-12-02 08:54:18.498183221 +0000 UTC m=+0.141396309 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:54:18 np0005541913.localdomain podman[99205]: unhealthy
Dec 02 08:54:18 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:54:18 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:54:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:54:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:54:27 np0005541913.localdomain systemd[1]: tmp-crun.Mzkq8r.mount: Deactivated successfully.
Dec 02 08:54:27 np0005541913.localdomain podman[99246]: 2025-12-02 08:54:27.446635867 +0000 UTC m=+0.091900776 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, distribution-scope=public, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:54:27 np0005541913.localdomain podman[99247]: 2025-12-02 08:54:27.484711794 +0000 UTC m=+0.126724977 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-type=git)
Dec 02 08:54:27 np0005541913.localdomain podman[99247]: 2025-12-02 08:54:27.491797043 +0000 UTC m=+0.133810256 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 02 08:54:27 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:54:27 np0005541913.localdomain podman[99246]: 2025-12-02 08:54:27.50929521 +0000 UTC m=+0.154560119 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, container_name=collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 02 08:54:27 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:54:28 np0005541913.localdomain systemd[1]: tmp-crun.UaZNKW.mount: Deactivated successfully.
Dec 02 08:54:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:54:37 np0005541913.localdomain systemd[1]: tmp-crun.fk8HHB.mount: Deactivated successfully.
Dec 02 08:54:37 np0005541913.localdomain podman[99284]: 2025-12-02 08:54:37.438126334 +0000 UTC m=+0.082782783 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=metrics_qdr)
Dec 02 08:54:37 np0005541913.localdomain podman[99284]: 2025-12-02 08:54:37.652086979 +0000 UTC m=+0.296743478 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 08:54:37 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: tmp-crun.faWkSd.mount: Deactivated successfully.
Dec 02 08:54:46 np0005541913.localdomain podman[99315]: 2025-12-02 08:54:46.467597754 +0000 UTC m=+0.100361431 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:54:46 np0005541913.localdomain podman[99315]: 2025-12-02 08:54:46.548541567 +0000 UTC m=+0.181305244 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step5, architecture=x86_64, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 08:54:46 np0005541913.localdomain podman[99314]: 2025-12-02 08:54:46.558681717 +0000 UTC m=+0.194485736 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, 
batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:54:46 np0005541913.localdomain podman[99313]: 2025-12-02 08:54:46.608119018 +0000 UTC m=+0.247366248 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, version=17.1.12, container_name=logrotate_crond, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, 
io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:54:46 np0005541913.localdomain podman[99313]: 2025-12-02 08:54:46.621015242 +0000 UTC m=+0.260262442 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 02 08:54:46 np0005541913.localdomain podman[99319]: 2025-12-02 08:54:46.662826019 +0000 UTC m=+0.292196946 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:54:46 np0005541913.localdomain podman[99319]: 2025-12-02 08:54:46.692099871 +0000 UTC m=+0.321470798 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, vcs-type=git, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:54:46 np0005541913.localdomain podman[99322]: 2025-12-02 08:54:46.531103571 +0000 UTC m=+0.156178333 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:54:46 np0005541913.localdomain podman[99322]: 2025-12-02 08:54:46.762807779 +0000 UTC m=+0.387882581 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:54:46 np0005541913.localdomain podman[99314]: 2025-12-02 08:54:46.948317704 +0000 UTC m=+0.584121693 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1)
Dec 02 08:54:46 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:54:47 np0005541913.localdomain systemd[1]: tmp-crun.9vDpsa.mount: Deactivated successfully.
Dec 02 08:54:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:54:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:54:49 np0005541913.localdomain podman[99431]: 2025-12-02 08:54:49.443079721 +0000 UTC m=+0.084326414 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:54:49 np0005541913.localdomain podman[99431]: 2025-12-02 08:54:49.487066046 +0000 UTC m=+0.128312699 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 08:54:49 np0005541913.localdomain podman[99431]: unhealthy
Dec 02 08:54:49 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:54:49 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:54:49 np0005541913.localdomain podman[99432]: 2025-12-02 08:54:49.502091367 +0000 UTC m=+0.139402035 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:34:05Z, release=1761123044, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Dec 02 08:54:49 np0005541913.localdomain podman[99432]: 2025-12-02 08:54:49.520221382 +0000 UTC m=+0.157532020 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, tcib_managed=true)
Dec 02 08:54:49 np0005541913.localdomain podman[99432]: unhealthy
Dec 02 08:54:49 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:54:49 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:54:50 np0005541913.localdomain sudo[99470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:54:50 np0005541913.localdomain sudo[99470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:54:50 np0005541913.localdomain sudo[99470]: pam_unix(sudo:session): session closed for user root
Dec 02 08:54:50 np0005541913.localdomain sudo[99485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:54:50 np0005541913.localdomain sudo[99485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:54:51 np0005541913.localdomain sudo[99485]: pam_unix(sudo:session): session closed for user root
Dec 02 08:54:51 np0005541913.localdomain sudo[99532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:54:51 np0005541913.localdomain sudo[99532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:54:51 np0005541913.localdomain sudo[99532]: pam_unix(sudo:session): session closed for user root
Dec 02 08:54:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:54:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:54:58 np0005541913.localdomain podman[99548]: 2025-12-02 08:54:58.470924108 +0000 UTC m=+0.095452621 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:54:58 np0005541913.localdomain systemd[1]: tmp-crun.lsRpBX.mount: Deactivated successfully.
Dec 02 08:54:58 np0005541913.localdomain podman[99547]: 2025-12-02 08:54:58.526578714 +0000 UTC m=+0.150744527 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, version=17.1.12)
Dec 02 08:54:58 np0005541913.localdomain podman[99548]: 2025-12-02 08:54:58.534019882 +0000 UTC m=+0.158548415 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=)
Dec 02 08:54:58 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:54:58 np0005541913.localdomain podman[99547]: 2025-12-02 08:54:58.565037321 +0000 UTC m=+0.189203104 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 02 08:54:58 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:55:05 np0005541913.localdomain sshd[35859]: Received disconnect from 192.168.122.100 port 57858:11: disconnected by user
Dec 02 08:55:05 np0005541913.localdomain sshd[35859]: Disconnected from user tripleo-admin 192.168.122.100 port 57858
Dec 02 08:55:05 np0005541913.localdomain sshd[35839]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 02 08:55:05 np0005541913.localdomain systemd[1]: session-29.scope: Deactivated successfully.
Dec 02 08:55:05 np0005541913.localdomain systemd[1]: session-29.scope: Consumed 7min 13.621s CPU time.
Dec 02 08:55:05 np0005541913.localdomain systemd-logind[757]: Session 29 logged out. Waiting for processes to exit.
Dec 02 08:55:05 np0005541913.localdomain systemd-logind[757]: Removed session 29.
Dec 02 08:55:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:55:08 np0005541913.localdomain podman[99588]: 2025-12-02 08:55:08.43981056 +0000 UTC m=+0.083470371 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible)
Dec 02 08:55:08 np0005541913.localdomain podman[99588]: 2025-12-02 08:55:08.631790228 +0000 UTC m=+0.275450029 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=metrics_qdr, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true)
Dec 02 08:55:08 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Activating special unit Exit the Session...
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Removed slice User Background Tasks Slice.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Stopped target Main User Target.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Stopped target Basic System.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Stopped target Paths.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Stopped target Sockets.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Stopped target Timers.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Closed D-Bus User Message Bus Socket.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Stopped Create User's Volatile Files and Directories.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Removed slice User Application Slice.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Reached target Shutdown.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Finished Exit the Session.
Dec 02 08:55:15 np0005541913.localdomain systemd[35843]: Reached target Exit the Session.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: user@1003.service: Consumed 4.824s CPU time, read 0B from disk, written 7.0K to disk.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 02 08:55:15 np0005541913.localdomain recover_tripleo_nova_virtqemud[99618]: 62312
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 02 08:55:15 np0005541913.localdomain systemd[1]: user-1003.slice: Consumed 7min 18.466s CPU time.
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:55:17 np0005541913.localdomain podman[99635]: 2025-12-02 08:55:17.467246356 +0000 UTC m=+0.089355238 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible)
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: tmp-crun.Sa8Wem.mount: Deactivated successfully.
Dec 02 08:55:17 np0005541913.localdomain podman[99622]: 2025-12-02 08:55:17.533864775 +0000 UTC m=+0.161526585 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.12)
Dec 02 08:55:17 np0005541913.localdomain podman[99627]: 2025-12-02 08:55:17.440598804 +0000 UTC m=+0.069549939 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Dec 02 08:55:17 np0005541913.localdomain podman[99635]: 2025-12-02 08:55:17.545000872 +0000 UTC m=+0.167109804 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com)
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:55:17 np0005541913.localdomain podman[99622]: 2025-12-02 08:55:17.592046019 +0000 UTC m=+0.219707829 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible)
Dec 02 08:55:17 np0005541913.localdomain podman[99620]: 2025-12-02 08:55:17.553113019 +0000 UTC m=+0.188093535 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, release=1761123044, 
container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:55:17 np0005541913.localdomain podman[99621]: 2025-12-02 08:55:17.608817677 +0000 UTC m=+0.242349784 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible)
Dec 02 08:55:17 np0005541913.localdomain podman[99627]: 2025-12-02 08:55:17.62615269 +0000 UTC m=+0.255103845 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:55:17 np0005541913.localdomain podman[99620]: 2025-12-02 08:55:17.639890777 +0000 UTC m=+0.274871273 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:55:17 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:55:18 np0005541913.localdomain podman[99621]: 2025-12-02 08:55:18.011452252 +0000 UTC m=+0.644984419 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute)
Dec 02 08:55:18 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:55:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:55:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:55:20 np0005541913.localdomain systemd[1]: tmp-crun.Hnqnmi.mount: Deactivated successfully.
Dec 02 08:55:20 np0005541913.localdomain podman[99739]: 2025-12-02 08:55:20.457724032 +0000 UTC m=+0.095645755 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 02 08:55:20 np0005541913.localdomain podman[99739]: 2025-12-02 08:55:20.471070769 +0000 UTC m=+0.108992502 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com)
Dec 02 08:55:20 np0005541913.localdomain podman[99739]: unhealthy
Dec 02 08:55:20 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:55:20 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:55:20 np0005541913.localdomain podman[99740]: 2025-12-02 08:55:20.436699621 +0000 UTC m=+0.075788225 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4)
Dec 02 08:55:20 np0005541913.localdomain podman[99740]: 2025-12-02 08:55:20.520259413 +0000 UTC m=+0.159347997 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, release=1761123044, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 02 08:55:20 np0005541913.localdomain podman[99740]: unhealthy
Dec 02 08:55:20 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:55:20 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:55:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:55:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:55:29 np0005541913.localdomain podman[99778]: 2025-12-02 08:55:29.436925149 +0000 UTC m=+0.077643094 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible)
Dec 02 08:55:29 np0005541913.localdomain podman[99778]: 2025-12-02 08:55:29.449934507 +0000 UTC m=+0.090652492 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:55:29 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:55:29 np0005541913.localdomain podman[99779]: 2025-12-02 08:55:29.493738748 +0000 UTC m=+0.130002734 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, 
batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, container_name=iscsid, build-date=2025-11-18T23:44:13Z)
Dec 02 08:55:29 np0005541913.localdomain podman[99779]: 2025-12-02 08:55:29.506862367 +0000 UTC m=+0.143126273 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 08:55:29 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:55:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:55:39 np0005541913.localdomain podman[99816]: 2025-12-02 08:55:39.420375472 +0000 UTC m=+0.066428895 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, version=17.1.12)
Dec 02 08:55:39 np0005541913.localdomain podman[99816]: 2025-12-02 08:55:39.64118043 +0000 UTC m=+0.287233803 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.12)
Dec 02 08:55:39 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:55:48 np0005541913.localdomain podman[99851]: 2025-12-02 08:55:48.437421801 +0000 UTC m=+0.069744244 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true)
Dec 02 08:55:48 np0005541913.localdomain podman[99846]: 2025-12-02 08:55:48.456569483 +0000 UTC m=+0.091361712 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-type=git, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 02 08:55:48 np0005541913.localdomain podman[99851]: 2025-12-02 08:55:48.492140173 +0000 UTC m=+0.124462676 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git)
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:55:48 np0005541913.localdomain podman[99855]: 2025-12-02 08:55:48.513795091 +0000 UTC m=+0.140648388 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-type=git, release=1761123044, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Dec 02 08:55:48 np0005541913.localdomain podman[99845]: 2025-12-02 08:55:48.564460134 +0000 UTC m=+0.203802064 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.12, 
name=rhosp17/openstack-cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:55:48 np0005541913.localdomain podman[99855]: 2025-12-02 08:55:48.569119229 +0000 UTC m=+0.195972596 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4)
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:55:48 np0005541913.localdomain podman[99847]: 2025-12-02 08:55:48.61557755 +0000 UTC m=+0.248366295 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044)
Dec 02 08:55:48 np0005541913.localdomain podman[99845]: 2025-12-02 08:55:48.645682494 +0000 UTC m=+0.285024504 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, url=https://www.redhat.com)
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:55:48 np0005541913.localdomain podman[99847]: 2025-12-02 08:55:48.665853533 +0000 UTC m=+0.298642238 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:55:48 np0005541913.localdomain podman[99846]: 2025-12-02 08:55:48.844932115 +0000 UTC m=+0.479724284 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 
nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Dec 02 08:55:48 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:55:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:55:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:55:51 np0005541913.localdomain systemd[1]: tmp-crun.kL5d4g.mount: Deactivated successfully.
Dec 02 08:55:51 np0005541913.localdomain podman[99964]: 2025-12-02 08:55:51.449253168 +0000 UTC m=+0.086409949 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, release=1761123044, container_name=ovn_controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, 
Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible)
Dec 02 08:55:51 np0005541913.localdomain podman[99964]: 2025-12-02 08:55:51.462104912 +0000 UTC m=+0.099261723 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true)
Dec 02 08:55:51 np0005541913.localdomain podman[99964]: unhealthy
Dec 02 08:55:51 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:55:51 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:55:51 np0005541913.localdomain podman[99963]: 2025-12-02 08:55:51.551849489 +0000 UTC m=+0.189917544 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, vcs-type=git, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:55:51 np0005541913.localdomain podman[99963]: 2025-12-02 08:55:51.591042115 +0000 UTC m=+0.229110130 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible)
Dec 02 08:55:51 np0005541913.localdomain podman[99963]: unhealthy
Dec 02 08:55:51 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:55:51 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:55:52 np0005541913.localdomain sudo[100003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:55:52 np0005541913.localdomain sudo[100003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:55:52 np0005541913.localdomain sudo[100003]: pam_unix(sudo:session): session closed for user root
Dec 02 08:55:52 np0005541913.localdomain sudo[100018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:55:52 np0005541913.localdomain sudo[100018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:55:52 np0005541913.localdomain sudo[100018]: pam_unix(sudo:session): session closed for user root
Dec 02 08:55:53 np0005541913.localdomain sudo[100064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:55:53 np0005541913.localdomain sudo[100064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:55:53 np0005541913.localdomain sudo[100064]: pam_unix(sudo:session): session closed for user root
Dec 02 08:56:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:56:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:56:00 np0005541913.localdomain podman[100080]: 2025-12-02 08:56:00.446802894 +0000 UTC m=+0.079831464 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Dec 02 08:56:00 np0005541913.localdomain podman[100080]: 2025-12-02 08:56:00.460967182 +0000 UTC m=+0.093995752 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044)
Dec 02 08:56:00 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:56:00 np0005541913.localdomain systemd[1]: tmp-crun.x5EoOI.mount: Deactivated successfully.
Dec 02 08:56:00 np0005541913.localdomain podman[100079]: 2025-12-02 08:56:00.558693443 +0000 UTC m=+0.193007727 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=collectd, io.openshift.expose-services=)
Dec 02 08:56:00 np0005541913.localdomain podman[100079]: 2025-12-02 08:56:00.574963188 +0000 UTC m=+0.209277472 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:56:00 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:56:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:56:10 np0005541913.localdomain podman[100115]: 2025-12-02 08:56:10.440198261 +0000 UTC m=+0.083961694 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:56:10 np0005541913.localdomain podman[100115]: 2025-12-02 08:56:10.664169913 +0000 UTC m=+0.307933416 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Dec 02 08:56:10 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:56:19 np0005541913.localdomain podman[100164]: 2025-12-02 08:56:19.459196351 +0000 UTC m=+0.080290906 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: tmp-crun.grzU4W.mount: Deactivated successfully.
Dec 02 08:56:19 np0005541913.localdomain podman[100148]: 2025-12-02 08:56:19.506065843 +0000 UTC m=+0.137065672 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:56:19 np0005541913.localdomain podman[100164]: 2025-12-02 08:56:19.512959057 +0000 UTC m=+0.134053612 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible)
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:56:19 np0005541913.localdomain podman[100148]: 2025-12-02 08:56:19.558955025 +0000 UTC m=+0.189954834 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute)
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:56:19 np0005541913.localdomain podman[100147]: 2025-12-02 08:56:19.613080422 +0000 UTC m=+0.247978575 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true)
Dec 02 08:56:19 np0005541913.localdomain podman[100150]: 2025-12-02 08:56:19.562986714 +0000 UTC m=+0.189047811 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Dec 02 08:56:19 np0005541913.localdomain podman[100146]: 2025-12-02 08:56:19.68454998 +0000 UTC m=+0.319218777 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, name=rhosp17/openstack-cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Dec 02 08:56:19 np0005541913.localdomain podman[100146]: 2025-12-02 08:56:19.694017104 +0000 UTC m=+0.328685851 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:56:19 np0005541913.localdomain podman[100150]: 2025-12-02 08:56:19.749257938 +0000 UTC m=+0.375319035 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute)
Dec 02 08:56:19 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:56:20 np0005541913.localdomain podman[100147]: 2025-12-02 08:56:20.003755666 +0000 UTC m=+0.638653849 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z)
Dec 02 08:56:20 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:56:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:56:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:56:22 np0005541913.localdomain systemd[1]: tmp-crun.e6xRUQ.mount: Deactivated successfully.
Dec 02 08:56:22 np0005541913.localdomain podman[100265]: 2025-12-02 08:56:22.460489297 +0000 UTC m=+0.101295147 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z)
Dec 02 08:56:22 np0005541913.localdomain podman[100265]: 2025-12-02 08:56:22.473086424 +0000 UTC m=+0.113892324 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4)
Dec 02 08:56:22 np0005541913.localdomain podman[100265]: unhealthy
Dec 02 08:56:22 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:56:22 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:56:22 np0005541913.localdomain podman[100266]: 2025-12-02 08:56:22.434311708 +0000 UTC m=+0.075731754 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller)
Dec 02 08:56:22 np0005541913.localdomain podman[100266]: 2025-12-02 08:56:22.520235772 +0000 UTC m=+0.161655738 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:56:22 np0005541913.localdomain podman[100266]: unhealthy
Dec 02 08:56:22 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:56:22 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:56:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:56:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:56:31 np0005541913.localdomain systemd[1]: tmp-crun.Bsza2p.mount: Deactivated successfully.
Dec 02 08:56:31 np0005541913.localdomain podman[100301]: 2025-12-02 08:56:31.446069657 +0000 UTC m=+0.081376364 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:56:31 np0005541913.localdomain podman[100301]: 2025-12-02 08:56:31.484061251 +0000 UTC m=+0.119367978 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-collectd-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 02 08:56:31 np0005541913.localdomain podman[100302]: 2025-12-02 08:56:31.495044705 +0000 UTC m=+0.126289684 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-iscsid)
Dec 02 08:56:31 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:56:31 np0005541913.localdomain podman[100302]: 2025-12-02 08:56:31.506956143 +0000 UTC m=+0.138201062 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, 
config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Dec 02 08:56:31 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:56:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:56:41 np0005541913.localdomain podman[100340]: 2025-12-02 08:56:41.457246409 +0000 UTC m=+0.096931759 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-type=git)
Dec 02 08:56:41 np0005541913.localdomain podman[100340]: 2025-12-02 08:56:41.68715773 +0000 UTC m=+0.326843100 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, distribution-scope=public, config_id=tripleo_step1, release=1761123044, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 
qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:56:41 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:56:46 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:56:46 np0005541913.localdomain recover_tripleo_nova_virtqemud[100370]: 62312
Dec 02 08:56:46 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:56:46 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:56:50 np0005541913.localdomain podman[100373]: 2025-12-02 08:56:50.438604444 +0000 UTC m=+0.074440579 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=nova_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: tmp-crun.po652o.mount: Deactivated successfully.
Dec 02 08:56:50 np0005541913.localdomain podman[100373]: 2025-12-02 08:56:50.501877584 +0000 UTC m=+0.137713659 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, config_id=tripleo_step5, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, architecture=x86_64)
Dec 02 08:56:50 np0005541913.localdomain podman[100371]: 2025-12-02 08:56:50.501284179 +0000 UTC m=+0.140980078 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com)
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:56:50 np0005541913.localdomain podman[100390]: 2025-12-02 08:56:50.556181034 +0000 UTC m=+0.181243151 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:56:50 np0005541913.localdomain podman[100390]: 2025-12-02 08:56:50.615066087 +0000 UTC m=+0.240128194 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:56:50 np0005541913.localdomain podman[100379]: 2025-12-02 08:56:50.613896926 +0000 UTC m=+0.243498055 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:56:50 np0005541913.localdomain podman[100372]: 2025-12-02 08:56:50.66532479 +0000 UTC m=+0.302329997 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:56:50 np0005541913.localdomain podman[100371]: 2025-12-02 08:56:50.686819354 +0000 UTC m=+0.326515303 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:32Z, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:56:50 np0005541913.localdomain podman[100379]: 2025-12-02 08:56:50.700315525 +0000 UTC m=+0.329916724 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Dec 02 08:56:50 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:56:51 np0005541913.localdomain podman[100372]: 2025-12-02 08:56:51.074051137 +0000 UTC m=+0.711056284 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_migration_target, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 02 08:56:51 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:56:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:56:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:56:53 np0005541913.localdomain systemd[1]: tmp-crun.IXkMds.mount: Deactivated successfully.
Dec 02 08:56:53 np0005541913.localdomain podman[100492]: 2025-12-02 08:56:53.435818661 +0000 UTC m=+0.079928526 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git)
Dec 02 08:56:53 np0005541913.localdomain podman[100492]: 2025-12-02 08:56:53.45525511 +0000 UTC m=+0.099365005 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:56:53 np0005541913.localdomain podman[100492]: unhealthy
Dec 02 08:56:53 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:56:53 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:56:53 np0005541913.localdomain systemd[1]: tmp-crun.wl4bjJ.mount: Deactivated successfully.
Dec 02 08:56:53 np0005541913.localdomain podman[100493]: 2025-12-02 08:56:53.551736657 +0000 UTC m=+0.188841545 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4)
Dec 02 08:56:53 np0005541913.localdomain podman[100493]: 2025-12-02 08:56:53.562711881 +0000 UTC m=+0.199816789 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-ovn-controller, version=17.1.12, managed_by=tripleo_ansible)
Dec 02 08:56:53 np0005541913.localdomain podman[100493]: unhealthy
Dec 02 08:56:53 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:56:53 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:56:53 np0005541913.localdomain sudo[100530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:56:53 np0005541913.localdomain sudo[100530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:56:53 np0005541913.localdomain sudo[100530]: pam_unix(sudo:session): session closed for user root
Dec 02 08:56:53 np0005541913.localdomain sudo[100546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:56:53 np0005541913.localdomain sudo[100546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:56:54 np0005541913.localdomain sudo[100546]: pam_unix(sudo:session): session closed for user root
Dec 02 08:56:55 np0005541913.localdomain sudo[100594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:56:55 np0005541913.localdomain sudo[100594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:56:55 np0005541913.localdomain sudo[100594]: pam_unix(sudo:session): session closed for user root
Dec 02 08:57:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:57:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:57:02 np0005541913.localdomain systemd[1]: tmp-crun.rvnIvj.mount: Deactivated successfully.
Dec 02 08:57:02 np0005541913.localdomain podman[100610]: 2025-12-02 08:57:02.432004092 +0000 UTC m=+0.076777752 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Dec 02 08:57:02 np0005541913.localdomain podman[100609]: 2025-12-02 08:57:02.445397049 +0000 UTC m=+0.089433490 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:57:02 np0005541913.localdomain podman[100609]: 2025-12-02 08:57:02.458193521 +0000 UTC m=+0.102229942 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:57:02 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:57:02 np0005541913.localdomain podman[100610]: 2025-12-02 08:57:02.472925705 +0000 UTC m=+0.117699385 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4)
Dec 02 08:57:02 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:57:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:57:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 4776 writes, 21K keys, 4776 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4776 writes, 569 syncs, 8.39 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:57:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:57:12 np0005541913.localdomain podman[100644]: 2025-12-02 08:57:12.443666985 +0000 UTC m=+0.082554945 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4)
Dec 02 08:57:12 np0005541913.localdomain podman[100644]: 2025-12-02 08:57:12.647978512 +0000 UTC m=+0.286866402 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:57:12 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:57:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 08:57:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.2 total, 600.0 interval
                                                          Cumulative writes: 5722 writes, 25K keys, 5722 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5722 writes, 780 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:57:21 np0005541913.localdomain podman[100672]: 2025-12-02 08:57:21.446801333 +0000 UTC m=+0.085214347 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:57:21 np0005541913.localdomain podman[100672]: 2025-12-02 08:57:21.455979539 +0000 UTC m=+0.094392573 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true)
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:57:21 np0005541913.localdomain podman[100686]: 2025-12-02 08:57:21.494573349 +0000 UTC m=+0.120722305 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:57:21 np0005541913.localdomain podman[100686]: 2025-12-02 08:57:21.520123671 +0000 UTC m=+0.146272667 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=)
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:57:21 np0005541913.localdomain podman[100673]: 2025-12-02 08:57:21.60502435 +0000 UTC m=+0.237935327 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4)
Dec 02 08:57:21 np0005541913.localdomain podman[100674]: 2025-12-02 08:57:21.65332338 +0000 UTC m=+0.284256723 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:57:21 np0005541913.localdomain podman[100674]: 2025-12-02 08:57:21.708920495 +0000 UTC m=+0.339853818 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:57:21 np0005541913.localdomain podman[100680]: 2025-12-02 08:57:21.720795742 +0000 UTC m=+0.346654871 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true)
Dec 02 08:57:21 np0005541913.localdomain podman[100680]: 2025-12-02 08:57:21.742886892 +0000 UTC m=+0.368746041 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, release=1761123044, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, tcib_managed=true)
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:57:21 np0005541913.localdomain podman[100673]: 2025-12-02 08:57:21.950076296 +0000 UTC m=+0.582987303 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, architecture=x86_64)
Dec 02 08:57:21 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:57:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:57:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:57:24 np0005541913.localdomain podman[100793]: 2025-12-02 08:57:24.417303637 +0000 UTC m=+0.063410906 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:57:24 np0005541913.localdomain podman[100794]: 2025-12-02 08:57:24.486136724 +0000 UTC m=+0.127543007 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 02 08:57:24 np0005541913.localdomain podman[100794]: 2025-12-02 08:57:24.501415953 +0000 UTC m=+0.142822236 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git)
Dec 02 08:57:24 np0005541913.localdomain podman[100794]: unhealthy
Dec 02 08:57:24 np0005541913.localdomain podman[100793]: 2025-12-02 08:57:24.510590758 +0000 UTC m=+0.156698027 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, container_name=ovn_metadata_agent, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Dec 02 08:57:24 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:57:24 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:57:24 np0005541913.localdomain podman[100793]: unhealthy
Dec 02 08:57:24 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:57:24 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:57:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:57:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:57:33 np0005541913.localdomain systemd[1]: tmp-crun.r607IL.mount: Deactivated successfully.
Dec 02 08:57:33 np0005541913.localdomain podman[100832]: 2025-12-02 08:57:33.440234462 +0000 UTC m=+0.080963503 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Dec 02 08:57:33 np0005541913.localdomain podman[100832]: 2025-12-02 08:57:33.448990396 +0000 UTC m=+0.089719457 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044)
Dec 02 08:57:33 np0005541913.localdomain podman[100833]: 2025-12-02 08:57:33.461333365 +0000 UTC m=+0.094465054 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, container_name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 02 08:57:33 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:57:33 np0005541913.localdomain podman[100833]: 2025-12-02 08:57:33.468851637 +0000 UTC m=+0.101983346 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, release=1761123044, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, container_name=iscsid, batch=17.1_20251118.1)
Dec 02 08:57:33 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:57:34 np0005541913.localdomain systemd[1]: tmp-crun.ahez0k.mount: Deactivated successfully.
Dec 02 08:57:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:57:43 np0005541913.localdomain systemd[1]: tmp-crun.I9ytkW.mount: Deactivated successfully.
Dec 02 08:57:43 np0005541913.localdomain podman[100869]: 2025-12-02 08:57:43.436770323 +0000 UTC m=+0.081688433 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, release=1761123044, batch=17.1_20251118.1)
Dec 02 08:57:43 np0005541913.localdomain podman[100869]: 2025-12-02 08:57:43.633758805 +0000 UTC m=+0.278676975 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, tcib_managed=true)
Dec 02 08:57:43 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:57:52 np0005541913.localdomain podman[100898]: 2025-12-02 08:57:52.45747888 +0000 UTC m=+0.093553239 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, io.buildah.version=1.41.4, container_name=logrotate_crond, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public)
Dec 02 08:57:52 np0005541913.localdomain podman[100898]: 2025-12-02 08:57:52.469965783 +0000 UTC m=+0.106040152 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: tmp-crun.64npVu.mount: Deactivated successfully.
Dec 02 08:57:52 np0005541913.localdomain podman[100901]: 2025-12-02 08:57:52.551893732 +0000 UTC m=+0.178426747 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:57:52 np0005541913.localdomain podman[100899]: 2025-12-02 08:57:52.573223412 +0000 UTC m=+0.204445312 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044)
Dec 02 08:57:52 np0005541913.localdomain podman[100901]: 2025-12-02 08:57:52.582019496 +0000 UTC m=+0.208552481 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4)
Dec 02 08:57:52 np0005541913.localdomain podman[100907]: 2025-12-02 08:57:52.616251421 +0000 UTC m=+0.239352544 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:57:52 np0005541913.localdomain podman[100907]: 2025-12-02 08:57:52.650136516 +0000 UTC m=+0.273237699 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:57:52 np0005541913.localdomain podman[100900]: 2025-12-02 08:57:52.721865722 +0000 UTC m=+0.351563361 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 02 08:57:52 np0005541913.localdomain podman[100900]: 2025-12-02 08:57:52.776154602 +0000 UTC m=+0.405852251 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:36:58Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, managed_by=tripleo_ansible)
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:57:52 np0005541913.localdomain podman[100899]: 2025-12-02 08:57:52.912925856 +0000 UTC m=+0.544147726 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:57:52 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:57:53 np0005541913.localdomain systemd[1]: tmp-crun.DH5JAU.mount: Deactivated successfully.
Dec 02 08:57:55 np0005541913.localdomain sudo[101020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:57:55 np0005541913.localdomain sudo[101020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:57:55 np0005541913.localdomain sudo[101020]: pam_unix(sudo:session): session closed for user root
Dec 02 08:57:55 np0005541913.localdomain recover_tripleo_nova_virtqemud[101048]: 62312
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:57:55 np0005541913.localdomain sudo[101049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:57:55 np0005541913.localdomain sudo[101049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: tmp-crun.F4u1rr.mount: Deactivated successfully.
Dec 02 08:57:55 np0005541913.localdomain podman[101035]: 2025-12-02 08:57:55.320840741 +0000 UTC m=+0.083815320 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:57:55 np0005541913.localdomain podman[101035]: 2025-12-02 08:57:55.357061089 +0000 UTC m=+0.120035718 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, 
io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, release=1761123044, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: tmp-crun.JoyMvF.mount: Deactivated successfully.
Dec 02 08:57:55 np0005541913.localdomain podman[101035]: unhealthy
Dec 02 08:57:55 np0005541913.localdomain podman[101034]: 2025-12-02 08:57:55.377828433 +0000 UTC m=+0.141325155 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step4)
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:57:55 np0005541913.localdomain podman[101034]: 2025-12-02 08:57:55.421985293 +0000 UTC m=+0.185482045 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64)
Dec 02 08:57:55 np0005541913.localdomain podman[101034]: unhealthy
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:57:55 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:57:55 np0005541913.localdomain sudo[101049]: pam_unix(sudo:session): session closed for user root
Dec 02 08:57:59 np0005541913.localdomain sudo[101122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:57:59 np0005541913.localdomain sudo[101122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:57:59 np0005541913.localdomain sudo[101122]: pam_unix(sudo:session): session closed for user root
Dec 02 08:58:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:58:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:58:04 np0005541913.localdomain systemd[1]: tmp-crun.Yw8Qr1.mount: Deactivated successfully.
Dec 02 08:58:04 np0005541913.localdomain podman[101137]: 2025-12-02 08:58:04.43934414 +0000 UTC m=+0.077017808 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, description=Red 
Hat OpenStack Platform 17.1 collectd, vcs-type=git, release=1761123044, container_name=collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd)
Dec 02 08:58:04 np0005541913.localdomain podman[101137]: 2025-12-02 08:58:04.44906326 +0000 UTC m=+0.086736938 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-collectd)
Dec 02 08:58:04 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:58:04 np0005541913.localdomain podman[101138]: 2025-12-02 08:58:04.488047281 +0000 UTC m=+0.122299328 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z)
Dec 02 08:58:04 np0005541913.localdomain podman[101138]: 2025-12-02 08:58:04.497102783 +0000 UTC m=+0.131354830 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Dec 02 08:58:04 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:58:05 np0005541913.localdomain systemd[1]: tmp-crun.w3stW2.mount: Deactivated successfully.
Dec 02 08:58:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:58:14 np0005541913.localdomain podman[101174]: 2025-12-02 08:58:14.445017366 +0000 UTC m=+0.085773242 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:58:14 np0005541913.localdomain podman[101174]: 2025-12-02 08:58:14.658277232 +0000 UTC m=+0.299033188 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=metrics_qdr, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible)
Dec 02 08:58:14 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:58:23 np0005541913.localdomain podman[101211]: 2025-12-02 08:58:23.464887631 +0000 UTC m=+0.088383702 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: tmp-crun.N6KBqY.mount: Deactivated successfully.
Dec 02 08:58:23 np0005541913.localdomain podman[101206]: 2025-12-02 08:58:23.517782354 +0000 UTC m=+0.145033735 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:58:23 np0005541913.localdomain podman[101211]: 2025-12-02 08:58:23.522000136 +0000 UTC m=+0.145496217 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:58:23 np0005541913.localdomain podman[101206]: 2025-12-02 08:58:23.572150546 +0000 UTC m=+0.199401937 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git)
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:58:23 np0005541913.localdomain podman[101205]: 2025-12-02 08:58:23.614733173 +0000 UTC m=+0.245183340 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, config_id=tripleo_step5, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:58:23 np0005541913.localdomain podman[101204]: 2025-12-02 08:58:23.575898925 +0000 UTC m=+0.208707625 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com)
Dec 02 08:58:23 np0005541913.localdomain podman[101205]: 2025-12-02 08:58:23.644021616 +0000 UTC m=+0.274471783 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:58:23 np0005541913.localdomain podman[101203]: 2025-12-02 08:58:23.724176706 +0000 UTC m=+0.357963603 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:58:23 np0005541913.localdomain podman[101203]: 2025-12-02 08:58:23.733967528 +0000 UTC m=+0.367754495 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:58:23 np0005541913.localdomain podman[101204]: 2025-12-02 08:58:23.908058308 +0000 UTC m=+0.540867018 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, 
build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.12, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true)
Dec 02 08:58:23 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:58:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:58:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:58:26 np0005541913.localdomain systemd[1]: tmp-crun.EqB5Rt.mount: Deactivated successfully.
Dec 02 08:58:26 np0005541913.localdomain podman[101325]: 2025-12-02 08:58:26.443074199 +0000 UTC m=+0.084734815 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Dec 02 08:58:26 np0005541913.localdomain podman[101325]: 2025-12-02 08:58:26.460358771 +0000 UTC m=+0.102019357 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4)
Dec 02 08:58:26 np0005541913.localdomain podman[101325]: unhealthy
Dec 02 08:58:26 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:58:26 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:58:26 np0005541913.localdomain podman[101326]: 2025-12-02 08:58:26.559329684 +0000 UTC m=+0.193789607 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Dec 02 08:58:26 np0005541913.localdomain podman[101326]: 2025-12-02 08:58:26.600951466 +0000 UTC m=+0.235411389 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 08:58:26 np0005541913.localdomain podman[101326]: unhealthy
Dec 02 08:58:26 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:58:26 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:58:27 np0005541913.localdomain systemd[1]: tmp-crun.jnBu83.mount: Deactivated successfully.
Dec 02 08:58:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:58:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:58:35 np0005541913.localdomain podman[101365]: 2025-12-02 08:58:35.447583792 +0000 UTC m=+0.088643129 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:58:35 np0005541913.localdomain podman[101365]: 2025-12-02 08:58:35.455670879 +0000 UTC m=+0.096730206 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, container_name=collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:58:35 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:58:35 np0005541913.localdomain systemd[1]: tmp-crun.hq6IP0.mount: Deactivated successfully.
Dec 02 08:58:35 np0005541913.localdomain podman[101366]: 2025-12-02 08:58:35.55903759 +0000 UTC m=+0.195968306 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-type=git, container_name=iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 08:58:35 np0005541913.localdomain podman[101366]: 2025-12-02 08:58:35.568851261 +0000 UTC m=+0.205781947 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, 
release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 08:58:35 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:58:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:58:45 np0005541913.localdomain podman[101402]: 2025-12-02 08:58:45.442331906 +0000 UTC m=+0.086499392 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible)
Dec 02 08:58:45 np0005541913.localdomain podman[101402]: 2025-12-02 08:58:45.675883504 +0000 UTC m=+0.320051020 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:58:45 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: tmp-crun.tx66Xy.mount: Deactivated successfully.
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: tmp-crun.BDXNw7.mount: Deactivated successfully.
Dec 02 08:58:54 np0005541913.localdomain podman[101433]: 2025-12-02 08:58:54.472909106 +0000 UTC m=+0.099868008 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=)
Dec 02 08:58:54 np0005541913.localdomain podman[101432]: 2025-12-02 08:58:54.483310754 +0000 UTC m=+0.119846992 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 02 08:58:54 np0005541913.localdomain podman[101431]: 2025-12-02 08:58:54.441914108 +0000 UTC m=+0.081425256 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 02 08:58:54 np0005541913.localdomain podman[101441]: 2025-12-02 08:58:54.569754393 +0000 UTC m=+0.195232246 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:58:54 np0005541913.localdomain podman[101431]: 2025-12-02 08:58:54.576341009 +0000 UTC m=+0.215852197 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=logrotate_crond, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:58:54 np0005541913.localdomain podman[101445]: 2025-12-02 08:58:54.621198067 +0000 UTC m=+0.240136185 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4)
Dec 02 08:58:54 np0005541913.localdomain podman[101441]: 2025-12-02 08:58:54.630060654 +0000 UTC m=+0.255538537 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute)
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:58:54 np0005541913.localdomain podman[101433]: 2025-12-02 08:58:54.645395843 +0000 UTC m=+0.272354725 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:58:54 np0005541913.localdomain podman[101445]: 2025-12-02 08:58:54.676850553 +0000 UTC m=+0.295788651 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:58:54 np0005541913.localdomain podman[101432]: 2025-12-02 08:58:54.837169145 +0000 UTC m=+0.473705473 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:58:54 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:58:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:58:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:58:57 np0005541913.localdomain podman[101549]: 2025-12-02 08:58:57.459405597 +0000 UTC m=+0.092279316 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, container_name=ovn_controller, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:58:57 np0005541913.localdomain podman[101549]: 2025-12-02 08:58:57.501048858 +0000 UTC m=+0.133922597 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12)
Dec 02 08:58:57 np0005541913.localdomain podman[101549]: unhealthy
Dec 02 08:58:57 np0005541913.localdomain podman[101548]: 2025-12-02 08:58:57.518021081 +0000 UTC m=+0.153897010 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 08:58:57 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:58:57 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:58:57 np0005541913.localdomain podman[101548]: 2025-12-02 08:58:57.540143062 +0000 UTC m=+0.176019021 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Dec 02 08:58:57 np0005541913.localdomain podman[101548]: unhealthy
Dec 02 08:58:57 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:58:57 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:58:59 np0005541913.localdomain sudo[101587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:58:59 np0005541913.localdomain sudo[101587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:58:59 np0005541913.localdomain sudo[101587]: pam_unix(sudo:session): session closed for user root
Dec 02 08:58:59 np0005541913.localdomain sudo[101602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 08:58:59 np0005541913.localdomain sudo[101602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:59:00 np0005541913.localdomain sudo[101602]: pam_unix(sudo:session): session closed for user root
Dec 02 08:59:00 np0005541913.localdomain sudo[101638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 08:59:00 np0005541913.localdomain sudo[101638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:59:00 np0005541913.localdomain sudo[101638]: pam_unix(sudo:session): session closed for user root
Dec 02 08:59:00 np0005541913.localdomain sudo[101653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 08:59:00 np0005541913.localdomain sudo[101653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:59:00 np0005541913.localdomain sudo[101653]: pam_unix(sudo:session): session closed for user root
Dec 02 08:59:01 np0005541913.localdomain sudo[101700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 08:59:01 np0005541913.localdomain sudo[101700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 08:59:01 np0005541913.localdomain sudo[101700]: pam_unix(sudo:session): session closed for user root
Dec 02 08:59:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:59:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:59:06 np0005541913.localdomain podman[101715]: 2025-12-02 08:59:06.434724999 +0000 UTC m=+0.066268740 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Dec 02 08:59:06 np0005541913.localdomain podman[101715]: 2025-12-02 08:59:06.44672056 +0000 UTC m=+0.078264271 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 08:59:06 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:59:06 np0005541913.localdomain systemd[1]: tmp-crun.0gDZlO.mount: Deactivated successfully.
Dec 02 08:59:06 np0005541913.localdomain podman[101716]: 2025-12-02 08:59:06.500192418 +0000 UTC m=+0.132368306 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red 
Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-iscsid, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3)
Dec 02 08:59:06 np0005541913.localdomain podman[101716]: 2025-12-02 08:59:06.53809937 +0000 UTC m=+0.170275258 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:59:06 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:59:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:59:16 np0005541913.localdomain podman[101754]: 2025-12-02 08:59:16.450900135 +0000 UTC m=+0.091599638 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public)
Dec 02 08:59:16 np0005541913.localdomain podman[101754]: 2025-12-02 08:59:16.678037482 +0000 UTC m=+0.318736955 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 08:59:16 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 08:59:25 np0005541913.localdomain recover_tripleo_nova_virtqemud[101815]: 62312
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: tmp-crun.v7amKo.mount: Deactivated successfully.
Dec 02 08:59:25 np0005541913.localdomain podman[101786]: 2025-12-02 08:59:25.45772013 +0000 UTC m=+0.091616168 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, version=17.1.12, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:59:25 np0005541913.localdomain podman[101786]: 2025-12-02 08:59:25.482237616 +0000 UTC m=+0.116133654 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, batch=17.1_20251118.1)
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:59:25 np0005541913.localdomain podman[101785]: 2025-12-02 08:59:25.532534529 +0000 UTC m=+0.169351905 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 08:59:25 np0005541913.localdomain podman[101784]: 2025-12-02 08:59:25.436576455 +0000 UTC m=+0.079293548 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 08:59:25 np0005541913.localdomain podman[101787]: 2025-12-02 08:59:25.487782794 +0000 UTC m=+0.121761254 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 02 08:59:25 np0005541913.localdomain podman[101784]: 2025-12-02 08:59:25.571083598 +0000 UTC m=+0.213800731 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4)
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:59:25 np0005541913.localdomain podman[101787]: 2025-12-02 08:59:25.620875728 +0000 UTC m=+0.254854198 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044)
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:59:25 np0005541913.localdomain podman[101793]: 2025-12-02 08:59:25.696737754 +0000 UTC m=+0.330845648 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 08:59:25 np0005541913.localdomain podman[101793]: 2025-12-02 08:59:25.742937739 +0000 UTC m=+0.377045623 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.12, distribution-scope=public, release=1761123044, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:59:25 np0005541913.localdomain podman[101785]: 2025-12-02 08:59:25.871435441 +0000 UTC m=+0.508252837 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12)
Dec 02 08:59:25 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:59:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:59:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:59:28 np0005541913.localdomain podman[101908]: 2025-12-02 08:59:28.431722007 +0000 UTC m=+0.075891528 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git)
Dec 02 08:59:28 np0005541913.localdomain podman[101908]: 2025-12-02 08:59:28.445974148 +0000 UTC m=+0.090143619 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z)
Dec 02 08:59:28 np0005541913.localdomain podman[101908]: unhealthy
Dec 02 08:59:28 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:59:28 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:59:28 np0005541913.localdomain systemd[1]: tmp-crun.Kcmg3r.mount: Deactivated successfully.
Dec 02 08:59:28 np0005541913.localdomain podman[101909]: 2025-12-02 08:59:28.555530724 +0000 UTC m=+0.196833678 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, 
build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 08:59:28 np0005541913.localdomain podman[101909]: 2025-12-02 08:59:28.575077036 +0000 UTC m=+0.216380010 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true)
Dec 02 08:59:28 np0005541913.localdomain podman[101909]: unhealthy
Dec 02 08:59:28 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:59:28 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 08:59:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 08:59:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 08:59:37 np0005541913.localdomain podman[101946]: 2025-12-02 08:59:37.451527678 +0000 UTC m=+0.089876662 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Dec 02 08:59:37 np0005541913.localdomain podman[101947]: 2025-12-02 08:59:37.510537074 +0000 UTC m=+0.143417102 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 08:59:37 np0005541913.localdomain podman[101946]: 2025-12-02 08:59:37.534215017 +0000 UTC m=+0.172563981 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible)
Dec 02 08:59:37 np0005541913.localdomain podman[101947]: 2025-12-02 08:59:37.544135321 +0000 UTC m=+0.177015329 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid)
Dec 02 08:59:37 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 08:59:37 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 08:59:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 08:59:47 np0005541913.localdomain systemd[1]: tmp-crun.78sWRn.mount: Deactivated successfully.
Dec 02 08:59:47 np0005541913.localdomain podman[101987]: 2025-12-02 08:59:47.438684099 +0000 UTC m=+0.083133071 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Dec 02 08:59:47 np0005541913.localdomain podman[101987]: 2025-12-02 08:59:47.652161191 +0000 UTC m=+0.296610203 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, distribution-scope=public, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 08:59:47 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: tmp-crun.z2tkb6.mount: Deactivated successfully.
Dec 02 08:59:56 np0005541913.localdomain podman[102016]: 2025-12-02 08:59:56.466820335 +0000 UTC m=+0.104344478 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, vcs-type=git, name=rhosp17/openstack-cron, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public)
Dec 02 08:59:56 np0005541913.localdomain podman[102016]: 2025-12-02 08:59:56.47601376 +0000 UTC m=+0.113537923 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 08:59:56 np0005541913.localdomain podman[102025]: 2025-12-02 08:59:56.524700561 +0000 UTC m=+0.149694870 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible)
Dec 02 08:59:56 np0005541913.localdomain podman[102018]: 2025-12-02 08:59:56.572686253 +0000 UTC m=+0.203193379 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 08:59:56 np0005541913.localdomain podman[102025]: 2025-12-02 08:59:56.60328145 +0000 UTC m=+0.228275759 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 08:59:56 np0005541913.localdomain podman[102017]: 2025-12-02 08:59:56.619550285 +0000 UTC m=+0.253367579 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 02 08:59:56 np0005541913.localdomain podman[102019]: 2025-12-02 08:59:56.681692975 +0000 UTC m=+0.308195023 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 08:59:56 np0005541913.localdomain podman[102018]: 2025-12-02 08:59:56.698598846 +0000 UTC m=+0.329105962 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true, distribution-scope=public, config_id=tripleo_step5, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 08:59:56 np0005541913.localdomain podman[102019]: 2025-12-02 08:59:56.706933979 +0000 UTC m=+0.333436057 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, 
vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 08:59:56 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 08:59:57 np0005541913.localdomain podman[102017]: 2025-12-02 08:59:57.016326582 +0000 UTC m=+0.650143886 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 02 08:59:57 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 08:59:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 08:59:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 08:59:59 np0005541913.localdomain podman[102138]: 2025-12-02 08:59:59.425830861 +0000 UTC m=+0.071799979 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Dec 02 08:59:59 np0005541913.localdomain systemd[1]: tmp-crun.lVQovS.mount: Deactivated successfully.
Dec 02 08:59:59 np0005541913.localdomain podman[102139]: 2025-12-02 08:59:59.488523476 +0000 UTC m=+0.128014971 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 02 08:59:59 np0005541913.localdomain podman[102138]: 2025-12-02 08:59:59.513908894 +0000 UTC m=+0.159878072 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 08:59:59 np0005541913.localdomain podman[102138]: unhealthy
Dec 02 08:59:59 np0005541913.localdomain podman[102139]: 2025-12-02 08:59:59.525114193 +0000 UTC m=+0.164605658 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64)
Dec 02 08:59:59 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:59:59 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 08:59:59 np0005541913.localdomain podman[102139]: unhealthy
Dec 02 08:59:59 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 08:59:59 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:00:01 np0005541913.localdomain CROND[102179]: (root) CMD (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Dec 02 09:00:01 np0005541913.localdomain sudo[102182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:00:01 np0005541913.localdomain sudo[102182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:00:01 np0005541913.localdomain sudo[102182]: pam_unix(sudo:session): session closed for user root
Dec 02 09:00:01 np0005541913.localdomain sudo[102197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:00:01 np0005541913.localdomain sudo[102197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:00:02 np0005541913.localdomain sudo[102197]: pam_unix(sudo:session): session closed for user root
Dec 02 09:00:03 np0005541913.localdomain sudo[102245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:00:03 np0005541913.localdomain sudo[102245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:00:03 np0005541913.localdomain sudo[102245]: pam_unix(sudo:session): session closed for user root
Dec 02 09:00:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:00:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:00:08 np0005541913.localdomain podman[102260]: 2025-12-02 09:00:08.47057145 +0000 UTC m=+0.106458045 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, name=rhosp17/openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 09:00:08 np0005541913.localdomain podman[102261]: 2025-12-02 09:00:08.433827018 +0000 UTC m=+0.072425615 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, 
com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=iscsid)
Dec 02 09:00:08 np0005541913.localdomain podman[102261]: 2025-12-02 09:00:08.51251209 +0000 UTC m=+0.151110627 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Dec 02 09:00:08 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:00:08 np0005541913.localdomain podman[102260]: 2025-12-02 09:00:08.536097341 +0000 UTC m=+0.171983946 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, 
version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Dec 02 09:00:08 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:00:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:00:18 np0005541913.localdomain podman[102300]: 2025-12-02 09:00:18.440384148 +0000 UTC m=+0.084645962 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.buildah.version=1.41.4, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 09:00:18 np0005541913.localdomain podman[102300]: 2025-12-02 09:00:18.641372617 +0000 UTC m=+0.285634451 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:00:18 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:00:19 np0005541913.localdomain CROND[102178]: (root) CMDEND (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: tmp-crun.6V3C6q.mount: Deactivated successfully.
Dec 02 09:00:27 np0005541913.localdomain podman[102333]: 2025-12-02 09:00:27.51558692 +0000 UTC m=+0.150012548 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 09:00:27 np0005541913.localdomain podman[102332]: 2025-12-02 09:00:27.51635988 +0000 UTC m=+0.153101900 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, release=1761123044, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 09:00:27 np0005541913.localdomain podman[102337]: 2025-12-02 09:00:27.470464734 +0000 UTC m=+0.094440393 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4)
Dec 02 09:00:27 np0005541913.localdomain podman[102337]: 2025-12-02 09:00:27.555840455 +0000 UTC m=+0.179816084 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, release=1761123044, 
vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:00:27 np0005541913.localdomain podman[102332]: 2025-12-02 09:00:27.601087443 +0000 UTC m=+0.237829383 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:00:27 np0005541913.localdomain podman[102334]: 2025-12-02 09:00:27.60582241 +0000 UTC m=+0.241241564 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, architecture=x86_64)
Dec 02 09:00:27 np0005541913.localdomain podman[102334]: 2025-12-02 09:00:27.689362262 +0000 UTC m=+0.324781426 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 09:00:27 np0005541913.localdomain podman[102351]: 2025-12-02 09:00:27.557868699 +0000 UTC m=+0.181527040 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12)
Dec 02 09:00:27 np0005541913.localdomain podman[102351]: 2025-12-02 09:00:27.743061656 +0000 UTC m=+0.366720037 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 09:00:27 np0005541913.localdomain podman[102333]: 2025-12-02 09:00:27.915980124 +0000 UTC m=+0.550405722 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 09:00:27 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:00:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:00:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:00:30 np0005541913.localdomain podman[102452]: 2025-12-02 09:00:30.44854432 +0000 UTC m=+0.087692614 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 09:00:30 np0005541913.localdomain systemd[1]: tmp-crun.hVeYl5.mount: Deactivated successfully.
Dec 02 09:00:30 np0005541913.localdomain podman[102453]: 2025-12-02 09:00:30.496704917 +0000 UTC m=+0.133903998 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:00:30 np0005541913.localdomain podman[102452]: 2025-12-02 09:00:30.517239295 +0000 UTC m=+0.156387559 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 02 09:00:30 np0005541913.localdomain podman[102452]: unhealthy
Dec 02 09:00:30 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:00:30 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:00:30 np0005541913.localdomain podman[102453]: 2025-12-02 09:00:30.535864283 +0000 UTC m=+0.173063334 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 09:00:30 np0005541913.localdomain podman[102453]: unhealthy
Dec 02 09:00:30 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:00:30 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:00:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:00:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:00:39 np0005541913.localdomain systemd[1]: tmp-crun.okNTqY.mount: Deactivated successfully.
Dec 02 09:00:39 np0005541913.localdomain podman[102494]: 2025-12-02 09:00:39.437995159 +0000 UTC m=+0.083856760 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 02 09:00:39 np0005541913.localdomain podman[102495]: 2025-12-02 09:00:39.446426184 +0000 UTC m=+0.087335003 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, config_id=tripleo_step3, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:00:39 np0005541913.localdomain podman[102495]: 2025-12-02 09:00:39.459063732 +0000 UTC m=+0.099972551 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 02 09:00:39 np0005541913.localdomain podman[102494]: 2025-12-02 09:00:39.471601377 +0000 UTC m=+0.117462978 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, distribution-scope=public, vcs-type=git, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Dec 02 09:00:39 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:00:39 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:00:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:00:49 np0005541913.localdomain podman[102532]: 2025-12-02 09:00:49.441560198 +0000 UTC m=+0.085665289 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 09:00:49 np0005541913.localdomain podman[102532]: 2025-12-02 09:00:49.658988946 +0000 UTC m=+0.303094057 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, release=1761123044)
Dec 02 09:00:49 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:00:58 np0005541913.localdomain podman[102569]: 2025-12-02 09:00:58.46665954 +0000 UTC m=+0.094415443 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, container_name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:00:58 np0005541913.localdomain podman[102563]: 2025-12-02 09:00:58.514774205 +0000 UTC m=+0.153415068 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond)
Dec 02 09:00:58 np0005541913.localdomain podman[102569]: 2025-12-02 09:00:58.525154853 +0000 UTC m=+0.152910766 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, release=1761123044, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 09:00:58 np0005541913.localdomain podman[102564]: 2025-12-02 09:00:58.557860437 +0000 UTC m=+0.192877473 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4)
Dec 02 09:00:58 np0005541913.localdomain podman[102574]: 2025-12-02 09:00:58.588501585 +0000 UTC m=+0.213962456 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 09:00:58 np0005541913.localdomain podman[102563]: 2025-12-02 09:00:58.597094865 +0000 UTC m=+0.235735688 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container)
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:00:58 np0005541913.localdomain podman[102565]: 2025-12-02 09:00:58.657011145 +0000 UTC m=+0.288492537 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step5, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4)
Dec 02 09:00:58 np0005541913.localdomain podman[102565]: 2025-12-02 09:00:58.683086081 +0000 UTC m=+0.314567493 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 09:00:58 np0005541913.localdomain podman[102574]: 2025-12-02 09:00:58.735168753 +0000 UTC m=+0.360629584 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 09:00:58 np0005541913.localdomain podman[102564]: 2025-12-02 09:00:58.920059621 +0000 UTC m=+0.555076667 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 09:00:58 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:00:59 np0005541913.localdomain systemd[1]: tmp-crun.zV8sQW.mount: Deactivated successfully.
Dec 02 09:01:01 np0005541913.localdomain CROND[102681]: (root) CMD (run-parts /etc/cron.hourly)
Dec 02 09:01:01 np0005541913.localdomain run-parts[102684]: (/etc/cron.hourly) starting 0anacron
Dec 02 09:01:01 np0005541913.localdomain run-parts[102690]: (/etc/cron.hourly) finished 0anacron
Dec 02 09:01:01 np0005541913.localdomain CROND[102680]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 02 09:01:01 np0005541913.localdomain CROND[102692]: (root) CMD (run-parts /etc/cron.hourly)
Dec 02 09:01:01 np0005541913.localdomain run-parts[102695]: (/etc/cron.hourly) starting 0anacron
Dec 02 09:01:01 np0005541913.localdomain anacron[102703]: Anacron started on 2025-12-02
Dec 02 09:01:01 np0005541913.localdomain anacron[102703]: Will run job `cron.daily' in 26 min.
Dec 02 09:01:01 np0005541913.localdomain anacron[102703]: Will run job `cron.weekly' in 46 min.
Dec 02 09:01:01 np0005541913.localdomain anacron[102703]: Will run job `cron.monthly' in 66 min.
Dec 02 09:01:01 np0005541913.localdomain anacron[102703]: Jobs will be executed sequentially
Dec 02 09:01:01 np0005541913.localdomain run-parts[102705]: (/etc/cron.hourly) finished 0anacron
Dec 02 09:01:01 np0005541913.localdomain CROND[102691]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 02 09:01:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:01:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:01:01 np0005541913.localdomain systemd[1]: tmp-crun.U9W07d.mount: Deactivated successfully.
Dec 02 09:01:01 np0005541913.localdomain podman[102707]: 2025-12-02 09:01:01.440336799 +0000 UTC m=+0.071166073 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Dec 02 09:01:01 np0005541913.localdomain podman[102706]: 2025-12-02 09:01:01.459976733 +0000 UTC m=+0.089987225 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:01:01 np0005541913.localdomain podman[102707]: 2025-12-02 09:01:01.488244398 +0000 UTC m=+0.119073702 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 09:01:01 np0005541913.localdomain podman[102707]: unhealthy
Dec 02 09:01:01 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:01:01 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:01:01 np0005541913.localdomain podman[102706]: 2025-12-02 09:01:01.502789287 +0000 UTC m=+0.132799769 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:01:01 np0005541913.localdomain podman[102706]: unhealthy
Dec 02 09:01:01 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:01:01 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:01:02 np0005541913.localdomain systemd[1]: tmp-crun.49FuzX.mount: Deactivated successfully.
Dec 02 09:01:03 np0005541913.localdomain sudo[102746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:01:03 np0005541913.localdomain sudo[102746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:01:03 np0005541913.localdomain sudo[102746]: pam_unix(sudo:session): session closed for user root
Dec 02 09:01:03 np0005541913.localdomain sudo[102761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:01:03 np0005541913.localdomain sudo[102761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:01:04 np0005541913.localdomain podman[102848]: 2025-12-02 09:01:04.103219275 +0000 UTC m=+0.078239901 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, version=7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vcs-type=git, release=1763362218, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:01:04 np0005541913.localdomain podman[102848]: 2025-12-02 09:01:04.20190213 +0000 UTC m=+0.176922746 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, name=rhceph, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True)
Dec 02 09:01:04 np0005541913.localdomain sudo[102761]: pam_unix(sudo:session): session closed for user root
Dec 02 09:01:04 np0005541913.localdomain sudo[102917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:01:04 np0005541913.localdomain sudo[102917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:01:04 np0005541913.localdomain sudo[102917]: pam_unix(sudo:session): session closed for user root
Dec 02 09:01:04 np0005541913.localdomain sudo[102932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:01:04 np0005541913.localdomain sudo[102932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:01:05 np0005541913.localdomain sudo[102932]: pam_unix(sudo:session): session closed for user root
Dec 02 09:01:05 np0005541913.localdomain sudo[102978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:01:05 np0005541913.localdomain sudo[102978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:01:05 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 09:01:05 np0005541913.localdomain sudo[102978]: pam_unix(sudo:session): session closed for user root
Dec 02 09:01:06 np0005541913.localdomain recover_tripleo_nova_virtqemud[102994]: 62312
Dec 02 09:01:06 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 09:01:06 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 09:01:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:01:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:01:10 np0005541913.localdomain podman[102995]: 2025-12-02 09:01:10.503803737 +0000 UTC m=+0.136338983 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-collectd)
Dec 02 09:01:10 np0005541913.localdomain podman[102995]: 2025-12-02 09:01:10.514111692 +0000 UTC m=+0.146646948 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=collectd, config_id=tripleo_step3, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 02 09:01:10 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:01:10 np0005541913.localdomain podman[102996]: 2025-12-02 09:01:10.467908968 +0000 UTC m=+0.099074917 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1)
Dec 02 09:01:10 np0005541913.localdomain podman[102996]: 2025-12-02 09:01:10.60203172 +0000 UTC m=+0.233197719 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Dec 02 09:01:10 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:01:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:01:20 np0005541913.localdomain podman[103035]: 2025-12-02 09:01:20.442253736 +0000 UTC m=+0.087016925 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Dec 02 09:01:20 np0005541913.localdomain podman[103035]: 2025-12-02 09:01:20.660974518 +0000 UTC m=+0.305737717 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:01:20 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: tmp-crun.TJK4PH.mount: Deactivated successfully.
Dec 02 09:01:29 np0005541913.localdomain podman[103065]: 2025-12-02 09:01:29.482991967 +0000 UTC m=+0.122287097 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, com.redhat.component=openstack-cron-container, container_name=logrotate_crond)
Dec 02 09:01:29 np0005541913.localdomain podman[103065]: 2025-12-02 09:01:29.489984153 +0000 UTC m=+0.129279323 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:01:29 np0005541913.localdomain podman[103066]: 2025-12-02 09:01:29.490338283 +0000 UTC m=+0.125394710 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:01:29 np0005541913.localdomain podman[103073]: 2025-12-02 09:01:29.544727176 +0000 UTC m=+0.175535830 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=)
Dec 02 09:01:29 np0005541913.localdomain podman[103073]: 2025-12-02 09:01:29.561057782 +0000 UTC m=+0.191866456 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:01:29 np0005541913.localdomain podman[103079]: 2025-12-02 09:01:29.650251475 +0000 UTC m=+0.277276217 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 02 09:01:29 np0005541913.localdomain podman[103067]: 2025-12-02 09:01:29.696763517 +0000 UTC m=+0.327298064 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:01:29 np0005541913.localdomain podman[103079]: 2025-12-02 09:01:29.710238847 +0000 UTC m=+0.337263519 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ceilometer_agent_ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 09:01:29 np0005541913.localdomain podman[103067]: 2025-12-02 09:01:29.745945831 +0000 UTC m=+0.376480358 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 09:01:29 np0005541913.localdomain podman[103066]: 2025-12-02 09:01:29.863663295 +0000 UTC m=+0.498719752 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=nova_migration_target)
Dec 02 09:01:29 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:01:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:01:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:01:32 np0005541913.localdomain systemd[1]: tmp-crun.7hPt0m.mount: Deactivated successfully.
Dec 02 09:01:32 np0005541913.localdomain podman[103183]: 2025-12-02 09:01:32.432692985 +0000 UTC m=+0.076493695 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 02 09:01:32 np0005541913.localdomain podman[103184]: 2025-12-02 09:01:32.453690905 +0000 UTC m=+0.089300796 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team)
Dec 02 09:01:32 np0005541913.localdomain podman[103184]: 2025-12-02 09:01:32.46396093 +0000 UTC m=+0.099570801 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller)
Dec 02 09:01:32 np0005541913.localdomain podman[103184]: unhealthy
Dec 02 09:01:32 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:01:32 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:01:32 np0005541913.localdomain podman[103183]: 2025-12-02 09:01:32.478446206 +0000 UTC m=+0.122246896 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 02 09:01:32 np0005541913.localdomain podman[103183]: unhealthy
Dec 02 09:01:32 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:01:32 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:01:33 np0005541913.localdomain systemd[1]: tmp-crun.dbgBrQ.mount: Deactivated successfully.
Dec 02 09:01:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:01:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:01:41 np0005541913.localdomain podman[103223]: 2025-12-02 09:01:41.43643378 +0000 UTC m=+0.066876467 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, tcib_managed=true, version=17.1.12, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc.)
Dec 02 09:01:41 np0005541913.localdomain podman[103223]: 2025-12-02 09:01:41.47088768 +0000 UTC m=+0.101330337 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 09:01:41 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:01:41 np0005541913.localdomain podman[103222]: 2025-12-02 09:01:41.490114914 +0000 UTC m=+0.121896468 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:01:41 np0005541913.localdomain podman[103222]: 2025-12-02 09:01:41.52816487 +0000 UTC m=+0.159946414 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team)
Dec 02 09:01:41 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:01:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:01:51 np0005541913.localdomain podman[103261]: 2025-12-02 09:01:51.421834913 +0000 UTC m=+0.062516870 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, distribution-scope=public, vcs-type=git, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, tcib_managed=true)
Dec 02 09:01:51 np0005541913.localdomain podman[103261]: 2025-12-02 09:01:51.638155041 +0000 UTC m=+0.278837028 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, version=17.1.12, config_id=tripleo_step1)
Dec 02 09:01:51 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:02:00 np0005541913.localdomain podman[103300]: 2025-12-02 09:02:00.475344516 +0000 UTC m=+0.097063394 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, managed_by=tripleo_ansible)
Dec 02 09:02:00 np0005541913.localdomain podman[103294]: 2025-12-02 09:02:00.452299541 +0000 UTC m=+0.081383735 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, release=1761123044, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 09:02:00 np0005541913.localdomain podman[103300]: 2025-12-02 09:02:00.529569764 +0000 UTC m=+0.151288632 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 02 09:02:00 np0005541913.localdomain podman[103294]: 2025-12-02 09:02:00.53765063 +0000 UTC m=+0.166734914 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:02:00 np0005541913.localdomain podman[103293]: 2025-12-02 09:02:00.510472395 +0000 UTC m=+0.139808246 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Dec 02 09:02:00 np0005541913.localdomain podman[103291]: 2025-12-02 09:02:00.608784551 +0000 UTC m=+0.242387006 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 
17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, container_name=logrotate_crond, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 09:02:00 np0005541913.localdomain podman[103291]: 2025-12-02 09:02:00.61399825 +0000 UTC m=+0.247600705 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, container_name=logrotate_crond, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:02:00 np0005541913.localdomain podman[103292]: 2025-12-02 09:02:00.65819642 +0000 UTC m=+0.288360383 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, 
io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc.)
Dec 02 09:02:00 np0005541913.localdomain podman[103293]: 2025-12-02 09:02:00.691328865 +0000 UTC m=+0.320664776 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 09:02:00 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 09:02:01 np0005541913.localdomain podman[103292]: 2025-12-02 09:02:01.044134589 +0000 UTC m=+0.674298552 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, 
vcs-type=git, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, vendor=Red Hat, Inc.)
Dec 02 09:02:01 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:02:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:02:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:02:03 np0005541913.localdomain systemd[1]: tmp-crun.KArVo7.mount: Deactivated successfully.
Dec 02 09:02:03 np0005541913.localdomain podman[103413]: 2025-12-02 09:02:03.441926735 +0000 UTC m=+0.085872345 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 02 09:02:03 np0005541913.localdomain podman[103413]: 2025-12-02 09:02:03.481924603 +0000 UTC m=+0.125870143 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:02:03 np0005541913.localdomain podman[103413]: unhealthy
Dec 02 09:02:03 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:02:03 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:02:03 np0005541913.localdomain podman[103414]: 2025-12-02 09:02:03.483319821 +0000 UTC m=+0.124537998 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 09:02:03 np0005541913.localdomain podman[103414]: 2025-12-02 09:02:03.563317627 +0000 UTC m=+0.204535784 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, container_name=ovn_controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true)
Dec 02 09:02:03 np0005541913.localdomain podman[103414]: unhealthy
Dec 02 09:02:03 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:02:03 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:02:06 np0005541913.localdomain sudo[103453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:02:06 np0005541913.localdomain sudo[103453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:02:06 np0005541913.localdomain sudo[103453]: pam_unix(sudo:session): session closed for user root
Dec 02 09:02:06 np0005541913.localdomain sudo[103468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:02:06 np0005541913.localdomain sudo[103468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:02:06 np0005541913.localdomain sudo[103468]: pam_unix(sudo:session): session closed for user root
Dec 02 09:02:07 np0005541913.localdomain sudo[103515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:02:07 np0005541913.localdomain sudo[103515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:02:07 np0005541913.localdomain sudo[103515]: pam_unix(sudo:session): session closed for user root
Dec 02 09:02:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:02:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:02:12 np0005541913.localdomain podman[103530]: 2025-12-02 09:02:12.45437167 +0000 UTC m=+0.089202355 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git)
Dec 02 09:02:12 np0005541913.localdomain systemd[1]: tmp-crun.cX3pwF.mount: Deactivated successfully.
Dec 02 09:02:12 np0005541913.localdomain podman[103531]: 2025-12-02 09:02:12.501080138 +0000 UTC m=+0.134526505 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 02 09:02:12 np0005541913.localdomain podman[103530]: 2025-12-02 09:02:12.518333839 +0000 UTC m=+0.153164514 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 02 09:02:12 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:02:12 np0005541913.localdomain podman[103531]: 2025-12-02 09:02:12.533748391 +0000 UTC m=+0.167194798 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z)
Dec 02 09:02:12 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:02:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:02:22 np0005541913.localdomain podman[103569]: 2025-12-02 09:02:22.441909173 +0000 UTC m=+0.084748545 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64)
Dec 02 09:02:22 np0005541913.localdomain podman[103569]: 2025-12-02 09:02:22.614473053 +0000 UTC m=+0.257312455 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=metrics_qdr, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., url=https://www.redhat.com)
Dec 02 09:02:22 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:02:31 np0005541913.localdomain podman[103599]: 2025-12-02 09:02:31.458140899 +0000 UTC m=+0.089331848 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12)
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: tmp-crun.jGRnWN.mount: Deactivated successfully.
Dec 02 09:02:31 np0005541913.localdomain podman[103600]: 2025-12-02 09:02:31.508468673 +0000 UTC m=+0.137247597 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:02:31 np0005541913.localdomain podman[103597]: 2025-12-02 09:02:31.548860152 +0000 UTC m=+0.185011134 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=logrotate_crond, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=)
Dec 02 09:02:31 np0005541913.localdomain podman[103601]: 2025-12-02 09:02:31.555298734 +0000 UTC m=+0.178649433 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 09:02:31 np0005541913.localdomain podman[103597]: 2025-12-02 09:02:31.579909821 +0000 UTC m=+0.216060813 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:02:31 np0005541913.localdomain podman[103600]: 2025-12-02 09:02:31.58996294 +0000 UTC m=+0.218741834 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:02:31 np0005541913.localdomain podman[103601]: 2025-12-02 09:02:31.608753273 +0000 UTC m=+0.232104012 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 09:02:31 np0005541913.localdomain podman[103599]: 2025-12-02 09:02:31.64050444 +0000 UTC m=+0.271695469 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git)
Dec 02 09:02:31 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 09:02:31 np0005541913.localdomain podman[103598]: 2025-12-02 09:02:31.710980343 +0000 UTC m=+0.342823709 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 09:02:32 np0005541913.localdomain podman[103598]: 2025-12-02 09:02:32.082104538 +0000 UTC m=+0.713947984 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 09:02:32 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:02:32 np0005541913.localdomain systemd[1]: tmp-crun.rzLv7A.mount: Deactivated successfully.
Dec 02 09:02:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:02:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:02:34 np0005541913.localdomain podman[103713]: 2025-12-02 09:02:34.435513451 +0000 UTC m=+0.078319443 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, version=17.1.12)
Dec 02 09:02:34 np0005541913.localdomain systemd[1]: tmp-crun.ikL5dH.mount: Deactivated successfully.
Dec 02 09:02:34 np0005541913.localdomain podman[103714]: 2025-12-02 09:02:34.458241738 +0000 UTC m=+0.096760726 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, distribution-scope=public, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 09:02:34 np0005541913.localdomain podman[103714]: 2025-12-02 09:02:34.475005626 +0000 UTC m=+0.113524614 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 02 09:02:34 np0005541913.localdomain podman[103714]: unhealthy
Dec 02 09:02:34 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:02:34 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:02:34 np0005541913.localdomain podman[103713]: 2025-12-02 09:02:34.521037286 +0000 UTC m=+0.163843308 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, release=1761123044, tcib_managed=true)
Dec 02 09:02:34 np0005541913.localdomain podman[103713]: unhealthy
Dec 02 09:02:34 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:02:34 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:02:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:02:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:02:43 np0005541913.localdomain podman[103754]: 2025-12-02 09:02:43.443588747 +0000 UTC m=+0.081734724 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Dec 02 09:02:43 np0005541913.localdomain podman[103754]: 2025-12-02 09:02:43.48300057 +0000 UTC m=+0.121146577 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, 
name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid)
Dec 02 09:02:43 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:02:43 np0005541913.localdomain podman[103753]: 2025-12-02 09:02:43.503376065 +0000 UTC m=+0.143078144 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd)
Dec 02 09:02:43 np0005541913.localdomain podman[103753]: 2025-12-02 09:02:43.536023816 +0000 UTC m=+0.175725915 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, container_name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:02:43 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:02:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:02:53 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 09:02:53 np0005541913.localdomain recover_tripleo_nova_virtqemud[103796]: 62312
Dec 02 09:02:53 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 09:02:53 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 09:02:53 np0005541913.localdomain podman[103792]: 2025-12-02 09:02:53.450117237 +0000 UTC m=+0.093549899 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 02 09:02:53 np0005541913.localdomain podman[103792]: 2025-12-02 09:02:53.676081955 +0000 UTC m=+0.319514597 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 09:02:53 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:03:02 np0005541913.localdomain podman[103823]: 2025-12-02 09:03:02.453319883 +0000 UTC m=+0.093501579 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, container_name=logrotate_crond)
Dec 02 09:03:02 np0005541913.localdomain podman[103823]: 2025-12-02 09:03:02.461852841 +0000 UTC m=+0.102034597 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.openshift.expose-services=, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:03:02 np0005541913.localdomain podman[103825]: 2025-12-02 09:03:02.502263981 +0000 UTC m=+0.132251385 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: tmp-crun.0F4AYP.mount: Deactivated successfully.
Dec 02 09:03:02 np0005541913.localdomain podman[103824]: 2025-12-02 09:03:02.56328552 +0000 UTC m=+0.198886714 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Dec 02 09:03:02 np0005541913.localdomain podman[103832]: 2025-12-02 09:03:02.579722389 +0000 UTC m=+0.205066659 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044)
Dec 02 09:03:02 np0005541913.localdomain podman[103825]: 2025-12-02 09:03:02.591336299 +0000 UTC m=+0.221323763 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z)
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 09:03:02 np0005541913.localdomain podman[103831]: 2025-12-02 09:03:02.644967283 +0000 UTC m=+0.271963057 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 09:03:02 np0005541913.localdomain podman[103831]: 2025-12-02 09:03:02.658870275 +0000 UTC m=+0.285866049 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Dec 02 09:03:02 np0005541913.localdomain podman[103832]: 2025-12-02 09:03:02.664902626 +0000 UTC m=+0.290246856 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z)
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 09:03:02 np0005541913.localdomain podman[103824]: 2025-12-02 09:03:02.93904887 +0000 UTC m=+0.574650124 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 09:03:02 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:03:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:03:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:03:05 np0005541913.localdomain podman[103945]: 2025-12-02 09:03:05.448069759 +0000 UTC m=+0.082233047 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, release=1761123044, version=17.1.12, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 09:03:05 np0005541913.localdomain podman[103945]: 2025-12-02 09:03:05.464000635 +0000 UTC m=+0.098163913 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 09:03:05 np0005541913.localdomain podman[103945]: unhealthy
Dec 02 09:03:05 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:03:05 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:03:05 np0005541913.localdomain systemd[1]: tmp-crun.XqWs76.mount: Deactivated successfully.
Dec 02 09:03:05 np0005541913.localdomain podman[103944]: 2025-12-02 09:03:05.553874436 +0000 UTC m=+0.190800078 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:03:05 np0005541913.localdomain podman[103944]: 2025-12-02 09:03:05.573905531 +0000 UTC m=+0.210831173 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 02 09:03:05 np0005541913.localdomain podman[103944]: unhealthy
Dec 02 09:03:05 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:03:05 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:03:07 np0005541913.localdomain sudo[103982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:03:07 np0005541913.localdomain sudo[103982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:03:07 np0005541913.localdomain sudo[103982]: pam_unix(sudo:session): session closed for user root
Dec 02 09:03:08 np0005541913.localdomain sudo[103997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:03:08 np0005541913.localdomain sudo[103997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:03:08 np0005541913.localdomain sudo[103997]: pam_unix(sudo:session): session closed for user root
Dec 02 09:03:09 np0005541913.localdomain sudo[104044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:03:09 np0005541913.localdomain sudo[104044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:03:09 np0005541913.localdomain sudo[104044]: pam_unix(sudo:session): session closed for user root
Dec 02 09:03:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:03:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:03:14 np0005541913.localdomain systemd[1]: tmp-crun.O86cdt.mount: Deactivated successfully.
Dec 02 09:03:14 np0005541913.localdomain podman[104060]: 2025-12-02 09:03:14.444733752 +0000 UTC m=+0.078385955 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-iscsid, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3)
Dec 02 09:03:14 np0005541913.localdomain podman[104060]: 2025-12-02 09:03:14.45666778 +0000 UTC m=+0.090319973 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 02 09:03:14 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:03:14 np0005541913.localdomain podman[104059]: 2025-12-02 09:03:14.513998941 +0000 UTC m=+0.148143777 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 09:03:14 np0005541913.localdomain podman[104059]: 2025-12-02 09:03:14.548511744 +0000 UTC m=+0.182656570 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, batch=17.1_20251118.1, container_name=collectd, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:03:14 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:03:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:03:24 np0005541913.localdomain systemd[1]: tmp-crun.rMn5dD.mount: Deactivated successfully.
Dec 02 09:03:24 np0005541913.localdomain podman[104099]: 2025-12-02 09:03:24.433128047 +0000 UTC m=+0.076335850 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 09:03:24 np0005541913.localdomain podman[104099]: 2025-12-02 09:03:24.607226048 +0000 UTC m=+0.250433921 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4)
Dec 02 09:03:24 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:03:33 np0005541913.localdomain podman[104131]: 2025-12-02 09:03:33.432057378 +0000 UTC m=+0.067013360 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc.)
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: tmp-crun.3NAghI.mount: Deactivated successfully.
Dec 02 09:03:33 np0005541913.localdomain podman[104142]: 2025-12-02 09:03:33.473975268 +0000 UTC m=+0.101346698 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, build-date=2025-11-19T00:12:45Z, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044)
Dec 02 09:03:33 np0005541913.localdomain podman[104131]: 2025-12-02 09:03:33.519024682 +0000 UTC m=+0.153980654 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:03:33 np0005541913.localdomain podman[104142]: 2025-12-02 09:03:33.570143308 +0000 UTC m=+0.197514728 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, 
distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git)
Dec 02 09:03:33 np0005541913.localdomain podman[104128]: 2025-12-02 09:03:33.47667339 +0000 UTC m=+0.117080428 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, container_name=nova_migration_target, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Deactivated successfully.
Dec 02 09:03:33 np0005541913.localdomain podman[104129]: 2025-12-02 09:03:33.610501776 +0000 UTC m=+0.245512370 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, release=1761123044, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 09:03:33 np0005541913.localdomain podman[104129]: 2025-12-02 09:03:33.635892714 +0000 UTC m=+0.270903338 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 09:03:33 np0005541913.localdomain podman[104127]: 2025-12-02 09:03:33.647530605 +0000 UTC m=+0.287650785 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, distribution-scope=public)
Dec 02 09:03:33 np0005541913.localdomain podman[104127]: 2025-12-02 09:03:33.682014037 +0000 UTC m=+0.322134247 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:03:33 np0005541913.localdomain podman[104128]: 2025-12-02 09:03:33.796854655 +0000 UTC m=+0.437261703 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com)
Dec 02 09:03:33 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:03:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:03:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:03:36 np0005541913.localdomain podman[104244]: 2025-12-02 09:03:36.456787487 +0000 UTC m=+0.095235405 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, tcib_managed=true, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=)
Dec 02 09:03:36 np0005541913.localdomain podman[104244]: 2025-12-02 09:03:36.500343391 +0000 UTC m=+0.138791339 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64)
Dec 02 09:03:36 np0005541913.localdomain podman[104244]: unhealthy
Dec 02 09:03:36 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:03:36 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:03:36 np0005541913.localdomain systemd[1]: tmp-crun.OajBs7.mount: Deactivated successfully.
Dec 02 09:03:36 np0005541913.localdomain podman[104243]: 2025-12-02 09:03:36.53323476 +0000 UTC m=+0.172100389 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Dec 02 09:03:36 np0005541913.localdomain podman[104243]: 2025-12-02 09:03:36.580004649 +0000 UTC m=+0.218870308 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4)
Dec 02 09:03:36 np0005541913.localdomain podman[104243]: unhealthy
Dec 02 09:03:36 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:03:36 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:03:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:03:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:03:45 np0005541913.localdomain systemd[1]: tmp-crun.2vPmdy.mount: Deactivated successfully.
Dec 02 09:03:45 np0005541913.localdomain podman[104284]: 2025-12-02 09:03:45.461837663 +0000 UTC m=+0.099491850 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 02 09:03:45 np0005541913.localdomain podman[104284]: 2025-12-02 09:03:45.474188592 +0000 UTC m=+0.111842789 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:03:45 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:03:45 np0005541913.localdomain podman[104285]: 2025-12-02 09:03:45.561809423 +0000 UTC m=+0.196215553 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64)
Dec 02 09:03:45 np0005541913.localdomain podman[104285]: 2025-12-02 09:03:45.571922404 +0000 UTC m=+0.206328534 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, version=17.1.12, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 09:03:45 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:03:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:03:55 np0005541913.localdomain podman[104323]: 2025-12-02 09:03:55.426703092 +0000 UTC m=+0.071738448 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, 
release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.)
Dec 02 09:03:55 np0005541913.localdomain podman[104323]: 2025-12-02 09:03:55.618226258 +0000 UTC m=+0.263261594 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, tcib_managed=true, release=1761123044, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 09:03:55 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:04:04 np0005541913.localdomain podman[104353]: 2025-12-02 09:04:04.459509599 +0000 UTC m=+0.092607995 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4)
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: tmp-crun.FUFD2O.mount: Deactivated successfully.
Dec 02 09:04:04 np0005541913.localdomain podman[104352]: 2025-12-02 09:04:04.511664253 +0000 UTC m=+0.146304160 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4)
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: tmp-crun.HO532D.mount: Deactivated successfully.
Dec 02 09:04:04 np0005541913.localdomain podman[104354]: 2025-12-02 09:04:04.566995221 +0000 UTC m=+0.197565619 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Dec 02 09:04:04 np0005541913.localdomain podman[104354]: 2025-12-02 09:04:04.598077961 +0000 UTC m=+0.228648369 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:04:04 np0005541913.localdomain podman[104355]: 2025-12-02 09:04:04.616244006 +0000 UTC m=+0.243030214 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible)
Dec 02 09:04:04 np0005541913.localdomain podman[104351]: 2025-12-02 09:04:04.663798887 +0000 UTC m=+0.300155600 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:04:04 np0005541913.localdomain podman[104351]: 2025-12-02 09:04:04.674654457 +0000 UTC m=+0.311011210 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:04:04 np0005541913.localdomain podman[104353]: 2025-12-02 09:04:04.691154178 +0000 UTC m=+0.324252554 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Deactivated successfully.
Dec 02 09:04:04 np0005541913.localdomain podman[104355]: 2025-12-02 09:04:04.725150206 +0000 UTC m=+0.351936404 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 02 09:04:04 np0005541913.localdomain podman[104355]: unhealthy
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed with result 'exit-code'.
Dec 02 09:04:04 np0005541913.localdomain podman[104352]: 2025-12-02 09:04:04.863193003 +0000 UTC m=+0.497832910 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, container_name=nova_migration_target, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4)
Dec 02 09:04:04 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:04:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:04:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:04:07 np0005541913.localdomain podman[104472]: 2025-12-02 09:04:07.440508549 +0000 UTC m=+0.078982571 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1)
Dec 02 09:04:07 np0005541913.localdomain podman[104472]: 2025-12-02 09:04:07.453938188 +0000 UTC m=+0.092412220 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044)
Dec 02 09:04:07 np0005541913.localdomain podman[104473]: 2025-12-02 09:04:07.487964426 +0000 UTC m=+0.123077209 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO 
Team, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 02 09:04:07 np0005541913.localdomain podman[104472]: unhealthy
Dec 02 09:04:07 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:04:07 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:04:07 np0005541913.localdomain podman[104473]: 2025-12-02 09:04:07.528039278 +0000 UTC m=+0.163152081 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 09:04:07 np0005541913.localdomain podman[104473]: unhealthy
Dec 02 09:04:07 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:04:07 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:04:09 np0005541913.localdomain sudo[104513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:04:09 np0005541913.localdomain sudo[104513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:04:09 np0005541913.localdomain sudo[104513]: pam_unix(sudo:session): session closed for user root
Dec 02 09:04:09 np0005541913.localdomain sudo[104528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:04:09 np0005541913.localdomain sudo[104528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:04:10 np0005541913.localdomain sudo[104528]: pam_unix(sudo:session): session closed for user root
Dec 02 09:04:10 np0005541913.localdomain sudo[104577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:04:10 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 09:04:10 np0005541913.localdomain sudo[104577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:04:10 np0005541913.localdomain sudo[104577]: pam_unix(sudo:session): session closed for user root
Dec 02 09:04:10 np0005541913.localdomain recover_tripleo_nova_virtqemud[104593]: 62312
Dec 02 09:04:10 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 09:04:10 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 09:04:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:04:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:04:16 np0005541913.localdomain systemd[1]: tmp-crun.NrzbcW.mount: Deactivated successfully.
Dec 02 09:04:16 np0005541913.localdomain podman[104594]: 2025-12-02 09:04:16.450827226 +0000 UTC m=+0.090141049 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, release=1761123044, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 02 09:04:16 np0005541913.localdomain podman[104594]: 2025-12-02 09:04:16.493263579 +0000 UTC m=+0.132577402 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, release=1761123044, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3)
Dec 02 09:04:16 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:04:16 np0005541913.localdomain podman[104595]: 2025-12-02 09:04:16.51651828 +0000 UTC m=+0.153460151 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 02 09:04:16 np0005541913.localdomain podman[104595]: 2025-12-02 09:04:16.52882176 +0000 UTC m=+0.165763651 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64)
Dec 02 09:04:16 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:04:17 np0005541913.localdomain systemd[1]: tmp-crun.FPVEki.mount: Deactivated successfully.
Dec 02 09:04:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:04:26 np0005541913.localdomain podman[104634]: 2025-12-02 09:04:26.438333689 +0000 UTC m=+0.075315134 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044)
Dec 02 09:04:26 np0005541913.localdomain podman[104634]: 2025-12-02 09:04:26.623345811 +0000 UTC m=+0.260326966 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Dec 02 09:04:26 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:04:35 np0005541913.localdomain podman[104664]: 2025-12-02 09:04:35.470992661 +0000 UTC m=+0.106491005 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, container_name=logrotate_crond, managed_by=tripleo_ansible)
Dec 02 09:04:35 np0005541913.localdomain podman[104665]: 2025-12-02 09:04:35.446716254 +0000 UTC m=+0.080414090 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, tcib_managed=true)
Dec 02 09:04:35 np0005541913.localdomain podman[104664]: 2025-12-02 09:04:35.509997094 +0000 UTC m=+0.145495438 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-cron-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:04:35 np0005541913.localdomain podman[104673]: 2025-12-02 09:04:35.514632478 +0000 UTC m=+0.139302263 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:04:35 np0005541913.localdomain podman[104666]: 2025-12-02 09:04:35.569579626 +0000 UTC m=+0.199318377 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step5, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 09:04:35 np0005541913.localdomain podman[104666]: 2025-12-02 09:04:35.590873044 +0000 UTC m=+0.220611805 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 09:04:35 np0005541913.localdomain podman[104666]: unhealthy
Dec 02 09:04:35 np0005541913.localdomain podman[104673]: 2025-12-02 09:04:35.598234261 +0000 UTC m=+0.222904046 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 09:04:35 np0005541913.localdomain podman[104673]: unhealthy
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed with result 'exit-code'.
Dec 02 09:04:35 np0005541913.localdomain podman[104667]: 2025-12-02 09:04:35.684511527 +0000 UTC m=+0.312631664 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:04:35 np0005541913.localdomain podman[104667]: 2025-12-02 09:04:35.713117381 +0000 UTC m=+0.341237518 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public)
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:04:35 np0005541913.localdomain podman[104665]: 2025-12-02 09:04:35.832817028 +0000 UTC m=+0.466514854 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:04:35 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:04:36 np0005541913.localdomain systemd[1]: tmp-crun.YHgNEr.mount: Deactivated successfully.
Dec 02 09:04:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:04:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:04:38 np0005541913.localdomain podman[104781]: 2025-12-02 09:04:38.459863071 +0000 UTC m=+0.092320718 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, konflux.additional-tags=17.1.12 
17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64)
Dec 02 09:04:38 np0005541913.localdomain systemd[1]: tmp-crun.Xh0Kdt.mount: Deactivated successfully.
Dec 02 09:04:38 np0005541913.localdomain podman[104780]: 2025-12-02 09:04:38.513855004 +0000 UTC m=+0.147560504 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, release=1761123044, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 02 09:04:38 np0005541913.localdomain podman[104780]: 2025-12-02 09:04:38.525340821 +0000 UTC m=+0.159046301 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:14:25Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public)
Dec 02 09:04:38 np0005541913.localdomain podman[104780]: unhealthy
Dec 02 09:04:38 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:04:38 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:04:38 np0005541913.localdomain podman[104781]: 2025-12-02 09:04:38.580831622 +0000 UTC m=+0.213289259 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, 
io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 09:04:38 np0005541913.localdomain podman[104781]: unhealthy
Dec 02 09:04:38 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:04:38 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:04:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:04:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:04:47 np0005541913.localdomain systemd[1]: tmp-crun.r1i7ZS.mount: Deactivated successfully.
Dec 02 09:04:47 np0005541913.localdomain podman[104821]: 2025-12-02 09:04:47.443652658 +0000 UTC m=+0.086677307 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Dec 02 09:04:47 np0005541913.localdomain podman[104820]: 2025-12-02 09:04:47.485520456 +0000 UTC m=+0.128604026 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:04:47 np0005541913.localdomain podman[104820]: 2025-12-02 09:04:47.500958369 +0000 UTC m=+0.144041959 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 02 09:04:47 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:04:47 np0005541913.localdomain podman[104821]: 2025-12-02 09:04:47.556811832 +0000 UTC m=+0.199836461 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 09:04:47 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:04:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:04:57 np0005541913.localdomain podman[104858]: 2025-12-02 09:04:57.441496125 +0000 UTC m=+0.084908409 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Dec 02 09:04:57 np0005541913.localdomain podman[104858]: 2025-12-02 09:04:57.668283824 +0000 UTC m=+0.311696108 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, 
architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 09:04:57 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:05:06 np0005541913.localdomain podman[104887]: 2025-12-02 09:05:06.452889701 +0000 UTC m=+0.093688044 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:32Z, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond)
Dec 02 09:05:06 np0005541913.localdomain podman[104887]: 2025-12-02 09:05:06.463105114 +0000 UTC m=+0.103903447 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1)
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:05:06 np0005541913.localdomain podman[104888]: 2025-12-02 09:05:06.496039054 +0000 UTC m=+0.136236480 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:05:06 np0005541913.localdomain podman[104889]: 2025-12-02 09:05:06.564010159 +0000 UTC m=+0.198346739 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 09:05:06 np0005541913.localdomain podman[104889]: 2025-12-02 09:05:06.607669676 +0000 UTC m=+0.242006286 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com)
Dec 02 09:05:06 np0005541913.localdomain podman[104889]: unhealthy
Dec 02 09:05:06 np0005541913.localdomain podman[104890]: 2025-12-02 09:05:06.617766026 +0000 UTC m=+0.248189962 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container)
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 09:05:06 np0005541913.localdomain podman[104902]: 2025-12-02 09:05:06.538807256 +0000 UTC m=+0.164889615 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, version=17.1.12)
Dec 02 09:05:06 np0005541913.localdomain podman[104890]: 2025-12-02 09:05:06.66695528 +0000 UTC m=+0.297379196 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:05:06 np0005541913.localdomain podman[104902]: 2025-12-02 09:05:06.719379511 +0000 UTC m=+0.345461880 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com)
Dec 02 09:05:06 np0005541913.localdomain podman[104902]: unhealthy
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed with result 'exit-code'.
Dec 02 09:05:06 np0005541913.localdomain podman[104888]: 2025-12-02 09:05:06.878603575 +0000 UTC m=+0.518800971 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, tcib_managed=true, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4)
Dec 02 09:05:06 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:05:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:05:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:05:09 np0005541913.localdomain systemd[1]: tmp-crun.gxb0F5.mount: Deactivated successfully.
Dec 02 09:05:09 np0005541913.localdomain podman[105003]: 2025-12-02 09:05:09.433923182 +0000 UTC m=+0.073151445 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:05:09 np0005541913.localdomain podman[105003]: 2025-12-02 09:05:09.452232871 +0000 UTC m=+0.091461214 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12)
Dec 02 09:05:09 np0005541913.localdomain podman[105003]: unhealthy
Dec 02 09:05:09 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:05:09 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:05:09 np0005541913.localdomain podman[105004]: 2025-12-02 09:05:09.549195871 +0000 UTC m=+0.182306711 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible)
Dec 02 09:05:09 np0005541913.localdomain podman[105004]: 2025-12-02 09:05:09.56823512 +0000 UTC m=+0.201345960 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, distribution-scope=public, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4)
Dec 02 09:05:09 np0005541913.localdomain podman[105004]: unhealthy
Dec 02 09:05:09 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:05:09 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:05:10 np0005541913.localdomain sudo[105043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:05:10 np0005541913.localdomain sudo[105043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:05:10 np0005541913.localdomain sudo[105043]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:10 np0005541913.localdomain sudo[105058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:05:10 np0005541913.localdomain sudo[105058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:05:11 np0005541913.localdomain sudo[105058]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:12 np0005541913.localdomain sudo[105104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:05:12 np0005541913.localdomain sudo[105104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:05:12 np0005541913.localdomain sudo[105104]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:05:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:05:18 np0005541913.localdomain podman[105119]: 2025-12-02 09:05:18.460467481 +0000 UTC m=+0.099176490 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-collectd, container_name=collectd, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 09:05:18 np0005541913.localdomain podman[105120]: 2025-12-02 09:05:18.510472697 +0000 UTC m=+0.147892192 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, container_name=iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 09:05:18 np0005541913.localdomain podman[105119]: 2025-12-02 09:05:18.522795486 +0000 UTC m=+0.161504455 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step3, version=17.1.12, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd)
Dec 02 09:05:18 np0005541913.localdomain podman[105120]: 2025-12-02 09:05:18.526087135 +0000 UTC m=+0.163506600 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO 
Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=iscsid, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 09:05:18 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:05:18 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:05:19 np0005541913.localdomain sshd[105158]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:05:19 np0005541913.localdomain sshd[105158]: Accepted publickey for zuul from 192.168.122.30 port 47180 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:05:19 np0005541913.localdomain systemd-logind[757]: New session 37 of user zuul.
Dec 02 09:05:19 np0005541913.localdomain systemd[1]: Started Session 37 of User zuul.
Dec 02 09:05:19 np0005541913.localdomain sshd[105158]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:05:20 np0005541913.localdomain sudo[105251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vunqpradzktisbwjsoepuwhhzziknxcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666319.6249244-27-17647740305765/AnsiballZ_stat.py
Dec 02 09:05:20 np0005541913.localdomain sudo[105251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:05:20 np0005541913.localdomain python3.9[105253]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:05:20 np0005541913.localdomain sudo[105251]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:20 np0005541913.localdomain sudo[105345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuhaxxdnilgqxskgapxclafvmtpvpwpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666320.6063535-63-83443535504868/AnsiballZ_command.py
Dec 02 09:05:20 np0005541913.localdomain sudo[105345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:05:21 np0005541913.localdomain python3.9[105347]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:05:21 np0005541913.localdomain sudo[105345]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:21 np0005541913.localdomain sudo[105438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igbpmrjdrwsahkwmcuucqpngfuajcghz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666321.4035242-87-227220360799573/AnsiballZ_stat.py
Dec 02 09:05:21 np0005541913.localdomain sudo[105438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:05:21 np0005541913.localdomain python3.9[105440]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:05:21 np0005541913.localdomain sudo[105438]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:22 np0005541913.localdomain sudo[105532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-motmmngrgftjfehcpkrzdgccjmtngtxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666322.0424767-111-108353724294668/AnsiballZ_command.py
Dec 02 09:05:22 np0005541913.localdomain sudo[105532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:05:22 np0005541913.localdomain python3.9[105534]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:05:22 np0005541913.localdomain sudo[105532]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:22 np0005541913.localdomain sudo[105625]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmalhtylmyrajcahpaegstpchxajvpro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666322.7352288-138-270684236375615/AnsiballZ_command.py
Dec 02 09:05:22 np0005541913.localdomain sudo[105625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:05:23 np0005541913.localdomain python3.9[105627]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:05:23 np0005541913.localdomain sudo[105625]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:23 np0005541913.localdomain python3.9[105718]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 02 09:05:25 np0005541913.localdomain python3.9[105808]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:05:26 np0005541913.localdomain python3.9[105900]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 02 09:05:27 np0005541913.localdomain python3.9[105990]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:05:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:05:28 np0005541913.localdomain python3.9[106038]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:05:28 np0005541913.localdomain podman[106039]: 2025-12-02 09:05:28.445865648 +0000 UTC m=+0.082695901 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Dec 02 09:05:28 np0005541913.localdomain podman[106039]: 2025-12-02 09:05:28.614869853 +0000 UTC m=+0.251700106 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, release=1761123044, architecture=x86_64)
Dec 02 09:05:28 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:05:28 np0005541913.localdomain sshd[105158]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:05:28 np0005541913.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Dec 02 09:05:28 np0005541913.localdomain systemd[1]: session-37.scope: Consumed 4.647s CPU time.
Dec 02 09:05:28 np0005541913.localdomain systemd-logind[757]: Session 37 logged out. Waiting for processes to exit.
Dec 02 09:05:28 np0005541913.localdomain systemd-logind[757]: Removed session 37.
Dec 02 09:05:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23240 DF PROTO=TCP SPT=51104 DPT=9102 SEQ=2771939589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477EF06D0000000001030307) 
Dec 02 09:05:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57881 DF PROTO=TCP SPT=48754 DPT=9105 SEQ=2036713384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477EF0EE0000000001030307) 
Dec 02 09:05:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23241 DF PROTO=TCP SPT=51104 DPT=9102 SEQ=2771939589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477EF4640000000001030307) 
Dec 02 09:05:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57882 DF PROTO=TCP SPT=48754 DPT=9105 SEQ=2036713384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477EF4E40000000001030307) 
Dec 02 09:05:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24597 DF PROTO=TCP SPT=59492 DPT=9882 SEQ=281102290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477EF8AC0000000001030307) 
Dec 02 09:05:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23242 DF PROTO=TCP SPT=51104 DPT=9102 SEQ=2771939589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477EFC650000000001030307) 
Dec 02 09:05:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24598 DF PROTO=TCP SPT=59492 DPT=9882 SEQ=281102290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477EFCA40000000001030307) 
Dec 02 09:05:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57883 DF PROTO=TCP SPT=48754 DPT=9105 SEQ=2036713384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477EFCE40000000001030307) 
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 09:05:37 np0005541913.localdomain recover_tripleo_nova_virtqemud[106112]: 62312
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 09:05:37 np0005541913.localdomain podman[106085]: 2025-12-02 09:05:37.462876031 +0000 UTC m=+0.095559704 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible)
Dec 02 09:05:37 np0005541913.localdomain podman[106085]: 2025-12-02 09:05:37.472661733 +0000 UTC m=+0.105345396 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, name=rhosp17/openstack-cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64)
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:05:37 np0005541913.localdomain podman[106088]: 2025-12-02 09:05:37.523959743 +0000 UTC m=+0.149365671 container health_status 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute)
Dec 02 09:05:37 np0005541913.localdomain podman[106088]: 2025-12-02 09:05:37.552828044 +0000 UTC m=+0.178233902 container exec_died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, release=1761123044)
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Deactivated successfully.
Dec 02 09:05:37 np0005541913.localdomain podman[106086]: 2025-12-02 09:05:37.559850462 +0000 UTC m=+0.189105303 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 09:05:37 np0005541913.localdomain podman[106089]: 2025-12-02 09:05:37.622220928 +0000 UTC m=+0.246548217 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:05:37 np0005541913.localdomain podman[106089]: 2025-12-02 09:05:37.666983004 +0000 UTC m=+0.291310263 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z)
Dec 02 09:05:37 np0005541913.localdomain podman[106089]: unhealthy
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed with result 'exit-code'.
Dec 02 09:05:37 np0005541913.localdomain podman[106087]: 2025-12-02 09:05:37.683875066 +0000 UTC m=+0.309150320 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Dec 02 09:05:37 np0005541913.localdomain podman[106087]: 2025-12-02 09:05:37.703974752 +0000 UTC m=+0.329249996 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step5, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute)
Dec 02 09:05:37 np0005541913.localdomain podman[106087]: unhealthy
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 09:05:37 np0005541913.localdomain podman[106086]: 2025-12-02 09:05:37.927775881 +0000 UTC m=+0.557030752 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vendor=Red Hat, Inc.)
Dec 02 09:05:37 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:05:39 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65145 DF PROTO=TCP SPT=60756 DPT=9100 SEQ=314869193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F045B0000000001030307) 
Dec 02 09:05:39 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24599 DF PROTO=TCP SPT=59492 DPT=9882 SEQ=281102290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F04A40000000001030307) 
Dec 02 09:05:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65146 DF PROTO=TCP SPT=60756 DPT=9100 SEQ=314869193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F08650000000001030307) 
Dec 02 09:05:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:05:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:05:40 np0005541913.localdomain podman[106206]: 2025-12-02 09:05:40.436730459 +0000 UTC m=+0.075428966 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1)
Dec 02 09:05:40 np0005541913.localdomain podman[106206]: 2025-12-02 09:05:40.448272297 +0000 UTC m=+0.086970784 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public)
Dec 02 09:05:40 np0005541913.localdomain podman[106206]: unhealthy
Dec 02 09:05:40 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:05:40 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:05:40 np0005541913.localdomain systemd[1]: tmp-crun.H0u1XW.mount: Deactivated successfully.
Dec 02 09:05:40 np0005541913.localdomain podman[106207]: 2025-12-02 09:05:40.507449088 +0000 UTC m=+0.142456907 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 02 09:05:40 np0005541913.localdomain podman[106207]: 2025-12-02 09:05:40.547317613 +0000 UTC m=+0.182325432 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 09:05:40 np0005541913.localdomain podman[106207]: unhealthy
Dec 02 09:05:40 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:05:40 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:05:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23243 DF PROTO=TCP SPT=51104 DPT=9102 SEQ=2771939589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F0C240000000001030307) 
Dec 02 09:05:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57884 DF PROTO=TCP SPT=48754 DPT=9105 SEQ=2036713384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F0CA40000000001030307) 
Dec 02 09:05:42 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65147 DF PROTO=TCP SPT=60756 DPT=9100 SEQ=314869193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F10640000000001030307) 
Dec 02 09:05:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24600 DF PROTO=TCP SPT=59492 DPT=9882 SEQ=281102290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F14640000000001030307) 
Dec 02 09:05:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65148 DF PROTO=TCP SPT=60756 DPT=9100 SEQ=314869193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F20250000000001030307) 
Dec 02 09:05:48 np0005541913.localdomain sshd[106244]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:05:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:05:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:05:48 np0005541913.localdomain sshd[106244]: Accepted publickey for zuul from 192.168.122.30 port 59700 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:05:48 np0005541913.localdomain systemd-logind[757]: New session 38 of user zuul.
Dec 02 09:05:48 np0005541913.localdomain systemd[1]: Started Session 38 of User zuul.
Dec 02 09:05:48 np0005541913.localdomain sshd[106244]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:05:48 np0005541913.localdomain podman[106247]: 2025-12-02 09:05:48.781113264 +0000 UTC m=+0.081519900 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 02 09:05:48 np0005541913.localdomain podman[106247]: 2025-12-02 09:05:48.795135648 +0000 UTC m=+0.095542334 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z)
Dec 02 09:05:48 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:05:48 np0005541913.localdomain podman[106246]: 2025-12-02 09:05:48.89778293 +0000 UTC m=+0.200690432 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, container_name=collectd, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 02 09:05:48 np0005541913.localdomain podman[106246]: 2025-12-02 09:05:48.908463795 +0000 UTC m=+0.211371267 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 09:05:48 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:05:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23244 DF PROTO=TCP SPT=51104 DPT=9102 SEQ=2771939589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F2BE40000000001030307) 
Dec 02 09:05:49 np0005541913.localdomain sudo[106376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjrbwucaconfonpmpacfzfdbhbtljtzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666348.8429322-24-178376453960816/AnsiballZ_systemd_service.py
Dec 02 09:05:49 np0005541913.localdomain sudo[106376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:05:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57885 DF PROTO=TCP SPT=48754 DPT=9105 SEQ=2036713384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F2DE50000000001030307) 
Dec 02 09:05:49 np0005541913.localdomain python3.9[106378]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:05:49 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:05:49 np0005541913.localdomain systemd-rc-local-generator[106399]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:05:49 np0005541913.localdomain systemd-sysv-generator[106402]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:05:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:05:50 np0005541913.localdomain sudo[106376]: pam_unix(sudo:session): session closed for user root
Dec 02 09:05:51 np0005541913.localdomain python3.9[106504]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:05:51 np0005541913.localdomain network[106521]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:05:51 np0005541913.localdomain network[106522]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:05:51 np0005541913.localdomain network[106523]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:05:51 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24601 DF PROTO=TCP SPT=59492 DPT=9882 SEQ=281102290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F33E50000000001030307) 
Dec 02 09:05:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1123 DF PROTO=TCP SPT=56514 DPT=9101 SEQ=1969001915 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F37DF0000000001030307) 
Dec 02 09:05:52 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:05:53 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1124 DF PROTO=TCP SPT=56514 DPT=9101 SEQ=1969001915 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F3BE50000000001030307) 
Dec 02 09:05:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1125 DF PROTO=TCP SPT=56514 DPT=9101 SEQ=1969001915 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F43E40000000001030307) 
Dec 02 09:05:57 np0005541913.localdomain python3.9[106720]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:05:57 np0005541913.localdomain network[106737]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:05:57 np0005541913.localdomain network[106738]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:05:57 np0005541913.localdomain network[106739]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:05:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:05:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:05:58 np0005541913.localdomain podman[106797]: 2025-12-02 09:05:58.757682532 +0000 UTC m=+0.090557020 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 09:05:58 np0005541913.localdomain podman[106797]: 2025-12-02 09:05:58.977885984 +0000 UTC m=+0.310760532 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 09:05:58 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:05:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1126 DF PROTO=TCP SPT=56514 DPT=9101 SEQ=1969001915 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F53A50000000001030307) 
Dec 02 09:06:01 np0005541913.localdomain sudo[106965]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luhenefxubkunakzwtwilsebzlpxvfkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666361.0935197-115-7837270149508/AnsiballZ_systemd_service.py
Dec 02 09:06:01 np0005541913.localdomain sudo[106965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:06:01 np0005541913.localdomain python3.9[106967]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:06:01 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:06:01 np0005541913.localdomain systemd-sysv-generator[106993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:06:01 np0005541913.localdomain systemd-rc-local-generator[106990]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:06:01 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:06:02 np0005541913.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Dec 02 09:06:02 np0005541913.localdomain systemd[1]: tmp-crun.q1jhBh.mount: Deactivated successfully.
Dec 02 09:06:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48720 DF PROTO=TCP SPT=35558 DPT=9102 SEQ=2174524384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F659D0000000001030307) 
Dec 02 09:06:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37809 DF PROTO=TCP SPT=41890 DPT=9105 SEQ=2583707932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F661E0000000001030307) 
Dec 02 09:06:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48722 DF PROTO=TCP SPT=35558 DPT=9102 SEQ=2174524384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F71A40000000001030307) 
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:06:07 np0005541913.localdomain podman[107023]: Error: container 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be is not running
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Main process exited, code=exited, status=125/n/a
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Failed with result 'exit-code'.
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: tmp-crun.tPbIQg.mount: Deactivated successfully.
Dec 02 09:06:07 np0005541913.localdomain podman[107022]: 2025-12-02 09:06:07.7448696 +0000 UTC m=+0.130724604 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, 
summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public)
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:06:07 np0005541913.localdomain podman[107022]: 2025-12-02 09:06:07.791925827 +0000 UTC m=+0.177780811 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:06:07 np0005541913.localdomain podman[107055]: 2025-12-02 09:06:07.857757235 +0000 UTC m=+0.080269604 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, container_name=ceilometer_agent_ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 02 09:06:07 np0005541913.localdomain podman[107054]: 2025-12-02 09:06:07.904284849 +0000 UTC m=+0.128175116 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044)
Dec 02 09:06:07 np0005541913.localdomain podman[107054]: 2025-12-02 09:06:07.951072739 +0000 UTC m=+0.174963026 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 09:06:07 np0005541913.localdomain podman[107054]: unhealthy
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 09:06:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:06:08 np0005541913.localdomain podman[107055]: 2025-12-02 09:06:08.008211085 +0000 UTC m=+0.230723454 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi)
Dec 02 09:06:08 np0005541913.localdomain podman[107055]: unhealthy
Dec 02 09:06:08 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:06:08 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed with result 'exit-code'.
Dec 02 09:06:08 np0005541913.localdomain podman[107104]: 2025-12-02 09:06:08.103764868 +0000 UTC m=+0.118669071 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, 
vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:06:08 np0005541913.localdomain podman[107104]: 2025-12-02 09:06:08.476026313 +0000 UTC m=+0.490930546 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, 
url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 09:06:08 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:06:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9314 DF PROTO=TCP SPT=57020 DPT=9100 SEQ=1014919568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F7DA40000000001030307) 
Dec 02 09:06:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:06:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:06:10 np0005541913.localdomain systemd[1]: tmp-crun.06cTbv.mount: Deactivated successfully.
Dec 02 09:06:10 np0005541913.localdomain podman[107128]: 2025-12-02 09:06:10.692357003 +0000 UTC m=+0.086526042 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, container_name=ovn_controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ovn-controller)
Dec 02 09:06:10 np0005541913.localdomain podman[107128]: 2025-12-02 09:06:10.736908704 +0000 UTC m=+0.131077733 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, 
build-date=2025-11-18T23:34:05Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public)
Dec 02 09:06:10 np0005541913.localdomain podman[107127]: 2025-12-02 09:06:10.739937524 +0000 UTC m=+0.133831836 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 02 09:06:10 np0005541913.localdomain podman[107127]: 2025-12-02 09:06:10.779026319 +0000 UTC m=+0.172920571 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, distribution-scope=public, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 02 09:06:10 np0005541913.localdomain podman[107127]: unhealthy
Dec 02 09:06:10 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:06:10 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:06:10 np0005541913.localdomain podman[107128]: unhealthy
Dec 02 09:06:10 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:06:10 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:06:12 np0005541913.localdomain sudo[107165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:06:12 np0005541913.localdomain sudo[107165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:06:12 np0005541913.localdomain sudo[107165]: pam_unix(sudo:session): session closed for user root
Dec 02 09:06:12 np0005541913.localdomain sudo[107180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:06:12 np0005541913.localdomain sudo[107180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:06:13 np0005541913.localdomain sudo[107180]: pam_unix(sudo:session): session closed for user root
Dec 02 09:06:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53815 DF PROTO=TCP SPT=50542 DPT=9882 SEQ=3057712063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F89A50000000001030307) 
Dec 02 09:06:14 np0005541913.localdomain sudo[107227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:06:14 np0005541913.localdomain sudo[107227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:06:14 np0005541913.localdomain sudo[107227]: pam_unix(sudo:session): session closed for user root
Dec 02 09:06:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9316 DF PROTO=TCP SPT=57020 DPT=9100 SEQ=1014919568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477F95650000000001030307) 
Dec 02 09:06:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:06:18 np0005541913.localdomain systemd[1]: tmp-crun.lDw2dd.mount: Deactivated successfully.
Dec 02 09:06:18 np0005541913.localdomain podman[107242]: 2025-12-02 09:06:18.953049512 +0000 UTC m=+0.090717444 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 09:06:18 np0005541913.localdomain podman[107242]: 2025-12-02 09:06:18.963451241 +0000 UTC m=+0.101119143 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 02 09:06:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:06:18 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:06:19 np0005541913.localdomain podman[107259]: 2025-12-02 09:06:19.047314711 +0000 UTC m=+0.071860851 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Dec 02 09:06:19 np0005541913.localdomain podman[107259]: 2025-12-02 09:06:19.057733419 +0000 UTC m=+0.082279549 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 02 09:06:19 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:06:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48724 DF PROTO=TCP SPT=35558 DPT=9102 SEQ=2174524384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FA1E40000000001030307) 
Dec 02 09:06:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56518 DF PROTO=TCP SPT=36398 DPT=9101 SEQ=1443848096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FAD100000000001030307) 
Dec 02 09:06:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56520 DF PROTO=TCP SPT=36398 DPT=9101 SEQ=1443848096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FB9240000000001030307) 
Dec 02 09:06:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:06:29 np0005541913.localdomain podman[107280]: 2025-12-02 09:06:29.182796349 +0000 UTC m=+0.076032523 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12)
Dec 02 09:06:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56521 DF PROTO=TCP SPT=36398 DPT=9101 SEQ=1443848096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FC8E40000000001030307) 
Dec 02 09:06:29 np0005541913.localdomain podman[107280]: 2025-12-02 09:06:29.347436196 +0000 UTC m=+0.240672330 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 02 09:06:29 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:06:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56498 DF PROTO=TCP SPT=53246 DPT=9102 SEQ=1239950182 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FDACE0000000001030307) 
Dec 02 09:06:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32615 DF PROTO=TCP SPT=59620 DPT=9105 SEQ=2395432243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FDB4E0000000001030307) 
Dec 02 09:06:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56500 DF PROTO=TCP SPT=53246 DPT=9102 SEQ=1239950182 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FE6E40000000001030307) 
Dec 02 09:06:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:06:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:06:37 np0005541913.localdomain systemd[1]: tmp-crun.qNMF5o.mount: Deactivated successfully.
Dec 02 09:06:37 np0005541913.localdomain podman[107309]: 2025-12-02 09:06:37.922282388 +0000 UTC m=+0.060057215 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, name=rhosp17/openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 02 09:06:37 np0005541913.localdomain podman[107309]: 2025-12-02 09:06:37.929532492 +0000 UTC m=+0.067307349 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=logrotate_crond, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, architecture=x86_64)
Dec 02 09:06:37 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:06:37 np0005541913.localdomain podman[107310]: Error: container 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be is not running
Dec 02 09:06:37 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Main process exited, code=exited, status=125/n/a
Dec 02 09:06:37 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Failed with result 'exit-code'.
Dec 02 09:06:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:06:38 np0005541913.localdomain podman[107340]: 2025-12-02 09:06:38.028573457 +0000 UTC m=+0.053721996 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, version=17.1.12)
Dec 02 09:06:38 np0005541913.localdomain podman[107340]: 2025-12-02 09:06:38.041995916 +0000 UTC m=+0.067144445 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, tcib_managed=true)
Dec 02 09:06:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:06:38 np0005541913.localdomain podman[107340]: unhealthy
Dec 02 09:06:38 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:06:38 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 09:06:38 np0005541913.localdomain podman[107362]: 2025-12-02 09:06:38.116817025 +0000 UTC m=+0.060884828 container health_status 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 02 09:06:38 np0005541913.localdomain podman[107362]: 2025-12-02 09:06:38.143869897 +0000 UTC m=+0.087937720 container exec_died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible)
Dec 02 09:06:38 np0005541913.localdomain podman[107362]: unhealthy
Dec 02 09:06:38 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:06:38 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed with result 'exit-code'.
Dec 02 09:06:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:06:39 np0005541913.localdomain podman[107387]: 2025-12-02 09:06:39.444876055 +0000 UTC m=+0.084415726 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_migration_target, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:06:39 np0005541913.localdomain podman[107387]: 2025-12-02 09:06:39.864869766 +0000 UTC m=+0.504409427 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true)
Dec 02 09:06:39 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:06:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44603 DF PROTO=TCP SPT=44326 DPT=9100 SEQ=3333704589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FF2E40000000001030307) 
Dec 02 09:06:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:06:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:06:41 np0005541913.localdomain podman[107411]: 2025-12-02 09:06:41.439455642 +0000 UTC m=+0.078317804 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 09:06:41 np0005541913.localdomain podman[107411]: 2025-12-02 09:06:41.450889017 +0000 UTC m=+0.089751169 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 09:06:41 np0005541913.localdomain podman[107411]: unhealthy
Dec 02 09:06:41 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:06:41 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:06:41 np0005541913.localdomain podman[107410]: 2025-12-02 09:06:41.482723277 +0000 UTC m=+0.124634330 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:06:41 np0005541913.localdomain podman[107410]: 2025-12-02 09:06:41.521102413 +0000 UTC m=+0.163013476 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 02 09:06:41 np0005541913.localdomain podman[107410]: unhealthy
Dec 02 09:06:41 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:06:41 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:06:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54581 DF PROTO=TCP SPT=53504 DPT=9882 SEQ=2933903461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A477FFEE40000000001030307) 
Dec 02 09:06:44 np0005541913.localdomain podman[107007]: time="2025-12-02T09:06:44Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: tmp-crun.8vAHCu.mount: Deactivated successfully.
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: libpod-4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.scope: Deactivated successfully.
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: libpod-4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.scope: Consumed 6.232s CPU time.
Dec 02 09:06:44 np0005541913.localdomain podman[107007]: 2025-12-02 09:06:44.140287626 +0000 UTC m=+42.091105041 container stop 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 09:06:44 np0005541913.localdomain podman[107007]: 2025-12-02 09:06:44.174057448 +0000 UTC m=+42.124874893 container died 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.timer: Deactivated successfully.
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Failed to open /run/systemd/transient/4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: No such file or directory
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: tmp-crun.Ynl3VB.mount: Deactivated successfully.
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be-userdata-shm.mount: Deactivated successfully.
Dec 02 09:06:44 np0005541913.localdomain podman[107007]: 2025-12-02 09:06:44.285423803 +0000 UTC m=+42.236241198 container cleanup 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:06:44 np0005541913.localdomain podman[107007]: ceilometer_agent_compute
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.timer: Failed to open /run/systemd/transient/4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.timer: No such file or directory
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Failed to open /run/systemd/transient/4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: No such file or directory
Dec 02 09:06:44 np0005541913.localdomain podman[107450]: 2025-12-02 09:06:44.301470921 +0000 UTC m=+0.145784365 container cleanup 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: libpod-conmon-4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.scope: Deactivated successfully.
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.timer: Failed to open /run/systemd/transient/4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.timer: No such file or directory
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: Failed to open /run/systemd/transient/4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be.service: No such file or directory
Dec 02 09:06:44 np0005541913.localdomain podman[107465]: 2025-12-02 09:06:44.386658668 +0000 UTC m=+0.050025948 container cleanup 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Dec 02 09:06:44 np0005541913.localdomain podman[107465]: ceilometer_agent_compute
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Dec 02 09:06:44 np0005541913.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.045s CPU time, no IO.
Dec 02 09:06:44 np0005541913.localdomain sudo[106965]: pam_unix(sudo:session): session closed for user root
Dec 02 09:06:44 np0005541913.localdomain sudo[107566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmoyzhqqnsvhjutkoyhtxhqeqvnexugq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666404.5459907-115-41467145416417/AnsiballZ_systemd_service.py
Dec 02 09:06:44 np0005541913.localdomain sudo[107566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:06:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b8a416db81901f96d6fd72f5969e70208d019cecbe75cef9d1ed7630b319da67-merged.mount: Deactivated successfully.
Dec 02 09:06:45 np0005541913.localdomain python3.9[107568]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:06:45 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:06:45 np0005541913.localdomain systemd-rc-local-generator[107592]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:06:45 np0005541913.localdomain systemd-sysv-generator[107597]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:06:45 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:06:45 np0005541913.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Dec 02 09:06:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44605 DF PROTO=TCP SPT=44326 DPT=9100 SEQ=3333704589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47800AA40000000001030307) 
Dec 02 09:06:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:06:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:06:49 np0005541913.localdomain podman[107622]: 2025-12-02 09:06:49.444045097 +0000 UTC m=+0.080250364 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-iscsid, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:06:49 np0005541913.localdomain podman[107622]: 2025-12-02 09:06:49.455950925 +0000 UTC m=+0.092156122 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
release=1761123044, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 02 09:06:49 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:06:49 np0005541913.localdomain podman[107621]: 2025-12-02 09:06:49.546648469 +0000 UTC m=+0.185426965 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-collectd)
Dec 02 09:06:49 np0005541913.localdomain podman[107621]: 2025-12-02 09:06:49.555182787 +0000 UTC m=+0.193961213 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 09:06:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32619 DF PROTO=TCP SPT=59620 DPT=9105 SEQ=2395432243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478017E40000000001030307) 
Dec 02 09:06:49 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:06:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30898 DF PROTO=TCP SPT=56690 DPT=9101 SEQ=4024337551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478022400000000001030307) 
Dec 02 09:06:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30900 DF PROTO=TCP SPT=56690 DPT=9101 SEQ=4024337551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47802E640000000001030307) 
Dec 02 09:06:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30901 DF PROTO=TCP SPT=56690 DPT=9101 SEQ=4024337551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47803E250000000001030307) 
Dec 02 09:06:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:06:59 np0005541913.localdomain podman[107659]: 2025-12-02 09:06:59.688333311 +0000 UTC m=+0.083196754 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 02 09:06:59 np0005541913.localdomain podman[107659]: 2025-12-02 09:06:59.884316896 +0000 UTC m=+0.279180319 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64)
Dec 02 09:06:59 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:07:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34102 DF PROTO=TCP SPT=47628 DPT=9102 SEQ=3160238718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47804FFE0000000001030307) 
Dec 02 09:07:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64345 DF PROTO=TCP SPT=59134 DPT=9105 SEQ=833223188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780507E0000000001030307) 
Dec 02 09:07:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34104 DF PROTO=TCP SPT=47628 DPT=9102 SEQ=3160238718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47805C240000000001030307) 
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:07:08 np0005541913.localdomain podman[107690]: Error: container 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe is not running
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Main process exited, code=exited, status=125/n/a
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed with result 'exit-code'.
Dec 02 09:07:08 np0005541913.localdomain podman[107688]: 2025-12-02 09:07:08.455260695 +0000 UTC m=+0.092030430 container health_status 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044)
Dec 02 09:07:08 np0005541913.localdomain podman[107688]: 2025-12-02 09:07:08.461003908 +0000 UTC m=+0.097773673 container exec_died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=logrotate_crond, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, version=17.1.12)
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Deactivated successfully.
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: tmp-crun.T7d8XL.mount: Deactivated successfully.
Dec 02 09:07:08 np0005541913.localdomain podman[107689]: 2025-12-02 09:07:08.506986546 +0000 UTC m=+0.142075857 container health_status 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, container_name=nova_compute, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 09:07:08 np0005541913.localdomain podman[107689]: 2025-12-02 09:07:08.552243355 +0000 UTC m=+0.187332696 container exec_died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:07:08 np0005541913.localdomain podman[107689]: unhealthy
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:07:08 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 09:07:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32717 DF PROTO=TCP SPT=37158 DPT=9100 SEQ=685219675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478067E40000000001030307) 
Dec 02 09:07:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:07:10 np0005541913.localdomain podman[107741]: 2025-12-02 09:07:10.447582001 +0000 UTC m=+0.083735048 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Dec 02 09:07:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:07:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 4776 writes, 21K keys, 4776 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4776 writes, 569 syncs, 8.39 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:07:10 np0005541913.localdomain podman[107741]: 2025-12-02 09:07:10.819229539 +0000 UTC m=+0.455382596 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 02 09:07:10 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:07:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:07:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:07:12 np0005541913.localdomain podman[107764]: 2025-12-02 09:07:12.435728184 +0000 UTC m=+0.077149551 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:07:12 np0005541913.localdomain podman[107764]: 2025-12-02 09:07:12.456012267 +0000 UTC m=+0.097433644 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:07:12 np0005541913.localdomain podman[107764]: unhealthy
Dec 02 09:07:12 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:07:12 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:07:12 np0005541913.localdomain podman[107765]: 2025-12-02 09:07:12.501977804 +0000 UTC m=+0.139797766 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true)
Dec 02 09:07:12 np0005541913.localdomain podman[107765]: 2025-12-02 09:07:12.515549287 +0000 UTC m=+0.153369279 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, architecture=x86_64, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Dec 02 09:07:12 np0005541913.localdomain podman[107765]: unhealthy
Dec 02 09:07:12 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:07:12 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:07:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9319 DF PROTO=TCP SPT=57020 DPT=9100 SEQ=1014919568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478073E40000000001030307) 
Dec 02 09:07:14 np0005541913.localdomain sudo[107802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:07:14 np0005541913.localdomain sudo[107802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:07:14 np0005541913.localdomain sudo[107802]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:14 np0005541913.localdomain sudo[107817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:07:14 np0005541913.localdomain sudo[107817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:07:14 np0005541913.localdomain sudo[107817]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:07:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.2 total, 600.0 interval
                                                          Cumulative writes: 5722 writes, 25K keys, 5722 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5722 writes, 780 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:07:15 np0005541913.localdomain sudo[107863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:07:15 np0005541913.localdomain sudo[107863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:07:15 np0005541913.localdomain sudo[107863]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32719 DF PROTO=TCP SPT=37158 DPT=9100 SEQ=685219675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47807FA40000000001030307) 
Dec 02 09:07:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34106 DF PROTO=TCP SPT=47628 DPT=9102 SEQ=3160238718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47808BE40000000001030307) 
Dec 02 09:07:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:07:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:07:19 np0005541913.localdomain podman[107879]: 2025-12-02 09:07:19.723499072 +0000 UTC m=+0.102103570 container health_status c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 02 09:07:19 np0005541913.localdomain podman[107879]: 2025-12-02 09:07:19.739087979 +0000 UTC m=+0.117692507 container exec_died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com)
Dec 02 09:07:19 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Deactivated successfully.
Dec 02 09:07:19 np0005541913.localdomain systemd[1]: tmp-crun.xQkXgI.mount: Deactivated successfully.
Dec 02 09:07:19 np0005541913.localdomain podman[107878]: 2025-12-02 09:07:19.830606913 +0000 UTC m=+0.211067119 container health_status 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, tcib_managed=true, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 09:07:19 np0005541913.localdomain podman[107878]: 2025-12-02 09:07:19.844028242 +0000 UTC m=+0.224464748 container exec_died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z)
Dec 02 09:07:19 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Deactivated successfully.
Dec 02 09:07:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22068 DF PROTO=TCP SPT=52596 DPT=9101 SEQ=2844187226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478097700000000001030307) 
Dec 02 09:07:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22070 DF PROTO=TCP SPT=52596 DPT=9101 SEQ=2844187226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780A3640000000001030307) 
Dec 02 09:07:27 np0005541913.localdomain podman[107608]: time="2025-12-02T09:07:27Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: libpod-7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.scope: Deactivated successfully.
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: libpod-7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.scope: Consumed 6.287s CPU time.
Dec 02 09:07:27 np0005541913.localdomain podman[107608]: 2025-12-02 09:07:27.762276971 +0000 UTC m=+42.095603588 container died 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.timer: Deactivated successfully.
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed to open /run/systemd/transient/7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: No such file or directory
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe-userdata-shm.mount: Deactivated successfully.
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-fddcd6dd4df186203ff55efce1dca7750680c9de7878dc7d77dfefe109af9b62-merged.mount: Deactivated successfully.
Dec 02 09:07:27 np0005541913.localdomain podman[107608]: 2025-12-02 09:07:27.816242173 +0000 UTC m=+42.149568770 container cleanup 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 02 09:07:27 np0005541913.localdomain podman[107608]: ceilometer_agent_ipmi
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.timer: Failed to open /run/systemd/transient/7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.timer: No such file or directory
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed to open /run/systemd/transient/7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: No such file or directory
Dec 02 09:07:27 np0005541913.localdomain podman[107917]: 2025-12-02 09:07:27.850567259 +0000 UTC m=+0.081597130 container cleanup 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, release=1761123044, 
container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: libpod-conmon-7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.scope: Deactivated successfully.
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.timer: Failed to open /run/systemd/transient/7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.timer: No such file or directory
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: Failed to open /run/systemd/transient/7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe.service: No such file or directory
Dec 02 09:07:27 np0005541913.localdomain podman[107933]: 2025-12-02 09:07:27.931410409 +0000 UTC m=+0.056313345 container cleanup 7c38d69431417999192726acf4fb66e48576980efee171d6750a496e1575f4fe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64)
Dec 02 09:07:27 np0005541913.localdomain podman[107933]: ceilometer_agent_ipmi
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Dec 02 09:07:27 np0005541913.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Dec 02 09:07:27 np0005541913.localdomain sudo[107566]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:28 np0005541913.localdomain sudo[108034]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjzocqiqccaxinyyezushtedcweeixei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666448.0843084-115-174245366101590/AnsiballZ_systemd_service.py
Dec 02 09:07:28 np0005541913.localdomain sudo[108034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:07:28 np0005541913.localdomain python3.9[108036]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:07:28 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:07:28 np0005541913.localdomain systemd-rc-local-generator[108065]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:07:28 np0005541913.localdomain systemd-sysv-generator[108069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:07:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:07:29 np0005541913.localdomain systemd[1]: Stopping collectd container...
Dec 02 09:07:29 np0005541913.localdomain systemd[1]: tmp-crun.55fuFM.mount: Deactivated successfully.
Dec 02 09:07:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22071 DF PROTO=TCP SPT=52596 DPT=9101 SEQ=2844187226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780B3250000000001030307) 
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:07:30 np0005541913.localdomain podman[108091]: 2025-12-02 09:07:30.43748989 +0000 UTC m=+0.082661739 container health_status 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, 
com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: libpod-237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.scope: Deactivated successfully.
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: libpod-237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.scope: Consumed 2.179s CPU time.
Dec 02 09:07:30 np0005541913.localdomain podman[108091]: 2025-12-02 09:07:30.656009668 +0000 UTC m=+0.301181447 container exec_died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, 
url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Deactivated successfully.
Dec 02 09:07:30 np0005541913.localdomain podman[108077]: 2025-12-02 09:07:30.708448079 +0000 UTC m=+1.646039745 container died 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc.)
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.timer: Deactivated successfully.
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Failed to open /run/systemd/transient/237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: No such file or directory
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6-userdata-shm.mount: Deactivated successfully.
Dec 02 09:07:30 np0005541913.localdomain podman[108077]: 2025-12-02 09:07:30.748478869 +0000 UTC m=+1.686070535 container cleanup 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 02 09:07:30 np0005541913.localdomain podman[108077]: collectd
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.timer: Failed to open /run/systemd/transient/237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.timer: No such file or directory
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Failed to open /run/systemd/transient/237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: No such file or directory
Dec 02 09:07:30 np0005541913.localdomain podman[108120]: 2025-12-02 09:07:30.769541222 +0000 UTC m=+0.104571985 container cleanup 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, url=https://www.redhat.com)
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: libpod-conmon-237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.scope: Deactivated successfully.
Dec 02 09:07:30 np0005541913.localdomain podman[108149]: error opening file `/run/crun/237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6/status`: No such file or directory
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.timer: Failed to open /run/systemd/transient/237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.timer: No such file or directory
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: Failed to open /run/systemd/transient/237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6.service: No such file or directory
Dec 02 09:07:30 np0005541913.localdomain podman[108136]: 2025-12-02 09:07:30.865360031 +0000 UTC m=+0.067929925 container cleanup 237065159bc32e1055f58656f49d027a128e2751373c9ed9dce9083e14cc3ac6 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Dec 02 09:07:30 np0005541913.localdomain podman[108136]: collectd
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Dec 02 09:07:30 np0005541913.localdomain systemd[1]: Stopped collectd container.
Dec 02 09:07:30 np0005541913.localdomain sudo[108034]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:31 np0005541913.localdomain sudo[108240]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuylqrzdtwsdwumoxadfwemaekxmhioq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666451.0154564-115-139722368809044/AnsiballZ_systemd_service.py
Dec 02 09:07:31 np0005541913.localdomain sudo[108240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:07:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-082042a751b48593af3e4b42b09156dbc115dd133d7891319f3ff1ad0b672b0b-merged.mount: Deactivated successfully.
Dec 02 09:07:31 np0005541913.localdomain python3.9[108242]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:07:31 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:07:31 np0005541913.localdomain systemd-rc-local-generator[108272]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:07:31 np0005541913.localdomain systemd-sysv-generator[108275]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:07:31 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:07:31 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 09:07:31 np0005541913.localdomain systemd[1]: Stopping iscsid container...
Dec 02 09:07:31 np0005541913.localdomain recover_tripleo_nova_virtqemud[108285]: 62312
Dec 02 09:07:31 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 09:07:31 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: libpod-c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.scope: Deactivated successfully.
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: libpod-c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.scope: Consumed 1.145s CPU time.
Dec 02 09:07:32 np0005541913.localdomain podman[108284]: 2025-12-02 09:07:32.063785828 +0000 UTC m=+0.079036032 container died c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, distribution-scope=public, release=1761123044, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=iscsid, io.buildah.version=1.41.4)
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.timer: Deactivated successfully.
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Failed to open /run/systemd/transient/c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: No such file or directory
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: tmp-crun.gn3GKU.mount: Deactivated successfully.
Dec 02 09:07:32 np0005541913.localdomain podman[108284]: 2025-12-02 09:07:32.1147724 +0000 UTC m=+0.130022594 container cleanup c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 02 09:07:32 np0005541913.localdomain podman[108284]: iscsid
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.timer: Failed to open /run/systemd/transient/c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.timer: No such file or directory
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Failed to open /run/systemd/transient/c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: No such file or directory
Dec 02 09:07:32 np0005541913.localdomain podman[108298]: 2025-12-02 09:07:32.193117133 +0000 UTC m=+0.118997699 container cleanup c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: libpod-conmon-c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.scope: Deactivated successfully.
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.timer: Failed to open /run/systemd/transient/c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.timer: No such file or directory
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: Failed to open /run/systemd/transient/c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52.service: No such file or directory
Dec 02 09:07:32 np0005541913.localdomain podman[108314]: 2025-12-02 09:07:32.286840878 +0000 UTC m=+0.061906605 container cleanup c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 02 09:07:32 np0005541913.localdomain podman[108314]: iscsid
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: Stopped iscsid container.
Dec 02 09:07:32 np0005541913.localdomain sudo[108240]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eee6dae47ff617871c47add2aa57f33c2f7e68905855055afb3a7b04648ecacd-merged.mount: Deactivated successfully.
Dec 02 09:07:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52-userdata-shm.mount: Deactivated successfully.
Dec 02 09:07:32 np0005541913.localdomain sudo[108416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jitfdysdanppmgwolfeuwespmwqaorky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666452.4255977-115-230245134722535/AnsiballZ_systemd_service.py
Dec 02 09:07:32 np0005541913.localdomain sudo[108416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:07:32 np0005541913.localdomain python3.9[108418]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:07:33 np0005541913.localdomain systemd-sysv-generator[108450]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:07:33 np0005541913.localdomain systemd-rc-local-generator[108445]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: Stopping logrotate_crond container...
Dec 02 09:07:33 np0005541913.localdomain crond[71000]: (CRON) INFO (Shutting down)
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: libpod-0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.scope: Deactivated successfully.
Dec 02 09:07:33 np0005541913.localdomain podman[108459]: 2025-12-02 09:07:33.463685698 +0000 UTC m=+0.077805240 container died 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.timer: Deactivated successfully.
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Failed to open /run/systemd/transient/0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: No such file or directory
Dec 02 09:07:33 np0005541913.localdomain podman[108459]: 2025-12-02 09:07:33.574992071 +0000 UTC m=+0.189111583 container cleanup 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4)
Dec 02 09:07:33 np0005541913.localdomain podman[108459]: logrotate_crond
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.timer: Failed to open /run/systemd/transient/0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.timer: No such file or directory
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Failed to open /run/systemd/transient/0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: No such file or directory
Dec 02 09:07:33 np0005541913.localdomain podman[108472]: 2025-12-02 09:07:33.598667073 +0000 UTC m=+0.131118395 container cleanup 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: libpod-conmon-0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.scope: Deactivated successfully.
Dec 02 09:07:33 np0005541913.localdomain podman[108503]: error opening file `/run/crun/0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b/status`: No such file or directory
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.timer: Failed to open /run/systemd/transient/0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.timer: No such file or directory
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: Failed to open /run/systemd/transient/0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b.service: No such file or directory
Dec 02 09:07:33 np0005541913.localdomain podman[108490]: 2025-12-02 09:07:33.695021137 +0000 UTC m=+0.068917971 container cleanup 0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, container_name=logrotate_crond, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z)
Dec 02 09:07:33 np0005541913.localdomain podman[108490]: logrotate_crond
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Dec 02 09:07:33 np0005541913.localdomain systemd[1]: Stopped logrotate_crond container.
Dec 02 09:07:33 np0005541913.localdomain sudo[108416]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29877 DF PROTO=TCP SPT=32840 DPT=9102 SEQ=798519461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780C52E0000000001030307) 
Dec 02 09:07:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24303 DF PROTO=TCP SPT=46326 DPT=9105 SEQ=4168036757 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780C5AF0000000001030307) 
Dec 02 09:07:34 np0005541913.localdomain sudo[108594]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wngeuvsasrqpvlqqrxscauifovnvmgmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666453.840272-115-245522883543755/AnsiballZ_systemd_service.py
Dec 02 09:07:34 np0005541913.localdomain sudo[108594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:07:34 np0005541913.localdomain python3.9[108596]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:07:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d5dc9262725001f2f73a799452ce705d444359a7e34fc5a93c05c8a39696c355-merged.mount: Deactivated successfully.
Dec 02 09:07:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b-userdata-shm.mount: Deactivated successfully.
Dec 02 09:07:34 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:07:34 np0005541913.localdomain systemd-rc-local-generator[108620]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:07:34 np0005541913.localdomain systemd-sysv-generator[108624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:07:34 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:07:34 np0005541913.localdomain systemd[1]: Stopping metrics_qdr container...
Dec 02 09:07:34 np0005541913.localdomain kernel: qdrouterd[54996]: segfault at 0 ip 00007fa2f77fe7cb sp 00007ffd821391b0 error 4 in libc.so.6[7fa2f779b000+175000]
Dec 02 09:07:34 np0005541913.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Dec 02 09:07:34 np0005541913.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Dec 02 09:07:34 np0005541913.localdomain systemd[1]: Started Process Core Dump (PID 108651/UID 0).
Dec 02 09:07:35 np0005541913.localdomain systemd-coredump[108652]: Resource limits disable core dumping for process 54996 (qdrouterd).
Dec 02 09:07:35 np0005541913.localdomain systemd-coredump[108652]: Process 54996 (qdrouterd) of user 42465 dumped core.
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: systemd-coredump@0-108651-0.service: Deactivated successfully.
Dec 02 09:07:35 np0005541913.localdomain podman[108637]: 2025-12-02 09:07:35.040846662 +0000 UTC m=+0.239595192 container died 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: libpod-71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.scope: Deactivated successfully.
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: libpod-71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.scope: Consumed 28.577s CPU time.
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.timer: Deactivated successfully.
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Failed to open /run/systemd/transient/71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: No such file or directory
Dec 02 09:07:35 np0005541913.localdomain podman[108637]: 2025-12-02 09:07:35.087778935 +0000 UTC m=+0.286527465 container cleanup 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 02 09:07:35 np0005541913.localdomain podman[108637]: metrics_qdr
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.timer: Failed to open /run/systemd/transient/71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.timer: No such file or directory
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Failed to open /run/systemd/transient/71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: No such file or directory
Dec 02 09:07:35 np0005541913.localdomain podman[108656]: 2025-12-02 09:07:35.12649905 +0000 UTC m=+0.071306575 container cleanup 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: libpod-conmon-71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.scope: Deactivated successfully.
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.timer: Failed to open /run/systemd/transient/71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.timer: No such file or directory
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: Failed to open /run/systemd/transient/71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5.service: No such file or directory
Dec 02 09:07:35 np0005541913.localdomain podman[108670]: 2025-12-02 09:07:35.234025253 +0000 UTC m=+0.073522706 container cleanup 71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '36af2f1ef63ece3c88eb676f44e9c36d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044)
Dec 02 09:07:35 np0005541913.localdomain podman[108670]: metrics_qdr
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: Stopped metrics_qdr container.
Dec 02 09:07:35 np0005541913.localdomain sudo[108594]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-083325a356d009687825873f5ef80d42d8ec3a9c9ef25c5a97dbce5b8f99fa32-merged.mount: Deactivated successfully.
Dec 02 09:07:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-71b0df469ac1667ac6e61c84f308440912da39a2a9f53c444b9100712adc10d5-userdata-shm.mount: Deactivated successfully.
Dec 02 09:07:35 np0005541913.localdomain sudo[108773]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gykuufdpxqxvnnkttqrmgfkyovsgmhkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666455.3917708-115-220787946270353/AnsiballZ_systemd_service.py
Dec 02 09:07:35 np0005541913.localdomain sudo[108773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:07:35 np0005541913.localdomain python3.9[108775]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:07:35 np0005541913.localdomain sudo[108773]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:36 np0005541913.localdomain sudo[108866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clntilkionvuhdhnjtjgkgytatgsbdnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666456.0962873-115-171954725463669/AnsiballZ_systemd_service.py
Dec 02 09:07:36 np0005541913.localdomain sudo[108866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:07:36 np0005541913.localdomain python3.9[108868]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:07:36 np0005541913.localdomain sudo[108866]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29879 DF PROTO=TCP SPT=32840 DPT=9102 SEQ=798519461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780D1250000000001030307) 
Dec 02 09:07:37 np0005541913.localdomain sudo[108959]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxooygkpxbbaidssxjxwaouxygpnphja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666456.7812974-115-18589040862570/AnsiballZ_systemd_service.py
Dec 02 09:07:37 np0005541913.localdomain sudo[108959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:07:37 np0005541913.localdomain python3.9[108961]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:07:37 np0005541913.localdomain sudo[108959]: pam_unix(sudo:session): session closed for user root
Dec 02 09:07:37 np0005541913.localdomain sudo[109052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nayqdvuinlsbbbmsfdmfckxajttsyetj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666457.4965107-115-226377347528246/AnsiballZ_systemd_service.py
Dec 02 09:07:37 np0005541913.localdomain sudo[109052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:07:38 np0005541913.localdomain python3.9[109054]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:07:38 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:07:38 np0005541913.localdomain systemd-rc-local-generator[109073]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:07:38 np0005541913.localdomain systemd-sysv-generator[109077]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:07:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:07:38 np0005541913.localdomain systemd[1]: Stopping nova_compute container...
Dec 02 09:07:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:07:39 np0005541913.localdomain podman[109106]: Error: container 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 is not running
Dec 02 09:07:39 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=125/n/a
Dec 02 09:07:39 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 09:07:39 np0005541913.localdomain sshd[109119]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:07:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30611 DF PROTO=TCP SPT=45150 DPT=9100 SEQ=1645113348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780DD240000000001030307) 
Dec 02 09:07:40 np0005541913.localdomain sshd[109119]: Invalid user user from 78.128.112.74 port 52762
Dec 02 09:07:40 np0005541913.localdomain sshd[109119]: Connection closed by invalid user user 78.128.112.74 port 52762 [preauth]
Dec 02 09:07:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:07:41 np0005541913.localdomain podman[109121]: 2025-12-02 09:07:41.44076167 +0000 UTC m=+0.073866564 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:07:41 np0005541913.localdomain podman[109121]: 2025-12-02 09:07:41.78883643 +0000 UTC m=+0.421941264 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 02 09:07:41 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:07:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47663 DF PROTO=TCP SPT=48864 DPT=9882 SEQ=1505906793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780E9240000000001030307) 
Dec 02 09:07:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:07:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:07:43 np0005541913.localdomain podman[109145]: 2025-12-02 09:07:43.435386678 +0000 UTC m=+0.071680646 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, release=1761123044)
Dec 02 09:07:43 np0005541913.localdomain podman[109144]: 2025-12-02 09:07:43.493000497 +0000 UTC m=+0.130253551 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:07:43 np0005541913.localdomain podman[109144]: 2025-12-02 09:07:43.50807933 +0000 UTC m=+0.145332354 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 02 09:07:43 np0005541913.localdomain podman[109144]: unhealthy
Dec 02 09:07:43 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:07:43 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:07:43 np0005541913.localdomain podman[109145]: 2025-12-02 09:07:43.525027083 +0000 UTC m=+0.161321021 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 02 09:07:43 np0005541913.localdomain podman[109145]: unhealthy
Dec 02 09:07:43 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:07:43 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:07:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30613 DF PROTO=TCP SPT=45150 DPT=9100 SEQ=1645113348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4780F4E40000000001030307) 
Dec 02 09:07:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24307 DF PROTO=TCP SPT=46326 DPT=9105 SEQ=4168036757 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478101E40000000001030307) 
Dec 02 09:07:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53321 DF PROTO=TCP SPT=49934 DPT=9101 SEQ=1691256137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47810CA00000000001030307) 
Dec 02 09:07:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53323 DF PROTO=TCP SPT=49934 DPT=9101 SEQ=1691256137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478118A40000000001030307) 
Dec 02 09:07:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53324 DF PROTO=TCP SPT=49934 DPT=9101 SEQ=1691256137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478128650000000001030307) 
Dec 02 09:08:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26138 DF PROTO=TCP SPT=38034 DPT=9102 SEQ=4151817394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47813A5E0000000001030307) 
Dec 02 09:08:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6218 DF PROTO=TCP SPT=41124 DPT=9105 SEQ=3870507885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47813ADE0000000001030307) 
Dec 02 09:08:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26140 DF PROTO=TCP SPT=38034 DPT=9102 SEQ=4151817394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478146640000000001030307) 
Dec 02 09:08:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:08:09 np0005541913.localdomain podman[109184]: Error: container 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 is not running
Dec 02 09:08:09 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Main process exited, code=exited, status=125/n/a
Dec 02 09:08:09 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed with result 'exit-code'.
Dec 02 09:08:09 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49490 DF PROTO=TCP SPT=41402 DPT=9882 SEQ=857368683 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478151E40000000001030307) 
Dec 02 09:08:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:08:11 np0005541913.localdomain podman[109196]: 2025-12-02 09:08:11.972423907 +0000 UTC m=+0.101310438 container health_status 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=)
Dec 02 09:08:12 np0005541913.localdomain podman[109196]: 2025-12-02 09:08:12.322032887 +0000 UTC m=+0.450919458 container exec_died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 02 09:08:12 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Deactivated successfully.
Dec 02 09:08:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32722 DF PROTO=TCP SPT=37158 DPT=9100 SEQ=685219675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47815DE50000000001030307) 
Dec 02 09:08:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:08:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:08:14 np0005541913.localdomain podman[109220]: 2025-12-02 09:08:14.195594941 +0000 UTC m=+0.083591335 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4)
Dec 02 09:08:14 np0005541913.localdomain podman[109219]: 2025-12-02 09:08:14.245652708 +0000 UTC m=+0.135637935 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true)
Dec 02 09:08:14 np0005541913.localdomain podman[109219]: 2025-12-02 09:08:14.259147878 +0000 UTC m=+0.149133115 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:08:14 np0005541913.localdomain podman[109219]: unhealthy
Dec 02 09:08:14 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:08:14 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:08:14 np0005541913.localdomain podman[109220]: 2025-12-02 09:08:14.312285198 +0000 UTC m=+0.200281592 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 02 09:08:14 np0005541913.localdomain podman[109220]: unhealthy
Dec 02 09:08:14 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:08:14 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:08:15 np0005541913.localdomain sudo[109260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:08:15 np0005541913.localdomain sudo[109260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:08:15 np0005541913.localdomain sudo[109260]: pam_unix(sudo:session): session closed for user root
Dec 02 09:08:15 np0005541913.localdomain sudo[109275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:08:15 np0005541913.localdomain sudo[109275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:08:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37133 DF PROTO=TCP SPT=39174 DPT=9100 SEQ=3855010009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47816A240000000001030307) 
Dec 02 09:08:16 np0005541913.localdomain sudo[109275]: pam_unix(sudo:session): session closed for user root
Dec 02 09:08:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26142 DF PROTO=TCP SPT=38034 DPT=9102 SEQ=4151817394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478175E50000000001030307) 
Dec 02 09:08:19 np0005541913.localdomain sudo[109323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:08:19 np0005541913.localdomain sudo[109323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:08:19 np0005541913.localdomain sudo[109323]: pam_unix(sudo:session): session closed for user root
Dec 02 09:08:20 np0005541913.localdomain podman[109094]: time="2025-12-02T09:08:20Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: libpod-1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.scope: Deactivated successfully.
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: libpod-1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.scope: Consumed 35.934s CPU time.
Dec 02 09:08:20 np0005541913.localdomain podman[109094]: 2025-12-02 09:08:20.509898312 +0000 UTC m=+42.083247998 container died 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true)
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.timer: Deactivated successfully.
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed to open /run/systemd/transient/1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: No such file or directory
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: tmp-crun.W5saBG.mount: Deactivated successfully.
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0399317fe788e77a051163f65a715baa05b56d1254753267f43144269e89c7fb-merged.mount: Deactivated successfully.
Dec 02 09:08:20 np0005541913.localdomain podman[109094]: 2025-12-02 09:08:20.57872597 +0000 UTC m=+42.152075646 container cleanup 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 02 09:08:20 np0005541913.localdomain podman[109094]: nova_compute
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.timer: Failed to open /run/systemd/transient/1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.timer: No such file or directory
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed to open /run/systemd/transient/1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: No such file or directory
Dec 02 09:08:20 np0005541913.localdomain podman[109339]: 2025-12-02 09:08:20.656354464 +0000 UTC m=+0.135647675 container cleanup 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: libpod-conmon-1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.scope: Deactivated successfully.
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.timer: Failed to open /run/systemd/transient/1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.timer: No such file or directory
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: Failed to open /run/systemd/transient/1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1.service: No such file or directory
Dec 02 09:08:20 np0005541913.localdomain podman[109354]: 2025-12-02 09:08:20.753372596 +0000 UTC m=+0.067928516 container cleanup 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute)
Dec 02 09:08:20 np0005541913.localdomain podman[109354]: nova_compute
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: Stopped nova_compute container.
Dec 02 09:08:20 np0005541913.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.043s CPU time, no IO.
Dec 02 09:08:20 np0005541913.localdomain sudo[109052]: pam_unix(sudo:session): session closed for user root
Dec 02 09:08:21 np0005541913.localdomain sudo[109456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yskxrgsvdfplaaunzhmqrcwxydajhpfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666500.9101517-115-159225946800334/AnsiballZ_systemd_service.py
Dec 02 09:08:21 np0005541913.localdomain sudo[109456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:08:21 np0005541913.localdomain python3.9[109458]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:08:21 np0005541913.localdomain systemd-sysv-generator[109488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:08:21 np0005541913.localdomain systemd-rc-local-generator[109484]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: Stopping nova_migration_target container...
Dec 02 09:08:21 np0005541913.localdomain sshd[71327]: Received signal 15; terminating.
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: libpod-17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.scope: Deactivated successfully.
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: libpod-17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.scope: Consumed 33.739s CPU time.
Dec 02 09:08:21 np0005541913.localdomain podman[109499]: 2025-12-02 09:08:21.9314821 +0000 UTC m=+0.057452466 container died 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.timer: Deactivated successfully.
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Failed to open /run/systemd/transient/17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: No such file or directory
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159-userdata-shm.mount: Deactivated successfully.
Dec 02 09:08:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-aed02a8eef27d7fad5076c16a3501516599cfd6963ae4f4d75e8f0b164242bc5-merged.mount: Deactivated successfully.
Dec 02 09:08:21 np0005541913.localdomain podman[109499]: 2025-12-02 09:08:21.979578485 +0000 UTC m=+0.105548801 container cleanup 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true)
Dec 02 09:08:21 np0005541913.localdomain podman[109499]: nova_migration_target
Dec 02 09:08:22 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.timer: Failed to open /run/systemd/transient/17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.timer: No such file or directory
Dec 02 09:08:22 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Failed to open /run/systemd/transient/17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: No such file or directory
Dec 02 09:08:22 np0005541913.localdomain podman[109513]: 2025-12-02 09:08:22.005085096 +0000 UTC m=+0.062213183 container cleanup 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4)
Dec 02 09:08:22 np0005541913.localdomain systemd[1]: libpod-conmon-17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.scope: Deactivated successfully.
Dec 02 09:08:22 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.timer: Failed to open /run/systemd/transient/17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.timer: No such file or directory
Dec 02 09:08:22 np0005541913.localdomain systemd[1]: 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: Failed to open /run/systemd/transient/17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159.service: No such file or directory
Dec 02 09:08:22 np0005541913.localdomain podman[109525]: 2025-12-02 09:08:22.117565692 +0000 UTC m=+0.070863875 container cleanup 17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true)
Dec 02 09:08:22 np0005541913.localdomain podman[109525]: nova_migration_target
Dec 02 09:08:22 np0005541913.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Dec 02 09:08:22 np0005541913.localdomain systemd[1]: Stopped nova_migration_target container.
Dec 02 09:08:22 np0005541913.localdomain sudo[109456]: pam_unix(sudo:session): session closed for user root
Dec 02 09:08:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58533 DF PROTO=TCP SPT=35196 DPT=9101 SEQ=1587794403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478181CF0000000001030307) 
Dec 02 09:08:24 np0005541913.localdomain sudo[109627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzevispgvagwlvyewmjcyhzygnjwmxmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666503.5167058-115-111349542559208/AnsiballZ_systemd_service.py
Dec 02 09:08:24 np0005541913.localdomain sudo[109627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:08:24 np0005541913.localdomain python3.9[109629]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:08:24 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:08:24 np0005541913.localdomain systemd-rc-local-generator[109653]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:08:24 np0005541913.localdomain systemd-sysv-generator[109657]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:08:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:08:24 np0005541913.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Dec 02 09:08:24 np0005541913.localdomain systemd[1]: libpod-6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56.scope: Deactivated successfully.
Dec 02 09:08:24 np0005541913.localdomain podman[109670]: 2025-12-02 09:08:24.794556238 +0000 UTC m=+0.066221990 container died 6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtlogd_wrapper, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']})
Dec 02 09:08:24 np0005541913.localdomain podman[109670]: 2025-12-02 09:08:24.83955501 +0000 UTC m=+0.111220762 container cleanup 6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:08:24 np0005541913.localdomain podman[109670]: nova_virtlogd_wrapper
Dec 02 09:08:24 np0005541913.localdomain podman[109685]: 2025-12-02 09:08:24.87810548 +0000 UTC m=+0.068142341 container cleanup 6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtlogd_wrapper, vcs-type=git, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 02 09:08:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58535 DF PROTO=TCP SPT=35196 DPT=9101 SEQ=1587794403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47818DE50000000001030307) 
Dec 02 09:08:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996-merged.mount: Deactivated successfully.
Dec 02 09:08:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56-userdata-shm.mount: Deactivated successfully.
Dec 02 09:08:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58536 DF PROTO=TCP SPT=35196 DPT=9101 SEQ=1587794403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47819DA50000000001030307) 
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Activating special unit Exit the Session...
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Removed slice User Background Tasks Slice.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Stopped target Main User Target.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Stopped target Basic System.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Stopped target Paths.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Stopped target Sockets.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Stopped target Timers.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Closed D-Bus User Message Bus Socket.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Stopped Create User's Volatile Files and Directories.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Removed slice User Application Slice.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Reached target Shutdown.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Finished Exit the Session.
Dec 02 09:08:30 np0005541913.localdomain systemd[84191]: Reached target Exit the Session.
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: user@0.service: Consumed 4.288s CPU time, no IO.
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 02 09:08:30 np0005541913.localdomain systemd[1]: user-0.slice: Consumed 5.223s CPU time.
Dec 02 09:08:33 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 09:08:33 np0005541913.localdomain recover_tripleo_nova_virtqemud[109703]: 62312
Dec 02 09:08:33 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 09:08:33 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 09:08:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40307 DF PROTO=TCP SPT=39264 DPT=9102 SEQ=1737105334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4781AF8D0000000001030307) 
Dec 02 09:08:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57399 DF PROTO=TCP SPT=51472 DPT=9105 SEQ=2724178224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4781B00E0000000001030307) 
Dec 02 09:08:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40309 DF PROTO=TCP SPT=39264 DPT=9102 SEQ=1737105334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4781BBA40000000001030307) 
Dec 02 09:08:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39696 DF PROTO=TCP SPT=54392 DPT=9100 SEQ=3246007098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4781C7A50000000001030307) 
Dec 02 09:08:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26967 DF PROTO=TCP SPT=54930 DPT=9882 SEQ=2162550144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4781D3A40000000001030307) 
Dec 02 09:08:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:08:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:08:44 np0005541913.localdomain podman[109704]: 2025-12-02 09:08:44.450344716 +0000 UTC m=+0.087252583 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 02 09:08:44 np0005541913.localdomain podman[109705]: 2025-12-02 09:08:44.493813726 +0000 UTC m=+0.130548439 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 02 09:08:44 np0005541913.localdomain podman[109705]: 2025-12-02 09:08:44.512445895 +0000 UTC m=+0.149180628 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, release=1761123044, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 02 09:08:44 np0005541913.localdomain podman[109705]: unhealthy
Dec 02 09:08:44 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:08:44 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:08:44 np0005541913.localdomain podman[109704]: 2025-12-02 09:08:44.568950264 +0000 UTC m=+0.205858201 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, tcib_managed=true)
Dec 02 09:08:44 np0005541913.localdomain podman[109704]: unhealthy
Dec 02 09:08:44 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:08:44 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:08:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39698 DF PROTO=TCP SPT=54392 DPT=9100 SEQ=3246007098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4781DF640000000001030307) 
Dec 02 09:08:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40311 DF PROTO=TCP SPT=39264 DPT=9102 SEQ=1737105334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4781EBE50000000001030307) 
Dec 02 09:08:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22197 DF PROTO=TCP SPT=37708 DPT=9101 SEQ=1092242824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4781F6FF0000000001030307) 
Dec 02 09:08:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22199 DF PROTO=TCP SPT=37708 DPT=9101 SEQ=1092242824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478203250000000001030307) 
Dec 02 09:08:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22200 DF PROTO=TCP SPT=37708 DPT=9101 SEQ=1092242824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478212E40000000001030307) 
Dec 02 09:09:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=361 DF PROTO=TCP SPT=60578 DPT=9102 SEQ=62058449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478224BE0000000001030307) 
Dec 02 09:09:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4819 DF PROTO=TCP SPT=51496 DPT=9105 SEQ=1706879213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782253E0000000001030307) 
Dec 02 09:09:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=363 DF PROTO=TCP SPT=60578 DPT=9102 SEQ=62058449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478230E40000000001030307) 
Dec 02 09:09:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53129 DF PROTO=TCP SPT=32860 DPT=9100 SEQ=562697888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47823CA40000000001030307) 
Dec 02 09:09:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4989 DF PROTO=TCP SPT=53174 DPT=9882 SEQ=3182283504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478248E40000000001030307) 
Dec 02 09:09:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:09:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:09:14 np0005541913.localdomain podman[109746]: 2025-12-02 09:09:14.679873978 +0000 UTC m=+0.071229844 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=ovn_metadata_agent)
Dec 02 09:09:14 np0005541913.localdomain podman[109746]: 2025-12-02 09:09:14.693750978 +0000 UTC m=+0.085106844 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:09:14 np0005541913.localdomain podman[109746]: unhealthy
Dec 02 09:09:14 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:09:14 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:09:14 np0005541913.localdomain systemd[1]: tmp-crun.QdacN5.mount: Deactivated successfully.
Dec 02 09:09:14 np0005541913.localdomain podman[109747]: 2025-12-02 09:09:14.734558268 +0000 UTC m=+0.123093659 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, container_name=ovn_controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team)
Dec 02 09:09:14 np0005541913.localdomain podman[109747]: 2025-12-02 09:09:14.747879725 +0000 UTC m=+0.136415096 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 09:09:14 np0005541913.localdomain podman[109747]: unhealthy
Dec 02 09:09:14 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:09:14 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:09:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53131 DF PROTO=TCP SPT=32860 DPT=9100 SEQ=562697888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478254650000000001030307) 
Dec 02 09:09:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4823 DF PROTO=TCP SPT=51496 DPT=9105 SEQ=1706879213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478261E40000000001030307) 
Dec 02 09:09:20 np0005541913.localdomain sudo[109786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:09:20 np0005541913.localdomain sudo[109786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:09:20 np0005541913.localdomain sudo[109786]: pam_unix(sudo:session): session closed for user root
Dec 02 09:09:20 np0005541913.localdomain sudo[109801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 09:09:20 np0005541913.localdomain sudo[109801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:09:20 np0005541913.localdomain sudo[109801]: pam_unix(sudo:session): session closed for user root
Dec 02 09:09:20 np0005541913.localdomain sudo[109836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:09:20 np0005541913.localdomain sudo[109836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:09:20 np0005541913.localdomain sudo[109836]: pam_unix(sudo:session): session closed for user root
Dec 02 09:09:20 np0005541913.localdomain sudo[109851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:09:20 np0005541913.localdomain sudo[109851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:09:21 np0005541913.localdomain sudo[109851]: pam_unix(sudo:session): session closed for user root
Dec 02 09:09:22 np0005541913.localdomain sudo[109898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:09:22 np0005541913.localdomain sudo[109898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:09:22 np0005541913.localdomain sudo[109898]: pam_unix(sudo:session): session closed for user root
Dec 02 09:09:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59482 DF PROTO=TCP SPT=45232 DPT=9101 SEQ=1911152021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47826C310000000001030307) 
Dec 02 09:09:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59484 DF PROTO=TCP SPT=45232 DPT=9101 SEQ=1911152021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478278250000000001030307) 
Dec 02 09:09:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59485 DF PROTO=TCP SPT=45232 DPT=9101 SEQ=1911152021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478287E40000000001030307) 
Dec 02 09:09:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8727 DF PROTO=TCP SPT=49944 DPT=9102 SEQ=934880506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478299ED0000000001030307) 
Dec 02 09:09:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6888 DF PROTO=TCP SPT=48004 DPT=9105 SEQ=811392733 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47829A6F0000000001030307) 
Dec 02 09:09:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8729 DF PROTO=TCP SPT=49944 DPT=9102 SEQ=934880506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782A5E40000000001030307) 
Dec 02 09:09:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26970 DF PROTO=TCP SPT=54930 DPT=9882 SEQ=2162550144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782B1E40000000001030307) 
Dec 02 09:09:42 np0005541913.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 02 09:09:42 np0005541913.localdomain recover_tripleo_nova_virtqemud[109914]: 62312
Dec 02 09:09:42 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 02 09:09:42 np0005541913.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 02 09:09:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39701 DF PROTO=TCP SPT=54392 DPT=9100 SEQ=3246007098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782BDE40000000001030307) 
Dec 02 09:09:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:09:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:09:44 np0005541913.localdomain podman[109916]: 2025-12-02 09:09:44.955809938 +0000 UTC m=+0.089959414 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller)
Dec 02 09:09:44 np0005541913.localdomain podman[109916]: 2025-12-02 09:09:44.973946392 +0000 UTC m=+0.108095868 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vcs-type=git, config_id=tripleo_step4, version=17.1.12)
Dec 02 09:09:44 np0005541913.localdomain podman[109916]: unhealthy
Dec 02 09:09:44 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:09:44 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:09:45 np0005541913.localdomain podman[109915]: 2025-12-02 09:09:45.061711508 +0000 UTC m=+0.197258132 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 02 09:09:45 np0005541913.localdomain podman[109915]: 2025-12-02 09:09:45.081092376 +0000 UTC m=+0.216638970 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-type=git)
Dec 02 09:09:45 np0005541913.localdomain podman[109915]: unhealthy
Dec 02 09:09:45 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:09:45 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:09:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11686 DF PROTO=TCP SPT=43486 DPT=9100 SEQ=2281320750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782C9A50000000001030307) 
Dec 02 09:09:49 np0005541913.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Dec 02 09:09:49 np0005541913.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61538 (conmon) with signal SIGKILL.
Dec 02 09:09:49 np0005541913.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Dec 02 09:09:49 np0005541913.localdomain systemd[1]: libpod-conmon-6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56.scope: Deactivated successfully.
Dec 02 09:09:49 np0005541913.localdomain podman[109967]: error opening file `/run/crun/6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56/status`: No such file or directory
Dec 02 09:09:49 np0005541913.localdomain podman[109955]: 2025-12-02 09:09:49.18294986 +0000 UTC m=+0.069988341 container cleanup 6e39cd661b9121bee9b5acf067e939fc9033a15f3a6537b3e8d8126e59e2dc56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, release=1761123044, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 02 09:09:49 np0005541913.localdomain podman[109955]: nova_virtlogd_wrapper
Dec 02 09:09:49 np0005541913.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Dec 02 09:09:49 np0005541913.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Dec 02 09:09:49 np0005541913.localdomain sudo[109627]: pam_unix(sudo:session): session closed for user root
Dec 02 09:09:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6892 DF PROTO=TCP SPT=48004 DPT=9105 SEQ=811392733 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782D5E40000000001030307) 
Dec 02 09:09:49 np0005541913.localdomain sudo[110058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnqmwtbgnockgxlndcxobrrtxzghjwdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666589.340283-115-40084362880404/AnsiballZ_systemd_service.py
Dec 02 09:09:49 np0005541913.localdomain sudo[110058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:09:49 np0005541913.localdomain python3.9[110060]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:09:50 np0005541913.localdomain systemd-rc-local-generator[110090]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:09:50 np0005541913.localdomain systemd-sysv-generator[110094]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: libpod-21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8.scope: Deactivated successfully.
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: libpod-21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8.scope: Consumed 1.512s CPU time.
Dec 02 09:09:50 np0005541913.localdomain podman[110102]: 2025-12-02 09:09:50.507445715 +0000 UTC m=+0.086460341 container died 21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, container_name=nova_virtnodedevd, url=https://www.redhat.com, config_id=tripleo_step3)
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8-userdata-shm.mount: Deactivated successfully.
Dec 02 09:09:50 np0005541913.localdomain podman[110102]: 2025-12-02 09:09:50.544552776 +0000 UTC m=+0.123567352 container cleanup 21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtnodedevd, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 02 09:09:50 np0005541913.localdomain podman[110102]: nova_virtnodedevd
Dec 02 09:09:50 np0005541913.localdomain podman[110116]: 2025-12-02 09:09:50.598425455 +0000 UTC m=+0.069573790 container cleanup 21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, tcib_managed=true, container_name=nova_virtnodedevd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z)
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: libpod-conmon-21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8.scope: Deactivated successfully.
Dec 02 09:09:50 np0005541913.localdomain podman[110145]: error opening file `/run/crun/21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8/status`: No such file or directory
Dec 02 09:09:50 np0005541913.localdomain podman[110132]: 2025-12-02 09:09:50.694740609 +0000 UTC m=+0.066563440 container cleanup 21ba66c04209f456290730a004123ba7623872bc65a5bce1c6488aa3b8e487e8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=nova_virtnodedevd, build-date=2025-11-19T00:35:22Z, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:09:50 np0005541913.localdomain podman[110132]: nova_virtnodedevd
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Dec 02 09:09:50 np0005541913.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Dec 02 09:09:50 np0005541913.localdomain sudo[110058]: pam_unix(sudo:session): session closed for user root
Dec 02 09:09:51 np0005541913.localdomain sudo[110236]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krzvtkvbikpupgsesslpbytpvekevvit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666590.853612-115-79849088130152/AnsiballZ_systemd_service.py
Dec 02 09:09:51 np0005541913.localdomain sudo[110236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:09:51 np0005541913.localdomain python3.9[110238]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:09:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261-merged.mount: Deactivated successfully.
Dec 02 09:09:51 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:09:51 np0005541913.localdomain systemd-rc-local-generator[110263]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:09:51 np0005541913.localdomain systemd-sysv-generator[110271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:09:51 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:09:51 np0005541913.localdomain systemd[1]: Stopping nova_virtproxyd container...
Dec 02 09:09:51 np0005541913.localdomain systemd[1]: libpod-16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3.scope: Deactivated successfully.
Dec 02 09:09:51 np0005541913.localdomain podman[110279]: 2025-12-02 09:09:51.924698808 +0000 UTC m=+0.082719682 container died 16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, container_name=nova_virtproxyd, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, tcib_managed=true)
Dec 02 09:09:51 np0005541913.localdomain podman[110279]: 2025-12-02 09:09:51.966823893 +0000 UTC m=+0.124844717 container cleanup 16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.expose-services=, container_name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 02 09:09:51 np0005541913.localdomain podman[110279]: nova_virtproxyd
Dec 02 09:09:52 np0005541913.localdomain podman[110293]: 2025-12-02 09:09:52.00940361 +0000 UTC m=+0.069713963 container cleanup 16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 02 09:09:52 np0005541913.localdomain systemd[1]: libpod-conmon-16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3.scope: Deactivated successfully.
Dec 02 09:09:52 np0005541913.localdomain podman[110320]: error opening file `/run/crun/16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3/status`: No such file or directory
Dec 02 09:09:52 np0005541913.localdomain podman[110309]: 2025-12-02 09:09:52.116558643 +0000 UTC m=+0.076889485 container cleanup 16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12)
Dec 02 09:09:52 np0005541913.localdomain podman[110309]: nova_virtproxyd
Dec 02 09:09:52 np0005541913.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Dec 02 09:09:52 np0005541913.localdomain systemd[1]: Stopped nova_virtproxyd container.
Dec 02 09:09:52 np0005541913.localdomain sudo[110236]: pam_unix(sudo:session): session closed for user root
Dec 02 09:09:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52767 DF PROTO=TCP SPT=58890 DPT=9101 SEQ=809638464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782E1600000000001030307) 
Dec 02 09:09:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699-merged.mount: Deactivated successfully.
Dec 02 09:09:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16d84a4ce4ba5ecc4f3c44b1a277cf4f1758563f4ced80b2826a6e0f692865d3-userdata-shm.mount: Deactivated successfully.
Dec 02 09:09:52 np0005541913.localdomain sudo[110411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voqdfrjvcrmkjorxpmwivlhhtvsdpriw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666592.2770195-115-192730818333326/AnsiballZ_systemd_service.py
Dec 02 09:09:52 np0005541913.localdomain sudo[110411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:09:52 np0005541913.localdomain python3.9[110413]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:09:52 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:09:52 np0005541913.localdomain systemd-sysv-generator[110443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:09:52 np0005541913.localdomain systemd-rc-local-generator[110440]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:09:53 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:09:53 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Dec 02 09:09:53 np0005541913.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Dec 02 09:09:53 np0005541913.localdomain systemd[1]: Stopping nova_virtqemud container...
Dec 02 09:09:53 np0005541913.localdomain systemd[1]: libpod-df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca.scope: Deactivated successfully.
Dec 02 09:09:53 np0005541913.localdomain systemd[1]: libpod-df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca.scope: Consumed 2.651s CPU time.
Dec 02 09:09:53 np0005541913.localdomain podman[110454]: 2025-12-02 09:09:53.253653251 +0000 UTC m=+0.078713284 container died df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 02 09:09:53 np0005541913.localdomain podman[110454]: 2025-12-02 09:09:53.284501035 +0000 UTC m=+0.109561058 container cleanup df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_virtqemud, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true)
Dec 02 09:09:53 np0005541913.localdomain podman[110454]: nova_virtqemud
Dec 02 09:09:53 np0005541913.localdomain podman[110467]: 2025-12-02 09:09:53.343954683 +0000 UTC m=+0.073858723 container cleanup df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, release=1761123044, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 02 09:09:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f-merged.mount: Deactivated successfully.
Dec 02 09:09:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca-userdata-shm.mount: Deactivated successfully.
Dec 02 09:09:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52769 DF PROTO=TCP SPT=58890 DPT=9101 SEQ=809638464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782ED640000000001030307) 
Dec 02 09:09:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52770 DF PROTO=TCP SPT=58890 DPT=9101 SEQ=809638464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4782FD240000000001030307) 
Dec 02 09:10:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24987 DF PROTO=TCP SPT=40144 DPT=9102 SEQ=2502057526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47830F1D0000000001030307) 
Dec 02 09:10:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64488 DF PROTO=TCP SPT=48226 DPT=9105 SEQ=2956584921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47830F9F0000000001030307) 
Dec 02 09:10:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24989 DF PROTO=TCP SPT=40144 DPT=9102 SEQ=2502057526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47831B240000000001030307) 
Dec 02 09:10:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37706 DF PROTO=TCP SPT=59590 DPT=9100 SEQ=2039090101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478327240000000001030307) 
Dec 02 09:10:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56110 DF PROTO=TCP SPT=46236 DPT=9882 SEQ=3997679653 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478333240000000001030307) 
Dec 02 09:10:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:10:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:10:15 np0005541913.localdomain systemd[1]: tmp-crun.ZaIpZY.mount: Deactivated successfully.
Dec 02 09:10:15 np0005541913.localdomain podman[110484]: 2025-12-02 09:10:15.454862999 +0000 UTC m=+0.089568844 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1)
Dec 02 09:10:15 np0005541913.localdomain podman[110484]: 2025-12-02 09:10:15.496892931 +0000 UTC m=+0.131598746 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4)
Dec 02 09:10:15 np0005541913.localdomain podman[110484]: unhealthy
Dec 02 09:10:15 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:10:15 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:10:15 np0005541913.localdomain podman[110483]: 2025-12-02 09:10:15.499779059 +0000 UTC m=+0.136301352 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com)
Dec 02 09:10:15 np0005541913.localdomain podman[110483]: 2025-12-02 09:10:15.584523822 +0000 UTC m=+0.221046185 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, release=1761123044, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Dec 02 09:10:15 np0005541913.localdomain podman[110483]: unhealthy
Dec 02 09:10:15 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:10:15 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:10:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37708 DF PROTO=TCP SPT=59590 DPT=9100 SEQ=2039090101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47833EE40000000001030307) 
Dec 02 09:10:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24991 DF PROTO=TCP SPT=40144 DPT=9102 SEQ=2502057526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47834BE40000000001030307) 
Dec 02 09:10:22 np0005541913.localdomain sudo[110522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:10:22 np0005541913.localdomain sudo[110522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:10:22 np0005541913.localdomain sudo[110522]: pam_unix(sudo:session): session closed for user root
Dec 02 09:10:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28309 DF PROTO=TCP SPT=38506 DPT=9101 SEQ=3649466983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478356900000000001030307) 
Dec 02 09:10:22 np0005541913.localdomain sudo[110537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:10:22 np0005541913.localdomain sudo[110537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:10:22 np0005541913.localdomain sudo[110537]: pam_unix(sudo:session): session closed for user root
Dec 02 09:10:23 np0005541913.localdomain sudo[110584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:10:23 np0005541913.localdomain sudo[110584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:10:23 np0005541913.localdomain sudo[110584]: pam_unix(sudo:session): session closed for user root
Dec 02 09:10:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28311 DF PROTO=TCP SPT=38506 DPT=9101 SEQ=3649466983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478362A40000000001030307) 
Dec 02 09:10:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28312 DF PROTO=TCP SPT=38506 DPT=9101 SEQ=3649466983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478372640000000001030307) 
Dec 02 09:10:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63656 DF PROTO=TCP SPT=55352 DPT=9102 SEQ=3766022190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783844E0000000001030307) 
Dec 02 09:10:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51743 DF PROTO=TCP SPT=36780 DPT=9105 SEQ=2074142622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478384CF0000000001030307) 
Dec 02 09:10:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63658 DF PROTO=TCP SPT=55352 DPT=9102 SEQ=3766022190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478390640000000001030307) 
Dec 02 09:10:39 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50437 DF PROTO=TCP SPT=35628 DPT=9882 SEQ=1603742372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47839BE50000000001030307) 
Dec 02 09:10:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11689 DF PROTO=TCP SPT=43486 DPT=9100 SEQ=2281320750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783A7E40000000001030307) 
Dec 02 09:10:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:10:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:10:45 np0005541913.localdomain podman[110599]: 2025-12-02 09:10:45.670729845 +0000 UTC m=+0.063001333 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 02 09:10:45 np0005541913.localdomain podman[110599]: 2025-12-02 09:10:45.686033855 +0000 UTC m=+0.078305393 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, 
name=rhosp17/openstack-ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller)
Dec 02 09:10:45 np0005541913.localdomain podman[110599]: unhealthy
Dec 02 09:10:45 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:10:45 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:10:45 np0005541913.localdomain systemd[1]: tmp-crun.3yNkRa.mount: Deactivated successfully.
Dec 02 09:10:45 np0005541913.localdomain podman[110600]: 2025-12-02 09:10:45.739565085 +0000 UTC m=+0.122247607 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 09:10:45 np0005541913.localdomain podman[110600]: 2025-12-02 09:10:45.753752464 +0000 UTC m=+0.136434986 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:10:45 np0005541913.localdomain podman[110600]: unhealthy
Dec 02 09:10:45 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:10:45 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:10:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19990 DF PROTO=TCP SPT=36268 DPT=9100 SEQ=3508387011 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783B3E40000000001030307) 
Dec 02 09:10:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63660 DF PROTO=TCP SPT=55352 DPT=9102 SEQ=3766022190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783BFE50000000001030307) 
Dec 02 09:10:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18150 DF PROTO=TCP SPT=45882 DPT=9101 SEQ=699891340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783CBC00000000001030307) 
Dec 02 09:10:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18152 DF PROTO=TCP SPT=45882 DPT=9101 SEQ=699891340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783D7E50000000001030307) 
Dec 02 09:10:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18153 DF PROTO=TCP SPT=45882 DPT=9101 SEQ=699891340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783E7A50000000001030307) 
Dec 02 09:11:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55004 DF PROTO=TCP SPT=41188 DPT=9102 SEQ=4215512189 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783F97E0000000001030307) 
Dec 02 09:11:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17290 DF PROTO=TCP SPT=32972 DPT=9105 SEQ=3338811856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4783F9FF0000000001030307) 
Dec 02 09:11:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55006 DF PROTO=TCP SPT=41188 DPT=9102 SEQ=4215512189 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478405A40000000001030307) 
Dec 02 09:11:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64817 DF PROTO=TCP SPT=57784 DPT=9100 SEQ=2509961656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478411640000000001030307) 
Dec 02 09:11:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2012 DF PROTO=TCP SPT=35046 DPT=9882 SEQ=851284200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47841DA40000000001030307) 
Dec 02 09:11:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:11:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:11:15 np0005541913.localdomain podman[110640]: 2025-12-02 09:11:15.90849522 +0000 UTC m=+0.053068513 container health_status e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, 
com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64)
Dec 02 09:11:15 np0005541913.localdomain podman[110639]: 2025-12-02 09:11:15.95841554 +0000 UTC m=+0.105022427 container health_status 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:11:15 np0005541913.localdomain podman[110639]: 2025-12-02 09:11:15.968360352 +0000 UTC m=+0.114967229 container exec_died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ovn_metadata_agent, tcib_managed=true, build-date=2025-11-19T00:14:25Z, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:11:15 np0005541913.localdomain podman[110639]: unhealthy
Dec 02 09:11:15 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:11:15 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed with result 'exit-code'.
Dec 02 09:11:15 np0005541913.localdomain podman[110640]: 2025-12-02 09:11:15.989290946 +0000 UTC m=+0.133864269 container exec_died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Dec 02 09:11:15 np0005541913.localdomain podman[110640]: unhealthy
Dec 02 09:11:16 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:11:16 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed with result 'exit-code'.
Dec 02 09:11:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64819 DF PROTO=TCP SPT=57784 DPT=9100 SEQ=2509961656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478429250000000001030307) 
Dec 02 09:11:17 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing.
Dec 02 09:11:17 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud.service: Killing process 62308 (conmon) with signal SIGKILL.
Dec 02 09:11:17 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL
Dec 02 09:11:17 np0005541913.localdomain systemd[1]: libpod-conmon-df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca.scope: Deactivated successfully.
Dec 02 09:11:17 np0005541913.localdomain podman[110692]: error opening file `/run/crun/df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca/status`: No such file or directory
Dec 02 09:11:17 np0005541913.localdomain podman[110680]: 2025-12-02 09:11:17.440986877 +0000 UTC m=+0.076410601 container cleanup df9293835a969db4ed1da89ed49d9744a7c000abca2cb7f5672a9e3b2e5a79ca (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtqemud, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 02 09:11:17 np0005541913.localdomain podman[110680]: nova_virtqemud
Dec 02 09:11:17 np0005541913.localdomain systemd[1]: tmp-crun.u6LS8a.mount: Deactivated successfully.
Dec 02 09:11:17 np0005541913.localdomain systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'.
Dec 02 09:11:17 np0005541913.localdomain systemd[1]: Stopped nova_virtqemud container.
Dec 02 09:11:17 np0005541913.localdomain sudo[110411]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:17 np0005541913.localdomain sudo[110783]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzxrmzdpttjtgwbfpydtadwwdveoivzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666677.5901449-115-17003883802305/AnsiballZ_systemd_service.py
Dec 02 09:11:17 np0005541913.localdomain sudo[110783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:11:18 np0005541913.localdomain python3.9[110785]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:11:18 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:11:18 np0005541913.localdomain systemd-rc-local-generator[110812]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:11:18 np0005541913.localdomain systemd-sysv-generator[110818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:11:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:11:18 np0005541913.localdomain sudo[110783]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:18 np0005541913.localdomain sudo[110913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqdbulmpouujbqcplqztjkbzgfyfegqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666678.7078757-115-123185056535692/AnsiballZ_systemd_service.py
Dec 02 09:11:18 np0005541913.localdomain sudo[110913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:11:19 np0005541913.localdomain python3.9[110915]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:11:19 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:11:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17294 DF PROTO=TCP SPT=32972 DPT=9105 SEQ=3338811856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478435E40000000001030307) 
Dec 02 09:11:19 np0005541913.localdomain systemd-rc-local-generator[110941]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:11:19 np0005541913.localdomain systemd-sysv-generator[110944]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:11:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:11:19 np0005541913.localdomain systemd[1]: Stopping nova_virtsecretd container...
Dec 02 09:11:19 np0005541913.localdomain systemd[1]: libpod-d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f.scope: Deactivated successfully.
Dec 02 09:11:19 np0005541913.localdomain podman[110956]: 2025-12-02 09:11:19.751406636 +0000 UTC m=+0.081567987 container died d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtsecretd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true)
Dec 02 09:11:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f-userdata-shm.mount: Deactivated successfully.
Dec 02 09:11:19 np0005541913.localdomain podman[110956]: 2025-12-02 09:11:19.78790766 +0000 UTC m=+0.118068991 container cleanup d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtsecretd, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 02 09:11:19 np0005541913.localdomain podman[110956]: nova_virtsecretd
Dec 02 09:11:19 np0005541913.localdomain podman[110969]: 2025-12-02 09:11:19.846084048 +0000 UTC m=+0.082997825 container cleanup d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc.)
Dec 02 09:11:19 np0005541913.localdomain systemd[1]: libpod-conmon-d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f.scope: Deactivated successfully.
Dec 02 09:11:19 np0005541913.localdomain podman[111002]: error opening file `/run/crun/d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f/status`: No such file or directory
Dec 02 09:11:19 np0005541913.localdomain podman[110990]: 2025-12-02 09:11:19.957795671 +0000 UTC m=+0.073000821 container cleanup d03ee59c7a667467d7894db6377f1c0920833c450a13535ae78eaa182412468f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, container_name=nova_virtsecretd, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 02 09:11:19 np0005541913.localdomain podman[110990]: nova_virtsecretd
Dec 02 09:11:19 np0005541913.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Dec 02 09:11:19 np0005541913.localdomain systemd[1]: Stopped nova_virtsecretd container.
Dec 02 09:11:19 np0005541913.localdomain sudo[110913]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:20 np0005541913.localdomain sudo[111093]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygfjvxpidulpruuvkhxmlkfcpyxfbmfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666680.0969927-115-272035507473105/AnsiballZ_systemd_service.py
Dec 02 09:11:20 np0005541913.localdomain sudo[111093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:11:20 np0005541913.localdomain python3.9[111095]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:11:20 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:11:20 np0005541913.localdomain systemd-rc-local-generator[111124]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:11:20 np0005541913.localdomain systemd-sysv-generator[111127]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:11:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:11:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a-merged.mount: Deactivated successfully.
Dec 02 09:11:20 np0005541913.localdomain systemd[1]: Stopping nova_virtstoraged container...
Dec 02 09:11:21 np0005541913.localdomain systemd[1]: libpod-4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56.scope: Deactivated successfully.
Dec 02 09:11:21 np0005541913.localdomain podman[111135]: 2025-12-02 09:11:21.0837074 +0000 UTC m=+0.074240383 container died 4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtstoraged, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1761123044)
Dec 02 09:11:21 np0005541913.localdomain podman[111135]: 2025-12-02 09:11:21.120907584 +0000 UTC m=+0.111440587 container cleanup 4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=nova_virtstoraged, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container)
Dec 02 09:11:21 np0005541913.localdomain podman[111135]: nova_virtstoraged
Dec 02 09:11:21 np0005541913.localdomain podman[111150]: 2025-12-02 09:11:21.160654245 +0000 UTC m=+0.066131750 container cleanup 4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, release=1761123044)
Dec 02 09:11:21 np0005541913.localdomain systemd[1]: libpod-conmon-4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56.scope: Deactivated successfully.
Dec 02 09:11:21 np0005541913.localdomain podman[111178]: error opening file `/run/crun/4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56/status`: No such file or directory
Dec 02 09:11:21 np0005541913.localdomain podman[111166]: 2025-12-02 09:11:21.259750154 +0000 UTC m=+0.069253871 container cleanup 4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ff8ff724cb5f0d02131158e2fae849b6'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., container_name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 02 09:11:21 np0005541913.localdomain podman[111166]: nova_virtstoraged
Dec 02 09:11:21 np0005541913.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Dec 02 09:11:21 np0005541913.localdomain systemd[1]: Stopped nova_virtstoraged container.
Dec 02 09:11:21 np0005541913.localdomain sudo[111093]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b-merged.mount: Deactivated successfully.
Dec 02 09:11:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a8634e5907beb37e61c16da6d777a6ee20926770ebd3c1322ad3f3d55924e56-userdata-shm.mount: Deactivated successfully.
Dec 02 09:11:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12363 DF PROTO=TCP SPT=42942 DPT=9101 SEQ=4269477184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478440F00000000001030307) 
Dec 02 09:11:22 np0005541913.localdomain sudo[111269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbsunmjxgqlrfuzmktpzocintgbifodo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666682.182307-115-59370553007630/AnsiballZ_systemd_service.py
Dec 02 09:11:22 np0005541913.localdomain sudo[111269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:11:22 np0005541913.localdomain python3.9[111271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:11:22 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:11:22 np0005541913.localdomain systemd-rc-local-generator[111300]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:11:22 np0005541913.localdomain systemd-sysv-generator[111304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:11:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: Stopping ovn_controller container...
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: libpod-e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.scope: Deactivated successfully.
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: libpod-e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.scope: Consumed 2.635s CPU time.
Dec 02 09:11:23 np0005541913.localdomain podman[111313]: 2025-12-02 09:11:23.189637363 +0000 UTC m=+0.105378325 container died e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.timer: Deactivated successfully.
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed to open /run/systemd/transient/e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: No such file or directory
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b-userdata-shm.mount: Deactivated successfully.
Dec 02 09:11:23 np0005541913.localdomain podman[111313]: 2025-12-02 09:11:23.288838336 +0000 UTC m=+0.204579288 container cleanup e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 02 09:11:23 np0005541913.localdomain podman[111313]: ovn_controller
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.timer: Failed to open /run/systemd/transient/e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.timer: No such file or directory
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed to open /run/systemd/transient/e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: No such file or directory
Dec 02 09:11:23 np0005541913.localdomain podman[111327]: 2025-12-02 09:11:23.302703813 +0000 UTC m=+0.108777357 container cleanup e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, container_name=ovn_controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: libpod-conmon-e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.scope: Deactivated successfully.
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.timer: Failed to open /run/systemd/transient/e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.timer: No such file or directory
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: Failed to open /run/systemd/transient/e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b.service: No such file or directory
Dec 02 09:11:23 np0005541913.localdomain podman[111341]: 2025-12-02 09:11:23.394002335 +0000 UTC m=+0.066035476 container cleanup e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, container_name=ovn_controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:11:23 np0005541913.localdomain podman[111341]: ovn_controller
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Dec 02 09:11:23 np0005541913.localdomain systemd[1]: Stopped ovn_controller container.
Dec 02 09:11:23 np0005541913.localdomain sudo[111269]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:23 np0005541913.localdomain sudo[111412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:11:23 np0005541913.localdomain sudo[111412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:11:23 np0005541913.localdomain sudo[111412]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:23 np0005541913.localdomain sudo[111470]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmdmtmzjkwwaiguwqtzqzgtqqfcrhvwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666683.5632417-115-194524676675926/AnsiballZ_systemd_service.py
Dec 02 09:11:23 np0005541913.localdomain sudo[111442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:11:23 np0005541913.localdomain sudo[111442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:11:23 np0005541913.localdomain sudo[111470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:11:24 np0005541913.localdomain python3.9[111473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:11:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-fa2735d70b4229c33d88157dc663cc996128839f7744195fee819ab923e68e6b-merged.mount: Deactivated successfully.
Dec 02 09:11:24 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:11:24 np0005541913.localdomain systemd-sysv-generator[111529]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:11:24 np0005541913.localdomain systemd-rc-local-generator[111526]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:11:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:11:24 np0005541913.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Dec 02 09:11:24 np0005541913.localdomain systemd[1]: tmp-crun.LQ3zQe.mount: Deactivated successfully.
Dec 02 09:11:24 np0005541913.localdomain podman[111594]: 2025-12-02 09:11:24.729156766 +0000 UTC m=+0.097724034 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, name=rhceph, release=1763362218, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main)
Dec 02 09:11:24 np0005541913.localdomain podman[111594]: 2025-12-02 09:11:24.861155435 +0000 UTC m=+0.229722723 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, version=7, com.redhat.component=rhceph-container, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=)
Dec 02 09:11:25 np0005541913.localdomain sudo[111442]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:25 np0005541913.localdomain sudo[111663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:11:25 np0005541913.localdomain sudo[111663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:11:25 np0005541913.localdomain sudo[111663]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12365 DF PROTO=TCP SPT=42942 DPT=9101 SEQ=4269477184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47844CE40000000001030307) 
Dec 02 09:11:25 np0005541913.localdomain sudo[111678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:11:25 np0005541913.localdomain sudo[111678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: libpod-1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.scope: Deactivated successfully.
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: libpod-1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.scope: Consumed 11.221s CPU time.
Dec 02 09:11:25 np0005541913.localdomain podman[111558]: 2025-12-02 09:11:25.569441366 +0000 UTC m=+1.128673614 container died 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, release=1761123044, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.timer: Deactivated successfully.
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed to open /run/systemd/transient/1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: No such file or directory
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85-userdata-shm.mount: Deactivated successfully.
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3a1af3edb87ae84c24194878020e22370aba8355c75888d8a0972cd3b1ac86c8-merged.mount: Deactivated successfully.
Dec 02 09:11:25 np0005541913.localdomain podman[111558]: 2025-12-02 09:11:25.638432179 +0000 UTC m=+1.197664367 container cleanup 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, container_name=ovn_metadata_agent, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:11:25 np0005541913.localdomain podman[111558]: ovn_metadata_agent
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.timer: Failed to open /run/systemd/transient/1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.timer: No such file or directory
Dec 02 09:11:25 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed to open /run/systemd/transient/1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: No such file or directory
Dec 02 09:11:25 np0005541913.localdomain podman[111706]: 2025-12-02 09:11:25.677587964 +0000 UTC m=+0.094454326 container cleanup 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent)
Dec 02 09:11:25 np0005541913.localdomain sudo[111678]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:26 np0005541913.localdomain sudo[111741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:11:26 np0005541913.localdomain sudo[111741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:11:26 np0005541913.localdomain sudo[111741]: pam_unix(sudo:session): session closed for user root
Dec 02 09:11:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12366 DF PROTO=TCP SPT=42942 DPT=9101 SEQ=4269477184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47845CA40000000001030307) 
Dec 02 09:11:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60259 DF PROTO=TCP SPT=58926 DPT=9102 SEQ=2733500089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47846EAD0000000001030307) 
Dec 02 09:11:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53741 DF PROTO=TCP SPT=41902 DPT=9105 SEQ=3207970519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47846F2E0000000001030307) 
Dec 02 09:11:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60261 DF PROTO=TCP SPT=58926 DPT=9102 SEQ=2733500089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47847AA50000000001030307) 
Dec 02 09:11:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46428 DF PROTO=TCP SPT=39210 DPT=9100 SEQ=2271897119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478486A40000000001030307) 
Dec 02 09:11:42 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19993 DF PROTO=TCP SPT=36268 DPT=9100 SEQ=3508387011 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478491E40000000001030307) 
Dec 02 09:11:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46430 DF PROTO=TCP SPT=39210 DPT=9100 SEQ=2271897119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47849E650000000001030307) 
Dec 02 09:11:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60263 DF PROTO=TCP SPT=58926 DPT=9102 SEQ=2733500089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4784A9E40000000001030307) 
Dec 02 09:11:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12176 DF PROTO=TCP SPT=49436 DPT=9101 SEQ=1109354132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4784B6200000000001030307) 
Dec 02 09:11:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12178 DF PROTO=TCP SPT=49436 DPT=9101 SEQ=1109354132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4784C2240000000001030307) 
Dec 02 09:11:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12179 DF PROTO=TCP SPT=49436 DPT=9101 SEQ=1109354132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4784D1E40000000001030307) 
Dec 02 09:12:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58636 DF PROTO=TCP SPT=44682 DPT=9102 SEQ=1860097015 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4784E3DD0000000001030307) 
Dec 02 09:12:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34983 DF PROTO=TCP SPT=34808 DPT=9105 SEQ=465326508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4784E45F0000000001030307) 
Dec 02 09:12:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58638 DF PROTO=TCP SPT=44682 DPT=9102 SEQ=1860097015 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4784EFE40000000001030307) 
Dec 02 09:12:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2015 DF PROTO=TCP SPT=35046 DPT=9882 SEQ=851284200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4784FBE40000000001030307) 
Dec 02 09:12:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64822 DF PROTO=TCP SPT=57784 DPT=9100 SEQ=2509961656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478507E50000000001030307) 
Dec 02 09:12:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5080 DF PROTO=TCP SPT=49934 DPT=9100 SEQ=3740161054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478513A50000000001030307) 
Dec 02 09:12:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34987 DF PROTO=TCP SPT=34808 DPT=9105 SEQ=465326508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47851FE40000000001030307) 
Dec 02 09:12:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38447 DF PROTO=TCP SPT=34622 DPT=9101 SEQ=460115161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47852B500000000001030307) 
Dec 02 09:12:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38449 DF PROTO=TCP SPT=34622 DPT=9101 SEQ=460115161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478537640000000001030307) 
Dec 02 09:12:26 np0005541913.localdomain sudo[111756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:12:26 np0005541913.localdomain sudo[111756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:12:26 np0005541913.localdomain sudo[111756]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:26 np0005541913.localdomain sudo[111771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:12:26 np0005541913.localdomain sudo[111771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:12:27 np0005541913.localdomain sudo[111771]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:28 np0005541913.localdomain sudo[111817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:12:28 np0005541913.localdomain sudo[111817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:12:28 np0005541913.localdomain sudo[111817]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38450 DF PROTO=TCP SPT=34622 DPT=9101 SEQ=460115161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478547240000000001030307) 
Dec 02 09:12:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62581 DF PROTO=TCP SPT=45506 DPT=9102 SEQ=3218035444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785590D0000000001030307) 
Dec 02 09:12:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44227 DF PROTO=TCP SPT=50634 DPT=9105 SEQ=3074358929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785598E0000000001030307) 
Dec 02 09:12:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62583 DF PROTO=TCP SPT=45506 DPT=9102 SEQ=3218035444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478565250000000001030307) 
Dec 02 09:12:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18869 DF PROTO=TCP SPT=58378 DPT=9100 SEQ=4264979749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478571240000000001030307) 
Dec 02 09:12:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22643 DF PROTO=TCP SPT=33716 DPT=9882 SEQ=1764166890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47857D250000000001030307) 
Dec 02 09:12:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18871 DF PROTO=TCP SPT=58378 DPT=9100 SEQ=4264979749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478588E40000000001030307) 
Dec 02 09:12:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62585 DF PROTO=TCP SPT=45506 DPT=9102 SEQ=3218035444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478595E40000000001030307) 
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing.
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 71612 (conmon) with signal SIGKILL.
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: libpod-conmon-1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.scope: Deactivated successfully.
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: tmp-crun.rqpHrh.mount: Deactivated successfully.
Dec 02 09:12:49 np0005541913.localdomain podman[111845]: error opening file `/run/crun/1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85/status`: No such file or directory
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.timer: Failed to open /run/systemd/transient/1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.timer: No such file or directory
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: Failed to open /run/systemd/transient/1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85.service: No such file or directory
Dec 02 09:12:49 np0005541913.localdomain podman[111832]: 2025-12-02 09:12:49.95265872 +0000 UTC m=+0.093987586 container cleanup 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:12:49 np0005541913.localdomain podman[111832]: ovn_metadata_agent
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Dec 02 09:12:49 np0005541913.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Dec 02 09:12:50 np0005541913.localdomain sudo[111470]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:50 np0005541913.localdomain sudo[111937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptlgtoyzorbsxhuwqbejbxkkguyejefr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666770.1105506-115-114746686996736/AnsiballZ_systemd_service.py
Dec 02 09:12:50 np0005541913.localdomain sudo[111937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:50 np0005541913.localdomain python3.9[111939]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:12:50 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:12:50 np0005541913.localdomain systemd-rc-local-generator[111964]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:12:50 np0005541913.localdomain systemd-sysv-generator[111969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:12:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:12:51 np0005541913.localdomain sudo[111937]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30901 DF PROTO=TCP SPT=55216 DPT=9101 SEQ=2707684008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785A0800000000001030307) 
Dec 02 09:12:52 np0005541913.localdomain sudo[112067]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scymelwinuvdxtlrxqobxfwlscovetmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666772.0741577-564-219994855150451/AnsiballZ_file.py
Dec 02 09:12:52 np0005541913.localdomain sudo[112067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:52 np0005541913.localdomain python3.9[112069]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:52 np0005541913.localdomain sudo[112067]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:53 np0005541913.localdomain sudo[112159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yloaczmpqueunjaqmmqepebpgaklmrep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666772.8071625-564-232683591238739/AnsiballZ_file.py
Dec 02 09:12:53 np0005541913.localdomain sudo[112159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:53 np0005541913.localdomain python3.9[112161]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:53 np0005541913.localdomain sudo[112159]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:53 np0005541913.localdomain sudo[112251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvxfxorlxyhiatuyuluwnabfghccslrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666773.3426893-564-261804031375981/AnsiballZ_file.py
Dec 02 09:12:53 np0005541913.localdomain sudo[112251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:53 np0005541913.localdomain python3.9[112253]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:53 np0005541913.localdomain sudo[112251]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:54 np0005541913.localdomain sudo[112343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhtrpqafpbfqvvkiybzugoifdbbugdet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666773.898016-564-249439490969814/AnsiballZ_file.py
Dec 02 09:12:54 np0005541913.localdomain sudo[112343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:54 np0005541913.localdomain python3.9[112345]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:54 np0005541913.localdomain sudo[112343]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:54 np0005541913.localdomain sudo[112435]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihflzxabgtpshfckrjxzyozztrxidfhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666774.454005-564-217252727463600/AnsiballZ_file.py
Dec 02 09:12:54 np0005541913.localdomain sudo[112435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:54 np0005541913.localdomain python3.9[112437]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:54 np0005541913.localdomain sudo[112435]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:55 np0005541913.localdomain sudo[112527]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggropqgoozmsggrdanljiozqdkcdlxxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666774.978591-564-250217378489244/AnsiballZ_file.py
Dec 02 09:12:55 np0005541913.localdomain sudo[112527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30903 DF PROTO=TCP SPT=55216 DPT=9101 SEQ=2707684008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785ACA40000000001030307) 
Dec 02 09:12:55 np0005541913.localdomain python3.9[112529]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:55 np0005541913.localdomain sudo[112527]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:55 np0005541913.localdomain sudo[112619]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpuljjqckglsygteigarvrelgivmnybs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666775.5586746-564-77334425525945/AnsiballZ_file.py
Dec 02 09:12:55 np0005541913.localdomain sudo[112619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:55 np0005541913.localdomain python3.9[112621]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:55 np0005541913.localdomain sudo[112619]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:56 np0005541913.localdomain sudo[112711]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpracvqrwovcuwlxwokigvxajefvoafv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666776.0742157-564-120383819684888/AnsiballZ_file.py
Dec 02 09:12:56 np0005541913.localdomain sudo[112711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:56 np0005541913.localdomain python3.9[112713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:56 np0005541913.localdomain sudo[112711]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:56 np0005541913.localdomain sudo[112803]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cilmteyuetqppkyjhzllhjxbwhevsqnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666776.6363347-564-232656990809997/AnsiballZ_file.py
Dec 02 09:12:56 np0005541913.localdomain sudo[112803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:57 np0005541913.localdomain python3.9[112805]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:57 np0005541913.localdomain sudo[112803]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:57 np0005541913.localdomain sudo[112895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdhyiropkpcvgswwbmewuwvlwtixqggi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666777.2013013-564-209695131114216/AnsiballZ_file.py
Dec 02 09:12:57 np0005541913.localdomain sudo[112895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:57 np0005541913.localdomain python3.9[112897]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:57 np0005541913.localdomain sudo[112895]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:58 np0005541913.localdomain sudo[112987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgrteprocdecxjrbndlgkaixnmuqlvvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666777.7785804-564-163595921576915/AnsiballZ_file.py
Dec 02 09:12:58 np0005541913.localdomain sudo[112987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:58 np0005541913.localdomain python3.9[112989]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:58 np0005541913.localdomain sudo[112987]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:58 np0005541913.localdomain sudo[113079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyyerueieqxnqobwofzsofreesiovcdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666778.34412-564-273703646995143/AnsiballZ_file.py
Dec 02 09:12:58 np0005541913.localdomain sudo[113079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:58 np0005541913.localdomain python3.9[113081]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:58 np0005541913.localdomain sudo[113079]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:59 np0005541913.localdomain sudo[113171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqvotzgxxlfskqgfojcycbahtxpdgteb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666778.8946457-564-276440003895945/AnsiballZ_file.py
Dec 02 09:12:59 np0005541913.localdomain sudo[113171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:59 np0005541913.localdomain python3.9[113173]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:59 np0005541913.localdomain sudo[113171]: pam_unix(sudo:session): session closed for user root
Dec 02 09:12:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30904 DF PROTO=TCP SPT=55216 DPT=9101 SEQ=2707684008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785BC640000000001030307) 
Dec 02 09:12:59 np0005541913.localdomain sudo[113263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guxkyusktjokvnrzsxuvwjgsvtaqvmiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666779.4180503-564-87767156641791/AnsiballZ_file.py
Dec 02 09:12:59 np0005541913.localdomain sudo[113263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:12:59 np0005541913.localdomain python3.9[113265]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:12:59 np0005541913.localdomain sudo[113263]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:00 np0005541913.localdomain sudo[113355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lacnmymsouvegdpfkborauplsrsagtmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666779.9789977-564-70259914934715/AnsiballZ_file.py
Dec 02 09:13:00 np0005541913.localdomain sudo[113355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:00 np0005541913.localdomain python3.9[113357]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:00 np0005541913.localdomain sudo[113355]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:00 np0005541913.localdomain sudo[113447]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmaoztgcxznwwlyefticriwyugxcmjeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666780.5968215-564-222539596193135/AnsiballZ_file.py
Dec 02 09:13:00 np0005541913.localdomain sudo[113447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:01 np0005541913.localdomain python3.9[113449]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:01 np0005541913.localdomain sudo[113447]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:01 np0005541913.localdomain sudo[113539]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpderiqmzuqctmstppilbkyzrthfgkvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666781.1251934-564-195519207887581/AnsiballZ_file.py
Dec 02 09:13:01 np0005541913.localdomain sudo[113539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:01 np0005541913.localdomain python3.9[113541]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:01 np0005541913.localdomain sudo[113539]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:01 np0005541913.localdomain sudo[113631]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ricssbqtsdsxituxkkbtdfrkanyjuvzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666781.7207623-564-52805329801269/AnsiballZ_file.py
Dec 02 09:13:01 np0005541913.localdomain sudo[113631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:02 np0005541913.localdomain python3.9[113633]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:02 np0005541913.localdomain sudo[113631]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:02 np0005541913.localdomain sudo[113723]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjwbpgvuywecnspsyjsoboxfdpbwsdnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666782.2590513-564-239152795893244/AnsiballZ_file.py
Dec 02 09:13:02 np0005541913.localdomain sudo[113723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:02 np0005541913.localdomain python3.9[113725]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:02 np0005541913.localdomain sudo[113723]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:03 np0005541913.localdomain sudo[113815]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgzktqnjkamdijxsltacpvqaewwmudqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666782.8129702-564-275968701740922/AnsiballZ_file.py
Dec 02 09:13:03 np0005541913.localdomain sudo[113815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:03 np0005541913.localdomain python3.9[113817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:03 np0005541913.localdomain sudo[113815]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:03 np0005541913.localdomain sudo[113907]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfhvroavgeygwiogfijakvxxzcyozixl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666783.3641062-564-2856174791842/AnsiballZ_file.py
Dec 02 09:13:03 np0005541913.localdomain sudo[113907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:03 np0005541913.localdomain python3.9[113909]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:03 np0005541913.localdomain sudo[113907]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35006 DF PROTO=TCP SPT=56412 DPT=9102 SEQ=3164046067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785CE3D0000000001030307) 
Dec 02 09:13:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18193 DF PROTO=TCP SPT=55084 DPT=9105 SEQ=3840888555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785CEBE0000000001030307) 
Dec 02 09:13:04 np0005541913.localdomain sudo[113999]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqdqpgcaylsktoazcswpqrpnmyejpfhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666784.609494-1014-197973842013522/AnsiballZ_file.py
Dec 02 09:13:04 np0005541913.localdomain sudo[113999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:05 np0005541913.localdomain python3.9[114001]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:05 np0005541913.localdomain sudo[113999]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:05 np0005541913.localdomain sudo[114091]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbxfdgyokqasmmqrydadmciydcweygse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666785.1473348-1014-67525110191799/AnsiballZ_file.py
Dec 02 09:13:05 np0005541913.localdomain sudo[114091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:05 np0005541913.localdomain python3.9[114093]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:05 np0005541913.localdomain sudo[114091]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:05 np0005541913.localdomain sudo[114183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnrqsrawyrwrdjfpqluxkrlpylazpdwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666785.6528637-1014-238256484853220/AnsiballZ_file.py
Dec 02 09:13:05 np0005541913.localdomain sudo[114183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:06 np0005541913.localdomain python3.9[114185]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:06 np0005541913.localdomain sudo[114183]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:06 np0005541913.localdomain sudo[114275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxwrbgffbbbagsyvdvqbgvrajynhmikz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666786.1893954-1014-231421473264071/AnsiballZ_file.py
Dec 02 09:13:06 np0005541913.localdomain sudo[114275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:06 np0005541913.localdomain python3.9[114277]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:06 np0005541913.localdomain sudo[114275]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:06 np0005541913.localdomain sudo[114367]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqmfbyfpkrhtnofoovfoycewgodgmqly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666786.726545-1014-266669001738303/AnsiballZ_file.py
Dec 02 09:13:06 np0005541913.localdomain sudo[114367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35008 DF PROTO=TCP SPT=56412 DPT=9102 SEQ=3164046067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785DA640000000001030307) 
Dec 02 09:13:07 np0005541913.localdomain python3.9[114369]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:07 np0005541913.localdomain sudo[114367]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:07 np0005541913.localdomain sudo[114459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnndbhfqbzcvyzvnjjvixzqxcszqsyba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666787.2682421-1014-154970260285412/AnsiballZ_file.py
Dec 02 09:13:07 np0005541913.localdomain sudo[114459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:07 np0005541913.localdomain python3.9[114461]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:07 np0005541913.localdomain sudo[114459]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:08 np0005541913.localdomain sudo[114551]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qymcsckvlhgusvtvindqmnflgifzjthv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666787.8078356-1014-164431312210781/AnsiballZ_file.py
Dec 02 09:13:08 np0005541913.localdomain sudo[114551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:08 np0005541913.localdomain python3.9[114553]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:08 np0005541913.localdomain sudo[114551]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:08 np0005541913.localdomain sudo[114643]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxmgvvunuomrcwiyivphmejtwbuwintl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666788.3357008-1014-115057556466988/AnsiballZ_file.py
Dec 02 09:13:08 np0005541913.localdomain sudo[114643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:08 np0005541913.localdomain python3.9[114645]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:08 np0005541913.localdomain sudo[114643]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:09 np0005541913.localdomain sudo[114735]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gygowsiicjmejfwpggjjolnqobqkupui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666788.8885043-1014-235106159769597/AnsiballZ_file.py
Dec 02 09:13:09 np0005541913.localdomain sudo[114735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:09 np0005541913.localdomain python3.9[114737]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:09 np0005541913.localdomain sudo[114735]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:09 np0005541913.localdomain sudo[114827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akjyyznloekllradmrymrpmyxpllwbtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666789.4139342-1014-263364304121590/AnsiballZ_file.py
Dec 02 09:13:09 np0005541913.localdomain sudo[114827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:09 np0005541913.localdomain python3.9[114829]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:09 np0005541913.localdomain sudo[114827]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:09 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42760 DF PROTO=TCP SPT=40518 DPT=9882 SEQ=3640480416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785E5E40000000001030307) 
Dec 02 09:13:10 np0005541913.localdomain sudo[114919]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtgpkjsdkdzbjiquqgiaetokdvmftefc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666789.9568455-1014-20110953941654/AnsiballZ_file.py
Dec 02 09:13:10 np0005541913.localdomain sudo[114919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:10 np0005541913.localdomain python3.9[114921]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:10 np0005541913.localdomain sudo[114919]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:10 np0005541913.localdomain sudo[115011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcupeaawdzjthajyxsefqesmlwondbea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666790.5794191-1014-260600666544027/AnsiballZ_file.py
Dec 02 09:13:10 np0005541913.localdomain sudo[115011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:11 np0005541913.localdomain python3.9[115013]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:11 np0005541913.localdomain sudo[115011]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:11 np0005541913.localdomain sudo[115103]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlmtelorpapzpolzjaawsbjbpaziwehm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666791.1418643-1014-1537508017568/AnsiballZ_file.py
Dec 02 09:13:11 np0005541913.localdomain sudo[115103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:11 np0005541913.localdomain python3.9[115105]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:11 np0005541913.localdomain sudo[115103]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:11 np0005541913.localdomain sudo[115195]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfusbztpuztwzctqenndkrwbwheoecnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666791.7127469-1014-33815198314891/AnsiballZ_file.py
Dec 02 09:13:11 np0005541913.localdomain sudo[115195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:12 np0005541913.localdomain python3.9[115197]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:12 np0005541913.localdomain sudo[115195]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:12 np0005541913.localdomain sudo[115287]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruhxtzkpxgetrrdyqcelrglxuthtmwdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666792.273149-1014-190825660452627/AnsiballZ_file.py
Dec 02 09:13:12 np0005541913.localdomain sudo[115287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:12 np0005541913.localdomain python3.9[115289]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:12 np0005541913.localdomain sudo[115287]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:13 np0005541913.localdomain sudo[115379]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obogcenpgiadjgugcbqnckeqqpwvhpok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666792.8298998-1014-146140873586250/AnsiballZ_file.py
Dec 02 09:13:13 np0005541913.localdomain sudo[115379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5083 DF PROTO=TCP SPT=49934 DPT=9100 SEQ=3740161054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785F1E40000000001030307) 
Dec 02 09:13:13 np0005541913.localdomain python3.9[115381]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:13 np0005541913.localdomain sudo[115379]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:13 np0005541913.localdomain sudo[115471]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekwxvkrlqeqbudhaeudymhlsqtstopzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666793.3393376-1014-217087047683546/AnsiballZ_file.py
Dec 02 09:13:13 np0005541913.localdomain sudo[115471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:13 np0005541913.localdomain python3.9[115473]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:13 np0005541913.localdomain sudo[115471]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:14 np0005541913.localdomain sudo[115563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-curjpijnxnvxototbkzlznnjqjrnpfex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666793.8800673-1014-203085456439023/AnsiballZ_file.py
Dec 02 09:13:14 np0005541913.localdomain sudo[115563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:14 np0005541913.localdomain python3.9[115565]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:14 np0005541913.localdomain sudo[115563]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:14 np0005541913.localdomain sudo[115655]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwequjvvjdialiwwhieiakxzewnhyvxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666794.3963463-1014-240670037580229/AnsiballZ_file.py
Dec 02 09:13:14 np0005541913.localdomain sudo[115655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:14 np0005541913.localdomain python3.9[115657]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:14 np0005541913.localdomain sudo[115655]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:15 np0005541913.localdomain sudo[115747]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wprioxnufoshqskbyoabxuhwqmzfgusy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666794.9523582-1014-173667148163817/AnsiballZ_file.py
Dec 02 09:13:15 np0005541913.localdomain sudo[115747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:15 np0005541913.localdomain python3.9[115749]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:15 np0005541913.localdomain sudo[115747]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:15 np0005541913.localdomain sudo[115839]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daidtgohqdnvfrdmvwywooidmgukjtad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666795.4943323-1014-86361686544848/AnsiballZ_file.py
Dec 02 09:13:15 np0005541913.localdomain sudo[115839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:15 np0005541913.localdomain python3.9[115841]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:15 np0005541913.localdomain sudo[115839]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61893 DF PROTO=TCP SPT=34570 DPT=9100 SEQ=3363060885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4785FDE40000000001030307) 
Dec 02 09:13:16 np0005541913.localdomain sudo[115931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmjjcpwpoofsyelutuijmiltaiuzxgfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666796.4339154-1461-246981413379419/AnsiballZ_command.py
Dec 02 09:13:16 np0005541913.localdomain sudo[115931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:17 np0005541913.localdomain python3.9[115933]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:17 np0005541913.localdomain sudo[115931]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:17 np0005541913.localdomain python3.9[116025]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:13:18 np0005541913.localdomain sudo[116115]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-couayalkpuydnbdapgmqleghvxqpdrmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666798.1694968-1516-107691405254740/AnsiballZ_systemd_service.py
Dec 02 09:13:18 np0005541913.localdomain sudo[116115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:18 np0005541913.localdomain python3.9[116117]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:13:18 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:13:18 np0005541913.localdomain systemd-rc-local-generator[116142]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:13:18 np0005541913.localdomain systemd-sysv-generator[116145]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:13:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:13:19 np0005541913.localdomain sudo[116115]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35010 DF PROTO=TCP SPT=56412 DPT=9102 SEQ=3164046067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478609E50000000001030307) 
Dec 02 09:13:19 np0005541913.localdomain sudo[116244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntxycceaqynsxvuviptaljfcsoqkpibr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666799.2225924-1539-191783152903057/AnsiballZ_command.py
Dec 02 09:13:19 np0005541913.localdomain sudo[116244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:19 np0005541913.localdomain python3.9[116246]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:19 np0005541913.localdomain sudo[116244]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:20 np0005541913.localdomain sudo[116337]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkexugbboshlevssziuqqhzetrkcgspb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666799.7670462-1539-179190422433097/AnsiballZ_command.py
Dec 02 09:13:20 np0005541913.localdomain sudo[116337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:20 np0005541913.localdomain python3.9[116339]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:20 np0005541913.localdomain sudo[116337]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:20 np0005541913.localdomain sudo[116430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhnyahiyqozcifjbidrrjyqtadczmjzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666800.3228137-1539-229518132729404/AnsiballZ_command.py
Dec 02 09:13:20 np0005541913.localdomain sudo[116430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:20 np0005541913.localdomain python3.9[116432]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:20 np0005541913.localdomain sudo[116430]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:21 np0005541913.localdomain sudo[116523]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crxqunfxiscmbtbuxzkwxxescfgdkcax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666800.8407614-1539-186762890130853/AnsiballZ_command.py
Dec 02 09:13:21 np0005541913.localdomain sudo[116523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:21 np0005541913.localdomain python3.9[116525]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:21 np0005541913.localdomain sudo[116523]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:21 np0005541913.localdomain sudo[116616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzncprikmytgextltrrdpwfejyovgapj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666801.4101696-1539-209513928604296/AnsiballZ_command.py
Dec 02 09:13:21 np0005541913.localdomain sudo[116616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:21 np0005541913.localdomain python3.9[116618]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:21 np0005541913.localdomain sudo[116616]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4366 DF PROTO=TCP SPT=46736 DPT=9101 SEQ=3369090741 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478615B00000000001030307) 
Dec 02 09:13:22 np0005541913.localdomain sudo[116709]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooyhyjkqmtnwjxqhnwnsmvbcdtlpwfkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666801.9781277-1539-170728208237776/AnsiballZ_command.py
Dec 02 09:13:22 np0005541913.localdomain sudo[116709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:22 np0005541913.localdomain python3.9[116711]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:22 np0005541913.localdomain sudo[116709]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:22 np0005541913.localdomain sudo[116802]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmskbhqfrosqbilpmhbhhltsfpzokuhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666802.5578272-1539-133381931444426/AnsiballZ_command.py
Dec 02 09:13:22 np0005541913.localdomain sudo[116802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:23 np0005541913.localdomain python3.9[116804]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:23 np0005541913.localdomain sudo[116802]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:23 np0005541913.localdomain sudo[116895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbjnihsuiueyknewllixlanoajcunupz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666803.12671-1539-31003382343764/AnsiballZ_command.py
Dec 02 09:13:23 np0005541913.localdomain sudo[116895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:23 np0005541913.localdomain python3.9[116897]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:23 np0005541913.localdomain sudo[116895]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:23 np0005541913.localdomain sudo[116988]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzxiyvtkconmixvazotvdpqywystiyhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666803.6401315-1539-9028765212220/AnsiballZ_command.py
Dec 02 09:13:23 np0005541913.localdomain sudo[116988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:24 np0005541913.localdomain python3.9[116990]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:24 np0005541913.localdomain sudo[116988]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:24 np0005541913.localdomain sudo[117081]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksfcgvvbzeasemequdjdxnctcanzqjbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666804.1826916-1539-161171169697327/AnsiballZ_command.py
Dec 02 09:13:24 np0005541913.localdomain sudo[117081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:24 np0005541913.localdomain python3.9[117083]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:24 np0005541913.localdomain sudo[117081]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:24 np0005541913.localdomain sudo[117174]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpcbbwdggpcjtvtmzvslrtsbbvmerjbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666804.7198393-1539-258285418444114/AnsiballZ_command.py
Dec 02 09:13:24 np0005541913.localdomain sudo[117174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:25 np0005541913.localdomain python3.9[117176]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:25 np0005541913.localdomain sudo[117174]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4368 DF PROTO=TCP SPT=46736 DPT=9101 SEQ=3369090741 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478621A40000000001030307) 
Dec 02 09:13:25 np0005541913.localdomain sudo[117267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwhnbptaldbkvspbhogwfzomediwoczj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666805.338422-1539-31521262096643/AnsiballZ_command.py
Dec 02 09:13:25 np0005541913.localdomain sudo[117267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:25 np0005541913.localdomain python3.9[117269]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:25 np0005541913.localdomain sudo[117267]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:26 np0005541913.localdomain sudo[117360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghtzxtecqtqkbmadjwunmozkhdoloala ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666805.9544876-1539-244568970495715/AnsiballZ_command.py
Dec 02 09:13:26 np0005541913.localdomain sudo[117360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:26 np0005541913.localdomain python3.9[117362]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:26 np0005541913.localdomain sudo[117360]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:26 np0005541913.localdomain sudo[117453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzvsuqxsbqlzscyvbetzbijpnuuxhmxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666806.5554678-1539-225023403659495/AnsiballZ_command.py
Dec 02 09:13:26 np0005541913.localdomain sudo[117453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:27 np0005541913.localdomain python3.9[117455]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:27 np0005541913.localdomain sudo[117453]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:27 np0005541913.localdomain sudo[117546]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qufwfeeqvsqbmcvbeaoziozoirxarodt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666807.1382668-1539-228175025809114/AnsiballZ_command.py
Dec 02 09:13:27 np0005541913.localdomain sudo[117546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:27 np0005541913.localdomain python3.9[117548]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:27 np0005541913.localdomain sudo[117546]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:27 np0005541913.localdomain sudo[117639]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roryatnzchxdjokwnnujityxennmvsfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666807.7233932-1539-219073713839979/AnsiballZ_command.py
Dec 02 09:13:27 np0005541913.localdomain sudo[117639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:28 np0005541913.localdomain python3.9[117641]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:28 np0005541913.localdomain sudo[117639]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:28 np0005541913.localdomain sudo[117732]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdwccfsjzdiskruvnbifftzkpfhbvaot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666808.2741358-1539-133757435594514/AnsiballZ_command.py
Dec 02 09:13:28 np0005541913.localdomain sudo[117732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:28 np0005541913.localdomain sudo[117735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:13:28 np0005541913.localdomain sudo[117735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:13:28 np0005541913.localdomain sudo[117735]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:28 np0005541913.localdomain sudo[117750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:13:28 np0005541913.localdomain sudo[117750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:13:28 np0005541913.localdomain python3.9[117734]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:28 np0005541913.localdomain sudo[117732]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:29 np0005541913.localdomain sudo[117867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewbkvuifszfkprcthzqkpbwowltibtpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666808.8246956-1539-115877148636527/AnsiballZ_command.py
Dec 02 09:13:29 np0005541913.localdomain sudo[117867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:29 np0005541913.localdomain python3.9[117873]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:29 np0005541913.localdomain sudo[117867]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:29 np0005541913.localdomain sudo[117750]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4369 DF PROTO=TCP SPT=46736 DPT=9101 SEQ=3369090741 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478631650000000001030307) 
Dec 02 09:13:29 np0005541913.localdomain sudo[117981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccjnrsrbcrpqyirppaxxkfysyoafrtnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666809.3810406-1539-159106581000592/AnsiballZ_command.py
Dec 02 09:13:29 np0005541913.localdomain sudo[117981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:29 np0005541913.localdomain python3.9[117983]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:29 np0005541913.localdomain sudo[117981]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:29 np0005541913.localdomain sudo[118012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:13:29 np0005541913.localdomain sudo[118012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:13:30 np0005541913.localdomain sudo[118012]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:30 np0005541913.localdomain sudo[118089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiecsskwnbmvwrvzmdtrmyyeyhdjmobc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666809.9355707-1539-144535410704273/AnsiballZ_command.py
Dec 02 09:13:30 np0005541913.localdomain sudo[118089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:30 np0005541913.localdomain python3.9[118091]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:30 np0005541913.localdomain sudo[118089]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:30 np0005541913.localdomain sudo[118182]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocqcfeficrebpyqfbwrfcooopftbtmai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666810.5190818-1539-97077772141290/AnsiballZ_command.py
Dec 02 09:13:30 np0005541913.localdomain sudo[118182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:30 np0005541913.localdomain python3.9[118184]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:30 np0005541913.localdomain sudo[118182]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:32 np0005541913.localdomain sshd[106244]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:13:32 np0005541913.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Dec 02 09:13:32 np0005541913.localdomain systemd[1]: session-38.scope: Consumed 48.358s CPU time.
Dec 02 09:13:32 np0005541913.localdomain systemd-logind[757]: Session 38 logged out. Waiting for processes to exit.
Dec 02 09:13:32 np0005541913.localdomain systemd-logind[757]: Removed session 38.
Dec 02 09:13:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6338 DF PROTO=TCP SPT=50254 DPT=9102 SEQ=1784828263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786436D0000000001030307) 
Dec 02 09:13:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28424 DF PROTO=TCP SPT=40026 DPT=9105 SEQ=1371069876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478643EE0000000001030307) 
Dec 02 09:13:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6340 DF PROTO=TCP SPT=50254 DPT=9102 SEQ=1784828263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47864F640000000001030307) 
Dec 02 09:13:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20023 DF PROTO=TCP SPT=43446 DPT=9100 SEQ=1727308412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47865B640000000001030307) 
Dec 02 09:13:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45252 DF PROTO=TCP SPT=42452 DPT=9882 SEQ=808888576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478667650000000001030307) 
Dec 02 09:13:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20025 DF PROTO=TCP SPT=43446 DPT=9100 SEQ=1727308412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478673250000000001030307) 
Dec 02 09:13:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6342 DF PROTO=TCP SPT=50254 DPT=9102 SEQ=1784828263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47867FE40000000001030307) 
Dec 02 09:13:52 np0005541913.localdomain sshd[118201]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:13:52 np0005541913.localdomain sshd[118201]: Accepted publickey for zuul from 192.168.122.30 port 37996 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:13:52 np0005541913.localdomain systemd-logind[757]: New session 39 of user zuul.
Dec 02 09:13:52 np0005541913.localdomain systemd[1]: Started Session 39 of User zuul.
Dec 02 09:13:52 np0005541913.localdomain sshd[118201]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:13:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3088 DF PROTO=TCP SPT=47780 DPT=9101 SEQ=3477556958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47868AE00000000001030307) 
Dec 02 09:13:52 np0005541913.localdomain python3.9[118294]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 02 09:13:54 np0005541913.localdomain python3.9[118398]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:13:54 np0005541913.localdomain sudo[118488]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqedxgqnrysggpmfqcbyoghmvwvfhokm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666834.401188-94-266260302980982/AnsiballZ_command.py
Dec 02 09:13:54 np0005541913.localdomain sudo[118488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:54 np0005541913.localdomain python3.9[118490]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:13:54 np0005541913.localdomain sudo[118488]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3090 DF PROTO=TCP SPT=47780 DPT=9101 SEQ=3477556958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478696E40000000001030307) 
Dec 02 09:13:55 np0005541913.localdomain sudo[118581]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjrvtenuuzntilgwnuszxwpfvpxwpeqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666835.353538-130-106356849583286/AnsiballZ_stat.py
Dec 02 09:13:55 np0005541913.localdomain sudo[118581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:55 np0005541913.localdomain python3.9[118583]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:13:55 np0005541913.localdomain sudo[118581]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:56 np0005541913.localdomain sudo[118673]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddbazxribwrhcrkbgjonridlsalhbyez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666836.1275952-154-121908183749054/AnsiballZ_file.py
Dec 02 09:13:56 np0005541913.localdomain sudo[118673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:56 np0005541913.localdomain python3.9[118675]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:56 np0005541913.localdomain sudo[118673]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:57 np0005541913.localdomain sudo[118765]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amgfsrarikrjrtovyiddrzgzarwwqcec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666836.8977585-178-220646490490930/AnsiballZ_stat.py
Dec 02 09:13:57 np0005541913.localdomain sudo[118765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:57 np0005541913.localdomain python3.9[118767]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:13:57 np0005541913.localdomain sudo[118765]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:57 np0005541913.localdomain sudo[118838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-durklyebgzfoxbthdjalhexaqcmxvhpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666836.8977585-178-220646490490930/AnsiballZ_copy.py
Dec 02 09:13:57 np0005541913.localdomain sudo[118838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:57 np0005541913.localdomain python3.9[118840]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764666836.8977585-178-220646490490930/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:13:58 np0005541913.localdomain sudo[118838]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:58 np0005541913.localdomain sudo[118930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bweulrjmdqntdvpvbrboltnhlagfaulr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666838.252139-223-124243370059752/AnsiballZ_setup.py
Dec 02 09:13:58 np0005541913.localdomain sudo[118930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:58 np0005541913.localdomain python3.9[118932]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:13:59 np0005541913.localdomain sudo[118930]: pam_unix(sudo:session): session closed for user root
Dec 02 09:13:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3091 DF PROTO=TCP SPT=47780 DPT=9101 SEQ=3477556958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786A6A40000000001030307) 
Dec 02 09:13:59 np0005541913.localdomain sudo[119026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhowyqrmomdqshndwisaxlsftmkuiebv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666839.2097614-247-251073329336523/AnsiballZ_file.py
Dec 02 09:13:59 np0005541913.localdomain sudo[119026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:13:59 np0005541913.localdomain python3.9[119028]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:13:59 np0005541913.localdomain sudo[119026]: pam_unix(sudo:session): session closed for user root
Dec 02 09:14:00 np0005541913.localdomain sudo[119118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kifwbvngkjhjlbmctkawkijwhtnbyfja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666839.8826127-274-51668722202091/AnsiballZ_file.py
Dec 02 09:14:00 np0005541913.localdomain sudo[119118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:14:00 np0005541913.localdomain python3.9[119120]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:14:00 np0005541913.localdomain sudo[119118]: pam_unix(sudo:session): session closed for user root
Dec 02 09:14:01 np0005541913.localdomain python3.9[119210]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:14:01 np0005541913.localdomain network[119227]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:14:01 np0005541913.localdomain network[119228]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:14:01 np0005541913.localdomain network[119229]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:14:02 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:14:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37698 DF PROTO=TCP SPT=52042 DPT=9102 SEQ=1732293158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786B89E0000000001030307) 
Dec 02 09:14:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46409 DF PROTO=TCP SPT=52604 DPT=9105 SEQ=3146473158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786B91E0000000001030307) 
Dec 02 09:14:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37700 DF PROTO=TCP SPT=52042 DPT=9102 SEQ=1732293158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786C4A50000000001030307) 
Dec 02 09:14:08 np0005541913.localdomain python3.9[119426]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:14:08 np0005541913.localdomain python3.9[119516]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:14:09 np0005541913.localdomain sudo[119610]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhgqkkzadmgxgxcfcxroqprbkbshcpwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666849.3262663-376-160790531458054/AnsiballZ_command.py
Dec 02 09:14:09 np0005541913.localdomain sudo[119610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:14:09 np0005541913.localdomain python3.9[119612]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required for FIPS enabled until trunk.rdoproject.org
                                                            # is not being served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                            # with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            # FIXME: perform dnf upgrade for other packages in EDPM ansible
                                                            # here we only ensuring that decontainerized libvirt can start
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:14:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4276 DF PROTO=TCP SPT=33714 DPT=9100 SEQ=2178519232 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786D0A40000000001030307) 
Dec 02 09:14:12 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61896 DF PROTO=TCP SPT=34570 DPT=9100 SEQ=3363060885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786DBE40000000001030307) 
Dec 02 09:14:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4278 DF PROTO=TCP SPT=33714 DPT=9100 SEQ=2178519232 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786E8860000000001030307) 
Dec 02 09:14:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37702 DF PROTO=TCP SPT=52042 DPT=9102 SEQ=1732293158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4786F3E40000000001030307) 
Dec 02 09:14:19 np0005541913.localdomain sshd[45356]: Received signal 15; terminating.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 02 09:14:19 np0005541913.localdomain sshd[119655]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:14:19 np0005541913.localdomain sshd[119655]: Server listening on 0.0.0.0 port 22.
Dec 02 09:14:19 np0005541913.localdomain sshd[119655]: Server listening on :: port 22.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: run-re073af74d5844759971be19ad57f6fae.service: Deactivated successfully.
Dec 02 09:14:19 np0005541913.localdomain systemd[1]: run-r5823f4815365465baed69fb2cb536815.service: Deactivated successfully.
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 02 09:14:20 np0005541913.localdomain sshd[119655]: Received signal 15; terminating.
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 02 09:14:20 np0005541913.localdomain sshd[119826]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:14:20 np0005541913.localdomain sshd[119826]: Server listening on 0.0.0.0 port 22.
Dec 02 09:14:20 np0005541913.localdomain sshd[119826]: Server listening on :: port 22.
Dec 02 09:14:20 np0005541913.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 02 09:14:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55368 DF PROTO=TCP SPT=38298 DPT=9101 SEQ=4057788527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478700100000000001030307) 
Dec 02 09:14:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55370 DF PROTO=TCP SPT=38298 DPT=9101 SEQ=4057788527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47870C240000000001030307) 
Dec 02 09:14:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55371 DF PROTO=TCP SPT=38298 DPT=9101 SEQ=4057788527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47871BE40000000001030307) 
Dec 02 09:14:30 np0005541913.localdomain sudo[119897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:14:30 np0005541913.localdomain sudo[119897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:14:30 np0005541913.localdomain sudo[119897]: pam_unix(sudo:session): session closed for user root
Dec 02 09:14:30 np0005541913.localdomain sudo[119916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:14:30 np0005541913.localdomain sudo[119916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:14:30 np0005541913.localdomain sudo[119916]: pam_unix(sudo:session): session closed for user root
Dec 02 09:14:31 np0005541913.localdomain sudo[119985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:14:31 np0005541913.localdomain sudo[119985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:14:31 np0005541913.localdomain sudo[119985]: pam_unix(sudo:session): session closed for user root
Dec 02 09:14:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49019 DF PROTO=TCP SPT=35002 DPT=9102 SEQ=1644456260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47872DCE0000000001030307) 
Dec 02 09:14:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35292 DF PROTO=TCP SPT=41226 DPT=9105 SEQ=3782781303 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47872E4E0000000001030307) 
Dec 02 09:14:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49021 DF PROTO=TCP SPT=35002 DPT=9102 SEQ=1644456260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478739E50000000001030307) 
Dec 02 09:14:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6050 DF PROTO=TCP SPT=45524 DPT=9100 SEQ=2387660439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478745A40000000001030307) 
Dec 02 09:14:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25741 DF PROTO=TCP SPT=48856 DPT=9882 SEQ=299773429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478751E40000000001030307) 
Dec 02 09:14:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6052 DF PROTO=TCP SPT=45524 DPT=9100 SEQ=2387660439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47875D640000000001030307) 
Dec 02 09:14:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35296 DF PROTO=TCP SPT=41226 DPT=9105 SEQ=3782781303 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478769E40000000001030307) 
Dec 02 09:14:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27554 DF PROTO=TCP SPT=53696 DPT=9101 SEQ=2980122975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787753F0000000001030307) 
Dec 02 09:14:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27556 DF PROTO=TCP SPT=53696 DPT=9101 SEQ=2980122975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478781660000000001030307) 
Dec 02 09:14:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27557 DF PROTO=TCP SPT=53696 DPT=9101 SEQ=2980122975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478791240000000001030307) 
Dec 02 09:15:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5163 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=2776328889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787A2FD0000000001030307) 
Dec 02 09:15:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49720 DF PROTO=TCP SPT=40520 DPT=9105 SEQ=4015283433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787A37F0000000001030307) 
Dec 02 09:15:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5165 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=2776328889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787AF240000000001030307) 
Dec 02 09:15:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57468 DF PROTO=TCP SPT=36354 DPT=9100 SEQ=3821444339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787BAE50000000001030307) 
Dec 02 09:15:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38737 DF PROTO=TCP SPT=34998 DPT=9882 SEQ=296824685 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787C7240000000001030307) 
Dec 02 09:15:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57470 DF PROTO=TCP SPT=36354 DPT=9100 SEQ=3821444339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787D2A50000000001030307) 
Dec 02 09:15:17 np0005541913.localdomain sshd[120299]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:15:17 np0005541913.localdomain sshd[120300]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:15:17 np0005541913.localdomain sshd[120300]: error: kex_exchange_identification: read: Connection reset by peer
Dec 02 09:15:17 np0005541913.localdomain sshd[120300]: Connection reset by 45.140.17.97 port 47328
Dec 02 09:15:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49724 DF PROTO=TCP SPT=40520 DPT=9105 SEQ=4015283433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787DFE40000000001030307) 
Dec 02 09:15:22 np0005541913.localdomain sshd[120305]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:15:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42766 DF PROTO=TCP SPT=49826 DPT=9101 SEQ=3659717258 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787EA700000000001030307) 
Dec 02 09:15:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42768 DF PROTO=TCP SPT=49826 DPT=9101 SEQ=3659717258 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4787F6640000000001030307) 
Dec 02 09:15:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42769 DF PROTO=TCP SPT=49826 DPT=9101 SEQ=3659717258 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478806250000000001030307) 
Dec 02 09:15:30 np0005541913.localdomain sshd[120305]: Connection closed by 138.68.131.233 port 46136 [preauth]
Dec 02 09:15:31 np0005541913.localdomain sudo[120342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:15:31 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=16 res=1
Dec 02 09:15:31 np0005541913.localdomain sudo[120342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:15:31 np0005541913.localdomain sudo[120342]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:31 np0005541913.localdomain sudo[120357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:15:31 np0005541913.localdomain sudo[120357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:15:32 np0005541913.localdomain sudo[120357]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:32 np0005541913.localdomain kernel: SELinux:  Converting 2754 SID table entries...
Dec 02 09:15:32 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:15:32 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:15:32 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:15:32 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:15:32 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:15:32 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:15:32 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:15:33 np0005541913.localdomain sudo[119610]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48811 DF PROTO=TCP SPT=46548 DPT=9102 SEQ=2783521122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4788182D0000000001030307) 
Dec 02 09:15:33 np0005541913.localdomain sudo[120592]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yijvphqyrlnorizftcnzjamwaewzgeyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666933.7720156-403-237458326119713/AnsiballZ_file.py
Dec 02 09:15:34 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Dec 02 09:15:34 np0005541913.localdomain sudo[120592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37572 DF PROTO=TCP SPT=50216 DPT=9105 SEQ=1091935778 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478818AE0000000001030307) 
Dec 02 09:15:34 np0005541913.localdomain python3.9[120594]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:15:34 np0005541913.localdomain sudo[120592]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:34 np0005541913.localdomain sudo[120684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfuwwjzeenkrwddoiuqbebhebqnrqaso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666934.3913689-427-85164160909968/AnsiballZ_stat.py
Dec 02 09:15:34 np0005541913.localdomain sudo[120684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:34 np0005541913.localdomain python3.9[120686]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:15:34 np0005541913.localdomain sudo[120684]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:35 np0005541913.localdomain sudo[120757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esxjutsjsiyiblnwhzvwshmbkvitbvzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666934.3913689-427-85164160909968/AnsiballZ_copy.py
Dec 02 09:15:35 np0005541913.localdomain sudo[120757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:35 np0005541913.localdomain python3.9[120759]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764666934.3913689-427-85164160909968/.source.fact _original_basename=.oir8p_a3 follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:15:35 np0005541913.localdomain sudo[120757]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:36 np0005541913.localdomain sudo[120850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:15:36 np0005541913.localdomain sudo[120850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:15:36 np0005541913.localdomain sudo[120850]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:36 np0005541913.localdomain python3.9[120849]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:15:36 np0005541913.localdomain sudo[120960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-senmuaoaplpbqzpcdspwwagonqvphfqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666936.7145133-502-159588462274074/AnsiballZ_setup.py
Dec 02 09:15:36 np0005541913.localdomain sudo[120960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48813 DF PROTO=TCP SPT=46548 DPT=9102 SEQ=2783521122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478824240000000001030307) 
Dec 02 09:15:37 np0005541913.localdomain python3.9[120962]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:15:37 np0005541913.localdomain sudo[120960]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:37 np0005541913.localdomain sudo[121014]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggiizeuktvkmfrzoquuaxlheczwlmrbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666936.7145133-502-159588462274074/AnsiballZ_dnf.py
Dec 02 09:15:37 np0005541913.localdomain sudo[121014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:38 np0005541913.localdomain python3.9[121016]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:15:39 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25744 DF PROTO=TCP SPT=48856 DPT=9882 SEQ=299773429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47882FE50000000001030307) 
Dec 02 09:15:41 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:15:41 np0005541913.localdomain systemd-rc-local-generator[121044]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:15:41 np0005541913.localdomain systemd-sysv-generator[121051]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:15:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:15:41 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 09:15:42 np0005541913.localdomain sudo[121014]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6055 DF PROTO=TCP SPT=45524 DPT=9100 SEQ=2387660439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47883BE40000000001030307) 
Dec 02 09:15:44 np0005541913.localdomain sudo[121154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgmeuxuhcskjxhixummfbeqqgbifqmhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666944.032983-538-158322399442733/AnsiballZ_command.py
Dec 02 09:15:44 np0005541913.localdomain sudo[121154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:44 np0005541913.localdomain python3.9[121156]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:15:45 np0005541913.localdomain sudo[121154]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:45 np0005541913.localdomain sudo[121393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wevcpibwwholaplygvibdkjkznfrdwcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666945.3522978-562-212335460041724/AnsiballZ_selinux.py
Dec 02 09:15:45 np0005541913.localdomain sudo[121393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10755 DF PROTO=TCP SPT=51852 DPT=9100 SEQ=659875632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478847E40000000001030307) 
Dec 02 09:15:46 np0005541913.localdomain python3.9[121395]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 02 09:15:46 np0005541913.localdomain sudo[121393]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:46 np0005541913.localdomain sudo[121485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovjenugzgibzxggjxswtmilnxscymrqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666946.6383045-595-85493331303179/AnsiballZ_command.py
Dec 02 09:15:46 np0005541913.localdomain sudo[121485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:47 np0005541913.localdomain python3.9[121487]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 02 09:15:47 np0005541913.localdomain sudo[121485]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:48 np0005541913.localdomain sudo[121578]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udizwqxtrkgwvtgqibsflidxrqsgltjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666947.9257488-619-164756394173421/AnsiballZ_file.py
Dec 02 09:15:48 np0005541913.localdomain sudo[121578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:48 np0005541913.localdomain python3.9[121580]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:15:48 np0005541913.localdomain sudo[121578]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:48 np0005541913.localdomain sudo[121670]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcvyucrekxmkrxjqjjhrjydelmkwvfwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666948.55259-643-145068831536128/AnsiballZ_mount.py
Dec 02 09:15:48 np0005541913.localdomain sudo[121670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:49 np0005541913.localdomain python3.9[121672]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 02 09:15:49 np0005541913.localdomain sudo[121670]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48815 DF PROTO=TCP SPT=46548 DPT=9102 SEQ=2783521122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478853E50000000001030307) 
Dec 02 09:15:50 np0005541913.localdomain sudo[121762]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtmprehqufscdvbryblaifqzyiwloeck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666950.5479028-727-21331162951872/AnsiballZ_file.py
Dec 02 09:15:50 np0005541913.localdomain sudo[121762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:51 np0005541913.localdomain python3.9[121764]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:15:51 np0005541913.localdomain sudo[121762]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:51 np0005541913.localdomain sudo[121854]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sokzjbatbrqsbrqmlbgarkdbzgpjwuug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666951.1942809-751-252761449007449/AnsiballZ_stat.py
Dec 02 09:15:51 np0005541913.localdomain sudo[121854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:51 np0005541913.localdomain python3.9[121856]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:15:51 np0005541913.localdomain sudo[121854]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:52 np0005541913.localdomain sudo[121927]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrnnauzmsbdthdzpevxoqgzcymaeukks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666951.1942809-751-252761449007449/AnsiballZ_copy.py
Dec 02 09:15:52 np0005541913.localdomain sudo[121927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49472 DF PROTO=TCP SPT=46846 DPT=9101 SEQ=780407079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47885FA00000000001030307) 
Dec 02 09:15:52 np0005541913.localdomain python3.9[121929]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764666951.1942809-751-252761449007449/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:15:52 np0005541913.localdomain sudo[121927]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:53 np0005541913.localdomain sudo[122019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utmqotjntrqwuzfwziyjzpedvpugazzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666952.9531224-823-107981817528098/AnsiballZ_stat.py
Dec 02 09:15:53 np0005541913.localdomain sudo[122019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:53 np0005541913.localdomain python3.9[122021]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:15:53 np0005541913.localdomain sudo[122019]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:54 np0005541913.localdomain sudo[122113]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yytvuibcawvprsoihpsgmewppjpchsei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666953.9964924-862-168767169764011/AnsiballZ_getent.py
Dec 02 09:15:54 np0005541913.localdomain sudo[122113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:54 np0005541913.localdomain python3.9[122115]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 02 09:15:54 np0005541913.localdomain sudo[122113]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:55 np0005541913.localdomain sudo[122206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jidkxvncgenjvygaqypwfeqdppjkpzlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666955.0202773-893-49894446083461/AnsiballZ_getent.py
Dec 02 09:15:55 np0005541913.localdomain sudo[122206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49474 DF PROTO=TCP SPT=46846 DPT=9101 SEQ=780407079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47886BA40000000001030307) 
Dec 02 09:15:55 np0005541913.localdomain python3.9[122208]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 02 09:15:55 np0005541913.localdomain sudo[122206]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:56 np0005541913.localdomain sudo[122299]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhpdxjezvphzjieltnzikqnpbajquhas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666955.6663675-916-235739295188481/AnsiballZ_group.py
Dec 02 09:15:56 np0005541913.localdomain sudo[122299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:56 np0005541913.localdomain python3.9[122301]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 09:15:56 np0005541913.localdomain groupmod[122302]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Dec 02 09:15:56 np0005541913.localdomain groupmod[122302]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Dec 02 09:15:56 np0005541913.localdomain sudo[122299]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:56 np0005541913.localdomain sudo[122397]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azlekzdaulpvnpbhuawcfgccmolkelpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666956.565286-943-228906162469826/AnsiballZ_file.py
Dec 02 09:15:56 np0005541913.localdomain sudo[122397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:57 np0005541913.localdomain python3.9[122399]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 02 09:15:57 np0005541913.localdomain sudo[122397]: pam_unix(sudo:session): session closed for user root
Dec 02 09:15:58 np0005541913.localdomain sudo[122489]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuewgvfjnbptcdwciqjtiyhjfihobswk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666957.739608-976-220084106824674/AnsiballZ_dnf.py
Dec 02 09:15:58 np0005541913.localdomain sudo[122489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:15:58 np0005541913.localdomain python3.9[122491]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:15:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49475 DF PROTO=TCP SPT=46846 DPT=9101 SEQ=780407079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47887B640000000001030307) 
Dec 02 09:16:01 np0005541913.localdomain sudo[122489]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36594 DF PROTO=TCP SPT=53536 DPT=9102 SEQ=2979098041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47888D5D0000000001030307) 
Dec 02 09:16:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16070 DF PROTO=TCP SPT=36918 DPT=9105 SEQ=1378865609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47888DE00000000001030307) 
Dec 02 09:16:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36596 DF PROTO=TCP SPT=53536 DPT=9102 SEQ=2979098041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478899640000000001030307) 
Dec 02 09:16:07 np0005541913.localdomain sudo[122583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqphdiwyfafijzzyztlmvxbrqyxiwelr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666967.7714374-1000-85449093897603/AnsiballZ_file.py
Dec 02 09:16:07 np0005541913.localdomain sudo[122583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:08 np0005541913.localdomain python3.9[122585]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:16:08 np0005541913.localdomain sudo[122583]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3585 DF PROTO=TCP SPT=40812 DPT=9100 SEQ=1480713733 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4788A5640000000001030307) 
Dec 02 09:16:12 np0005541913.localdomain sudo[122675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xarbijyldcyoftlzefrdptbjirkksrmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666972.1213596-1024-25856825055567/AnsiballZ_stat.py
Dec 02 09:16:12 np0005541913.localdomain sudo[122675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:12 np0005541913.localdomain python3.9[122677]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:16:12 np0005541913.localdomain sudo[122675]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:12 np0005541913.localdomain sudo[122748]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mydfbnegpkxvedydjojwitduugacwqjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666972.1213596-1024-25856825055567/AnsiballZ_copy.py
Dec 02 09:16:12 np0005541913.localdomain sudo[122748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:13 np0005541913.localdomain python3.9[122750]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764666972.1213596-1024-25856825055567/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:16:13 np0005541913.localdomain sudo[122748]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41543 DF PROTO=TCP SPT=47632 DPT=9882 SEQ=4209715839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4788B1640000000001030307) 
Dec 02 09:16:15 np0005541913.localdomain sudo[122840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcchpsqyoxlmucovlmtchtvokchwxxqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666974.5440166-1069-220378938483436/AnsiballZ_systemd.py
Dec 02 09:16:15 np0005541913.localdomain sudo[122840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:15 np0005541913.localdomain python3.9[122842]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:16:15 np0005541913.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 09:16:15 np0005541913.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 02 09:16:15 np0005541913.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 02 09:16:15 np0005541913.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 02 09:16:15 np0005541913.localdomain systemd-modules-load[122846]: Module 'msr' is built in
Dec 02 09:16:15 np0005541913.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 02 09:16:15 np0005541913.localdomain sudo[122840]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:15 np0005541913.localdomain sudo[122937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mokmlnrcbnisxodahqccroantojvlmyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666975.6873782-1094-212866617272328/AnsiballZ_stat.py
Dec 02 09:16:15 np0005541913.localdomain sudo[122937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:16 np0005541913.localdomain python3.9[122939]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:16:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3587 DF PROTO=TCP SPT=40812 DPT=9100 SEQ=1480713733 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4788BD250000000001030307) 
Dec 02 09:16:16 np0005541913.localdomain sudo[122937]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:16 np0005541913.localdomain sudo[123010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cazhthcrkxshnaktftwvimgigxifashw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666975.6873782-1094-212866617272328/AnsiballZ_copy.py
Dec 02 09:16:16 np0005541913.localdomain sudo[123010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:16 np0005541913.localdomain python3.9[123012]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764666975.6873782-1094-212866617272328/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:16:16 np0005541913.localdomain sudo[123010]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:17 np0005541913.localdomain sudo[123102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzdoybdgjpchuyaeofwbwtsxxnunacgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666977.1641955-1147-232934661777484/AnsiballZ_dnf.py
Dec 02 09:16:17 np0005541913.localdomain sudo[123102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:17 np0005541913.localdomain python3.9[123104]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:16:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36598 DF PROTO=TCP SPT=53536 DPT=9102 SEQ=2979098041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4788C9E40000000001030307) 
Dec 02 09:16:20 np0005541913.localdomain sudo[123102]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:21 np0005541913.localdomain python3.9[123196]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:16:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12345 DF PROTO=TCP SPT=45770 DPT=9101 SEQ=498264673 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4788D4D00000000001030307) 
Dec 02 09:16:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12347 DF PROTO=TCP SPT=45770 DPT=9101 SEQ=498264673 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4788E0E40000000001030307) 
Dec 02 09:16:26 np0005541913.localdomain python3.9[123288]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 02 09:16:27 np0005541913.localdomain python3.9[123378]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:16:28 np0005541913.localdomain sudo[123468]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eclepsguojxaxjvkfqndlogszkstawfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666987.9930885-1270-257001792500556/AnsiballZ_systemd.py
Dec 02 09:16:28 np0005541913.localdomain sudo[123468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12348 DF PROTO=TCP SPT=45770 DPT=9101 SEQ=498264673 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4788F0A40000000001030307) 
Dec 02 09:16:29 np0005541913.localdomain python3.9[123470]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:16:29 np0005541913.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 02 09:16:29 np0005541913.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 02 09:16:29 np0005541913.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 02 09:16:29 np0005541913.localdomain systemd[1]: tuned.service: Consumed 1.834s CPU time, no IO.
Dec 02 09:16:29 np0005541913.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 02 09:16:30 np0005541913.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 02 09:16:30 np0005541913.localdomain sudo[123468]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:32 np0005541913.localdomain python3.9[123572]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 02 09:16:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34440 DF PROTO=TCP SPT=35194 DPT=9102 SEQ=3571250042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789028D0000000001030307) 
Dec 02 09:16:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51580 DF PROTO=TCP SPT=36988 DPT=9105 SEQ=2487867486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789030F0000000001030307) 
Dec 02 09:16:35 np0005541913.localdomain sudo[123662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncgedmjisidotsicpauzpwdvudeuhukb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666995.172129-1441-207075065615656/AnsiballZ_systemd.py
Dec 02 09:16:35 np0005541913.localdomain sudo[123662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:35 np0005541913.localdomain python3.9[123664]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:16:35 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:16:35 np0005541913.localdomain systemd-rc-local-generator[123693]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:16:35 np0005541913.localdomain systemd-sysv-generator[123696]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:16:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:16:36 np0005541913.localdomain sudo[123662]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:36 np0005541913.localdomain sudo[123728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:16:36 np0005541913.localdomain sudo[123728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:16:36 np0005541913.localdomain sudo[123728]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:36 np0005541913.localdomain sudo[123767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:16:36 np0005541913.localdomain sudo[123767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:16:36 np0005541913.localdomain sudo[123821]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggyvuxznrnxtscgartbtzsxptilbgvbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666996.3005795-1441-79337012278277/AnsiballZ_systemd.py
Dec 02 09:16:36 np0005541913.localdomain sudo[123821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:36 np0005541913.localdomain python3.9[123823]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:16:36 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:16:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34442 DF PROTO=TCP SPT=35194 DPT=9102 SEQ=3571250042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47890EA40000000001030307) 
Dec 02 09:16:37 np0005541913.localdomain systemd-rc-local-generator[123876]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:16:37 np0005541913.localdomain systemd-sysv-generator[123880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:16:37 np0005541913.localdomain sudo[123767]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:37 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:16:37 np0005541913.localdomain sudo[123821]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:37 np0005541913.localdomain sudo[123907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:16:37 np0005541913.localdomain sudo[123907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:16:37 np0005541913.localdomain sudo[123907]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:37 np0005541913.localdomain sudo[123922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 09:16:37 np0005541913.localdomain sudo[123922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:16:37 np0005541913.localdomain sudo[123922]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:38 np0005541913.localdomain sudo[124030]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhmrwdijjnptclusabaolhjhfkjmdfud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666998.4579701-1489-111018147114725/AnsiballZ_command.py
Dec 02 09:16:38 np0005541913.localdomain sudo[124030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:38 np0005541913.localdomain python3.9[124032]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:16:38 np0005541913.localdomain sudo[124030]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:39 np0005541913.localdomain sudo[124123]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vaosqeljzdreivjsptviuqmyrvcbimsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666999.162293-1513-119412553321545/AnsiballZ_command.py
Dec 02 09:16:39 np0005541913.localdomain sudo[124123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:39 np0005541913.localdomain python3.9[124125]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:16:39 np0005541913.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Dec 02 09:16:39 np0005541913.localdomain sudo[124123]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62114 DF PROTO=TCP SPT=55758 DPT=9100 SEQ=4126746310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47891AA40000000001030307) 
Dec 02 09:16:40 np0005541913.localdomain sudo[124216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqzuhiytqacmnridhkoqcsxlytvekcsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764666999.876394-1537-233423468296906/AnsiballZ_command.py
Dec 02 09:16:40 np0005541913.localdomain sudo[124216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:40 np0005541913.localdomain python3.9[124218]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:16:40 np0005541913.localdomain sudo[124222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:16:40 np0005541913.localdomain sudo[124222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:16:40 np0005541913.localdomain sudo[124222]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:41 np0005541913.localdomain sudo[124216]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:42 np0005541913.localdomain sudo[124330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xljkvtbgzdhpwkwxmeufbaioestpimyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667001.8494616-1562-151745366530489/AnsiballZ_command.py
Dec 02 09:16:42 np0005541913.localdomain sudo[124330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:42 np0005541913.localdomain python3.9[124332]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:16:42 np0005541913.localdomain sudo[124330]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:42 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10758 DF PROTO=TCP SPT=51852 DPT=9100 SEQ=659875632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478925E40000000001030307) 
Dec 02 09:16:43 np0005541913.localdomain sudo[124423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngkxkvdfnyeqgjwljtbuymouvswhkfky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667003.6250322-1585-95251378762305/AnsiballZ_systemd.py
Dec 02 09:16:43 np0005541913.localdomain sudo[124423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:44 np0005541913.localdomain python3.9[124425]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:16:45 np0005541913.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 02 09:16:45 np0005541913.localdomain systemd[1]: Stopped Apply Kernel Variables.
Dec 02 09:16:45 np0005541913.localdomain systemd[1]: Stopping Apply Kernel Variables...
Dec 02 09:16:45 np0005541913.localdomain systemd[1]: Starting Apply Kernel Variables...
Dec 02 09:16:45 np0005541913.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 02 09:16:45 np0005541913.localdomain systemd[1]: Finished Apply Kernel Variables.
Dec 02 09:16:45 np0005541913.localdomain sudo[124423]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:45 np0005541913.localdomain sshd[118201]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:16:45 np0005541913.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Dec 02 09:16:45 np0005541913.localdomain systemd[1]: session-39.scope: Consumed 1min 57.123s CPU time.
Dec 02 09:16:45 np0005541913.localdomain systemd-logind[757]: Session 39 logged out. Waiting for processes to exit.
Dec 02 09:16:45 np0005541913.localdomain systemd-logind[757]: Removed session 39.
Dec 02 09:16:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62116 DF PROTO=TCP SPT=55758 DPT=9100 SEQ=4126746310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478932640000000001030307) 
Dec 02 09:16:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34444 DF PROTO=TCP SPT=35194 DPT=9102 SEQ=3571250042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47893DE40000000001030307) 
Dec 02 09:16:52 np0005541913.localdomain sshd[124446]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:16:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22365 DF PROTO=TCP SPT=33142 DPT=9101 SEQ=371735630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47894A010000000001030307) 
Dec 02 09:16:52 np0005541913.localdomain sshd[124446]: Accepted publickey for zuul from 192.168.122.30 port 52422 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:16:52 np0005541913.localdomain systemd-logind[757]: New session 40 of user zuul.
Dec 02 09:16:52 np0005541913.localdomain systemd[1]: Started Session 40 of User zuul.
Dec 02 09:16:52 np0005541913.localdomain sshd[124446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:16:53 np0005541913.localdomain python3.9[124539]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:16:55 np0005541913.localdomain python3.9[124633]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:16:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22367 DF PROTO=TCP SPT=33142 DPT=9101 SEQ=371735630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478956250000000001030307) 
Dec 02 09:16:56 np0005541913.localdomain sudo[124727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykvigedqnxfvrezwzfxxjoqkvrnkoase ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667015.9666376-111-190350014047875/AnsiballZ_command.py
Dec 02 09:16:56 np0005541913.localdomain sudo[124727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:56 np0005541913.localdomain python3.9[124729]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:16:56 np0005541913.localdomain sudo[124727]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:57 np0005541913.localdomain python3.9[124820]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:16:58 np0005541913.localdomain sudo[124914]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztbazoyjnwgvxpwqipmekiikimtyyfty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667018.037254-171-161047419559565/AnsiballZ_setup.py
Dec 02 09:16:58 np0005541913.localdomain sudo[124914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:58 np0005541913.localdomain python3.9[124916]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:16:58 np0005541913.localdomain sudo[124914]: pam_unix(sudo:session): session closed for user root
Dec 02 09:16:59 np0005541913.localdomain sudo[124968]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiywkvbmpnwrkdxiaegcnzdouqrtehlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667018.037254-171-161047419559565/AnsiballZ_dnf.py
Dec 02 09:16:59 np0005541913.localdomain sudo[124968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:16:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22368 DF PROTO=TCP SPT=33142 DPT=9101 SEQ=371735630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478965E40000000001030307) 
Dec 02 09:16:59 np0005541913.localdomain python3.9[124970]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:17:02 np0005541913.localdomain sudo[124968]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:03 np0005541913.localdomain sudo[125062]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsvmtqkbynyrhujufkdxsezuzcsnmpaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667023.2985413-207-90679433372076/AnsiballZ_setup.py
Dec 02 09:17:03 np0005541913.localdomain sudo[125062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:03 np0005541913.localdomain python3.9[125064]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:17:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57118 DF PROTO=TCP SPT=59636 DPT=9102 SEQ=2262952419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478977BE0000000001030307) 
Dec 02 09:17:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26803 DF PROTO=TCP SPT=38970 DPT=9105 SEQ=3931880294 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789783F0000000001030307) 
Dec 02 09:17:04 np0005541913.localdomain sudo[125062]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:05 np0005541913.localdomain sudo[125217]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eahoodcqffokxilseharnanmxecdnbxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667025.0232081-240-2651767602725/AnsiballZ_file.py
Dec 02 09:17:05 np0005541913.localdomain sudo[125217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:05 np0005541913.localdomain python3.9[125219]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:17:05 np0005541913.localdomain sudo[125217]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:06 np0005541913.localdomain sudo[125309]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydwaiuoexyypnkmmxzsuvgimyaorldnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667025.8545976-264-126639367208074/AnsiballZ_command.py
Dec 02 09:17:06 np0005541913.localdomain sudo[125309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:06 np0005541913.localdomain python3.9[125311]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:17:06 np0005541913.localdomain sudo[125309]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57120 DF PROTO=TCP SPT=59636 DPT=9102 SEQ=2262952419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478983E40000000001030307) 
Dec 02 09:17:07 np0005541913.localdomain sudo[125414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjvbbvzqzisgfvjbltztpdluigcllqgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667026.8766663-288-107330103166306/AnsiballZ_stat.py
Dec 02 09:17:07 np0005541913.localdomain sudo[125414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:07 np0005541913.localdomain python3.9[125416]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:17:07 np0005541913.localdomain sudo[125414]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:07 np0005541913.localdomain sudo[125462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqhucvnzeddbturgkyclzacoiilyyltz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667026.8766663-288-107330103166306/AnsiballZ_file.py
Dec 02 09:17:07 np0005541913.localdomain sudo[125462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:07 np0005541913.localdomain python3.9[125464]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:17:07 np0005541913.localdomain sudo[125462]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:08 np0005541913.localdomain sudo[125555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oikkhhefnnopgsqfyghnxtjhrclukwse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667028.1348813-324-192139157947119/AnsiballZ_stat.py
Dec 02 09:17:08 np0005541913.localdomain sudo[125555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:08 np0005541913.localdomain python3.9[125557]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:17:08 np0005541913.localdomain sudo[125555]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:09 np0005541913.localdomain sudo[125628]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhajmanlqwzrqozbwykootyhdtrfcfnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667028.1348813-324-192139157947119/AnsiballZ_copy.py
Dec 02 09:17:09 np0005541913.localdomain sudo[125628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:09 np0005541913.localdomain python3.9[125630]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667028.1348813-324-192139157947119/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:17:09 np0005541913.localdomain sudo[125628]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:10 np0005541913.localdomain sudo[125720]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eivwllbkssaulwxjwkxnnpmevllqweto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667029.623828-372-264150080473321/AnsiballZ_ini_file.py
Dec 02 09:17:10 np0005541913.localdomain sudo[125720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33945 DF PROTO=TCP SPT=41010 DPT=9100 SEQ=2615179156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47898FA40000000001030307) 
Dec 02 09:17:10 np0005541913.localdomain python3.9[125722]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:17:10 np0005541913.localdomain sudo[125720]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:10 np0005541913.localdomain systemd-journald[47611]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Dec 02 09:17:10 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 09:17:10 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:17:10 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:17:10 np0005541913.localdomain sudo[125813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfllxjmzuuwybqhdxjdpcuhjkofrffqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667030.398329-372-262381897953528/AnsiballZ_ini_file.py
Dec 02 09:17:10 np0005541913.localdomain sudo[125813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:17:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 4776 writes, 21K keys, 4776 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4776 writes, 569 syncs, 8.39 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:17:10 np0005541913.localdomain python3.9[125815]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:17:10 np0005541913.localdomain sudo[125813]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:11 np0005541913.localdomain sudo[125905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxzoerratrmnztppujqznlvfyvgdjtue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667030.971395-372-153839742152109/AnsiballZ_ini_file.py
Dec 02 09:17:11 np0005541913.localdomain sudo[125905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:11 np0005541913.localdomain python3.9[125907]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:17:11 np0005541913.localdomain sudo[125905]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:12 np0005541913.localdomain sudo[125997]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uivooffxtczelhnvemvqdcggyuhsbrau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667031.7900822-372-133182461891678/AnsiballZ_ini_file.py
Dec 02 09:17:12 np0005541913.localdomain sudo[125997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:12 np0005541913.localdomain python3.9[125999]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:17:12 np0005541913.localdomain sudo[125997]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27647 DF PROTO=TCP SPT=46044 DPT=9882 SEQ=2056543486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47899BE40000000001030307) 
Dec 02 09:17:13 np0005541913.localdomain python3.9[126089]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:17:13 np0005541913.localdomain sudo[126181]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsmacufsdlfupzlaedmipytatvmxhttt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667033.4604683-492-169640223979204/AnsiballZ_dnf.py
Dec 02 09:17:13 np0005541913.localdomain sudo[126181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:13 np0005541913.localdomain python3.9[126183]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:17:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:17:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.2 total, 600.0 interval
                                                          Cumulative writes: 5722 writes, 25K keys, 5722 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5722 writes, 780 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:17:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33947 DF PROTO=TCP SPT=41010 DPT=9100 SEQ=2615179156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789A7640000000001030307) 
Dec 02 09:17:17 np0005541913.localdomain sudo[126181]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:17 np0005541913.localdomain sudo[126275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkqohbvqqsvazeoawlvkipqbnmsmfdex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667037.3374686-516-37406088040885/AnsiballZ_dnf.py
Dec 02 09:17:17 np0005541913.localdomain sudo[126275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:17 np0005541913.localdomain python3.9[126277]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:17:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57122 DF PROTO=TCP SPT=59636 DPT=9102 SEQ=2262952419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789B3E40000000001030307) 
Dec 02 09:17:20 np0005541913.localdomain sudo[126275]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:21 np0005541913.localdomain sudo[126369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkzbocbeexxsmzsdviakcvpmmkflvzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667041.4851074-546-247025699804869/AnsiballZ_dnf.py
Dec 02 09:17:21 np0005541913.localdomain sudo[126369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:22 np0005541913.localdomain python3.9[126371]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:17:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61837 DF PROTO=TCP SPT=41094 DPT=9101 SEQ=2326005047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789BF2F0000000001030307) 
Dec 02 09:17:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61839 DF PROTO=TCP SPT=41094 DPT=9101 SEQ=2326005047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789CB240000000001030307) 
Dec 02 09:17:25 np0005541913.localdomain sudo[126369]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:26 np0005541913.localdomain sudo[126469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmwdonxonkpommnlfdgcedeqzfmfjxnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667045.805786-573-104610463906881/AnsiballZ_dnf.py
Dec 02 09:17:26 np0005541913.localdomain sudo[126469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:26 np0005541913.localdomain python3.9[126471]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:17:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61840 DF PROTO=TCP SPT=41094 DPT=9101 SEQ=2326005047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789DAE40000000001030307) 
Dec 02 09:17:29 np0005541913.localdomain sudo[126469]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:30 np0005541913.localdomain sudo[126563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muhzfgkuqxlmisixeofvwoncmbherwpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667049.980616-609-136224516269145/AnsiballZ_dnf.py
Dec 02 09:17:30 np0005541913.localdomain sudo[126563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:30 np0005541913.localdomain python3.9[126565]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:17:33 np0005541913.localdomain sudo[126563]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35409 DF PROTO=TCP SPT=54814 DPT=9102 SEQ=1314632820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789ECEE0000000001030307) 
Dec 02 09:17:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60828 DF PROTO=TCP SPT=32992 DPT=9105 SEQ=474927710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789ED6E0000000001030307) 
Dec 02 09:17:34 np0005541913.localdomain sudo[126657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ottkjfigeuegqgakvuepnwjnwyyvqcxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667054.0048418-636-77574968294999/AnsiballZ_dnf.py
Dec 02 09:17:34 np0005541913.localdomain sudo[126657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:34 np0005541913.localdomain python3.9[126659]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:17:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35411 DF PROTO=TCP SPT=54814 DPT=9102 SEQ=1314632820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4789F8E50000000001030307) 
Dec 02 09:17:37 np0005541913.localdomain sudo[126657]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:38 np0005541913.localdomain sudo[126751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fryrcbokosrhqyzszfvmtufwnagthcrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667058.5715032-663-252847731585823/AnsiballZ_dnf.py
Dec 02 09:17:38 np0005541913.localdomain sudo[126751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:39 np0005541913.localdomain python3.9[126753]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:17:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57878 DF PROTO=TCP SPT=36960 DPT=9100 SEQ=3294421962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A04E50000000001030307) 
Dec 02 09:17:40 np0005541913.localdomain sudo[126756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:17:40 np0005541913.localdomain sudo[126756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:17:40 np0005541913.localdomain sudo[126756]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:40 np0005541913.localdomain sudo[126771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:17:40 np0005541913.localdomain sudo[126771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:17:41 np0005541913.localdomain sudo[126771]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:41 np0005541913.localdomain sudo[126817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:17:41 np0005541913.localdomain sudo[126817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:17:41 np0005541913.localdomain sudo[126817]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:41 np0005541913.localdomain sudo[126832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- inventory --format=json-pretty --filter-for-batch
Dec 02 09:17:41 np0005541913.localdomain sudo[126832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:17:42 np0005541913.localdomain podman[126889]: 
Dec 02 09:17:42 np0005541913.localdomain podman[126889]: 2025-12-02 09:17:42.56454445 +0000 UTC m=+0.082199293 container create 0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_thompson, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, version=7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:17:42 np0005541913.localdomain systemd[1]: Started libpod-conmon-0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2.scope.
Dec 02 09:17:42 np0005541913.localdomain podman[126889]: 2025-12-02 09:17:42.529741281 +0000 UTC m=+0.047396214 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:17:42 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:17:42 np0005541913.localdomain podman[126889]: 2025-12-02 09:17:42.652119366 +0000 UTC m=+0.169774219 container init 0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_thompson, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, architecture=x86_64, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Dec 02 09:17:42 np0005541913.localdomain systemd[1]: tmp-crun.TpO52g.mount: Deactivated successfully.
Dec 02 09:17:42 np0005541913.localdomain podman[126889]: 2025-12-02 09:17:42.665301263 +0000 UTC m=+0.182956106 container start 0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_thompson, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7)
Dec 02 09:17:42 np0005541913.localdomain podman[126889]: 2025-12-02 09:17:42.665790316 +0000 UTC m=+0.183445169 container attach 0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_thompson, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, build-date=2025-11-26T19:44:28Z)
Dec 02 09:17:42 np0005541913.localdomain interesting_thompson[126909]: 167 167
Dec 02 09:17:42 np0005541913.localdomain systemd[1]: libpod-0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2.scope: Deactivated successfully.
Dec 02 09:17:42 np0005541913.localdomain podman[126889]: 2025-12-02 09:17:42.66970222 +0000 UTC m=+0.187357103 container died 0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_thompson, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhceph, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:17:42 np0005541913.localdomain podman[126914]: 2025-12-02 09:17:42.776119863 +0000 UTC m=+0.092858605 container remove 0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_thompson, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_CLEAN=True, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:17:42 np0005541913.localdomain systemd[1]: libpod-conmon-0a7d769e135bdc49e57aeba28a6b7db501fa30678c8512d63b3db6043332eeb2.scope: Deactivated successfully.
Dec 02 09:17:42 np0005541913.localdomain podman[126937]: 
Dec 02 09:17:42 np0005541913.localdomain podman[126937]: 2025-12-02 09:17:42.991935887 +0000 UTC m=+0.062825892 container create 584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_blackwell, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, build-date=2025-11-26T19:44:28Z, RELEASE=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:17:43 np0005541913.localdomain systemd[1]: Started libpod-conmon-584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83.scope.
Dec 02 09:17:43 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:17:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09380d250f89949106767e72a6b6f29b2ae1bf10c044a8bbf99723f4c9324e41/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 09:17:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09380d250f89949106767e72a6b6f29b2ae1bf10c044a8bbf99723f4c9324e41/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 09:17:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09380d250f89949106767e72a6b6f29b2ae1bf10c044a8bbf99723f4c9324e41/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 09:17:43 np0005541913.localdomain podman[126937]: 2025-12-02 09:17:42.961666517 +0000 UTC m=+0.032556562 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:17:43 np0005541913.localdomain podman[126937]: 2025-12-02 09:17:43.064379382 +0000 UTC m=+0.135269397 container init 584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_blackwell, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-type=git, name=rhceph, release=1763362218, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, distribution-scope=public)
Dec 02 09:17:43 np0005541913.localdomain podman[126937]: 2025-12-02 09:17:43.074593472 +0000 UTC m=+0.145483477 container start 584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_blackwell, release=1763362218, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Dec 02 09:17:43 np0005541913.localdomain podman[126937]: 2025-12-02 09:17:43.074873809 +0000 UTC m=+0.145763844 container attach 584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_blackwell, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, architecture=x86_64, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:17:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33396 DF PROTO=TCP SPT=41688 DPT=9882 SEQ=2959870546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A10E50000000001030307) 
Dec 02 09:17:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a223cc482315e6f5ca6ed7f89a417a449a6873319cee807130f2dbd9f2404549-merged.mount: Deactivated successfully.
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]: [
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:     {
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         "available": false,
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         "ceph_device": false,
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         "lsm_data": {},
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         "lvs": [],
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         "path": "/dev/sr0",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         "rejected_reasons": [
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "Insufficient space (<5GB)",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "Has a FileSystem"
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         ],
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         "sys_api": {
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "actuators": null,
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "device_nodes": "sr0",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "human_readable_size": "482.00 KB",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "id_bus": "ata",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "model": "QEMU DVD-ROM",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "nr_requests": "2",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "partitions": {},
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "path": "/dev/sr0",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "removable": "1",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "rev": "2.5+",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "ro": "0",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "rotational": "1",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "sas_address": "",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "sas_device_handle": "",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "scheduler_mode": "mq-deadline",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "sectors": 0,
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "sectorsize": "2048",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "size": 493568.0,
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "support_discard": "0",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "type": "disk",
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:             "vendor": "QEMU"
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:         }
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]:     }
Dec 02 09:17:43 np0005541913.localdomain beautiful_blackwell[126952]: ]
Dec 02 09:17:43 np0005541913.localdomain systemd[1]: libpod-584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83.scope: Deactivated successfully.
Dec 02 09:17:43 np0005541913.localdomain podman[126937]: 2025-12-02 09:17:43.959741648 +0000 UTC m=+1.030631703 container died 584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_blackwell, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_CLEAN=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container)
Dec 02 09:17:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-09380d250f89949106767e72a6b6f29b2ae1bf10c044a8bbf99723f4c9324e41-merged.mount: Deactivated successfully.
Dec 02 09:17:44 np0005541913.localdomain podman[128377]: 2025-12-02 09:17:44.042649369 +0000 UTC m=+0.071703806 container remove 584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_blackwell, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Dec 02 09:17:44 np0005541913.localdomain systemd[1]: libpod-conmon-584cf8401d3577a809601fc10ecf5badb1bca48e7e859e3b601e9f6e8b6cfd83.scope: Deactivated successfully.
Dec 02 09:17:44 np0005541913.localdomain sudo[126832]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:44 np0005541913.localdomain sudo[128392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:17:44 np0005541913.localdomain sudo[128392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:17:44 np0005541913.localdomain sudo[128392]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57880 DF PROTO=TCP SPT=36960 DPT=9100 SEQ=3294421962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A1CA40000000001030307) 
Dec 02 09:17:49 np0005541913.localdomain sudo[126751]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35413 DF PROTO=TCP SPT=54814 DPT=9102 SEQ=1314632820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A29E40000000001030307) 
Dec 02 09:17:50 np0005541913.localdomain sudo[128563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muwiwuimwxdhzonwjwbafrgnvlzxiyxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667070.1144593-699-135479334661930/AnsiballZ_file.py
Dec 02 09:17:50 np0005541913.localdomain sudo[128563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:50 np0005541913.localdomain python3.9[128565]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:17:50 np0005541913.localdomain sudo[128563]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:51 np0005541913.localdomain sudo[128668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxcccqoilacxvdpfimhwcbjfgbpnddlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667070.8009062-723-36403732161604/AnsiballZ_stat.py
Dec 02 09:17:51 np0005541913.localdomain sudo[128668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:51 np0005541913.localdomain python3.9[128670]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:17:51 np0005541913.localdomain sudo[128668]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:51 np0005541913.localdomain sudo[128741]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oijgkfsevwxchnyiqahjbtshprjxhenc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667070.8009062-723-36403732161604/AnsiballZ_copy.py
Dec 02 09:17:51 np0005541913.localdomain sudo[128741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:52 np0005541913.localdomain python3.9[128743]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764667070.8009062-723-36403732161604/.source.json _original_basename=.q_s3xwg4 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:17:52 np0005541913.localdomain sudo[128741]: pam_unix(sudo:session): session closed for user root
Dec 02 09:17:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58831 DF PROTO=TCP SPT=47950 DPT=9101 SEQ=281279711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A34600000000001030307) 
Dec 02 09:17:52 np0005541913.localdomain sudo[128833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvrezbktrpkvtkrplaucqzgtvdglnvqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667072.4549005-777-79287921487826/AnsiballZ_podman_image.py
Dec 02 09:17:52 np0005541913.localdomain sudo[128833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:17:53 np0005541913.localdomain python3.9[128835]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 09:17:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58833 DF PROTO=TCP SPT=47950 DPT=9101 SEQ=281279711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A40640000000001030307) 
Dec 02 09:17:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58834 DF PROTO=TCP SPT=47950 DPT=9101 SEQ=281279711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A50240000000001030307) 
Dec 02 09:18:00 np0005541913.localdomain podman[128847]: 2025-12-02 09:17:53.266354978 +0000 UTC m=+0.039647759 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 02 09:18:01 np0005541913.localdomain sudo[128833]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:02 np0005541913.localdomain sudo[129047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojqavubpnsjbqatciqwvvvrkxymrwkzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667082.2368653-810-136874128425756/AnsiballZ_podman_image.py
Dec 02 09:18:02 np0005541913.localdomain sudo[129047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:02 np0005541913.localdomain python3.9[129049]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 09:18:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49408 DF PROTO=TCP SPT=48156 DPT=9102 SEQ=768687781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A621D0000000001030307) 
Dec 02 09:18:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20611 DF PROTO=TCP SPT=40694 DPT=9105 SEQ=2119834166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A629F0000000001030307) 
Dec 02 09:18:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49410 DF PROTO=TCP SPT=48156 DPT=9102 SEQ=768687781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A6E240000000001030307) 
Dec 02 09:18:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27650 DF PROTO=TCP SPT=46044 DPT=9882 SEQ=2056543486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A79E50000000001030307) 
Dec 02 09:18:10 np0005541913.localdomain systemd[1]: Starting dnf makecache...
Dec 02 09:18:10 np0005541913.localdomain dnf[129112]: Updating Subscription Management repositories.
Dec 02 09:18:12 np0005541913.localdomain podman[129061]: 2025-12-02 09:18:02.867597986 +0000 UTC m=+0.050744342 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 09:18:12 np0005541913.localdomain sudo[129047]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:12 np0005541913.localdomain dnf[129112]: Metadata cache refreshed recently.
Dec 02 09:18:13 np0005541913.localdomain sudo[129261]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ontxhwznniuhwikkgfycmgsmhbrnudfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667092.8305385-846-198226184735371/AnsiballZ_podman_image.py
Dec 02 09:18:13 np0005541913.localdomain sudo[129261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33950 DF PROTO=TCP SPT=41010 DPT=9100 SEQ=2615179156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A85E50000000001030307) 
Dec 02 09:18:13 np0005541913.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 02 09:18:13 np0005541913.localdomain systemd[1]: Finished dnf makecache.
Dec 02 09:18:13 np0005541913.localdomain systemd[1]: dnf-makecache.service: Consumed 2.546s CPU time.
Dec 02 09:18:13 np0005541913.localdomain python3.9[129263]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 09:18:14 np0005541913.localdomain podman[129276]: 2025-12-02 09:18:13.401221417 +0000 UTC m=+0.029232963 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 02 09:18:15 np0005541913.localdomain sudo[129261]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:16 np0005541913.localdomain sudo[129438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyfokewecbeneikowrskulletnkgwlze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667095.8762362-873-222106246608650/AnsiballZ_podman_image.py
Dec 02 09:18:16 np0005541913.localdomain sudo[129438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37301 DF PROTO=TCP SPT=57024 DPT=9100 SEQ=2549518513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A91E40000000001030307) 
Dec 02 09:18:16 np0005541913.localdomain python3.9[129440]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 09:18:17 np0005541913.localdomain podman[129453]: 2025-12-02 09:18:16.449342086 +0000 UTC m=+0.027142699 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 09:18:18 np0005541913.localdomain sudo[129438]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:18 np0005541913.localdomain sudo[129614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqxzziduzinzsvovjzluytocwslpxxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667098.7240477-900-115532084447971/AnsiballZ_podman_image.py
Dec 02 09:18:18 np0005541913.localdomain sudo[129614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:19 np0005541913.localdomain python3.9[129616]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 09:18:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20615 DF PROTO=TCP SPT=40694 DPT=9105 SEQ=2119834166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478A9DE40000000001030307) 
Dec 02 09:18:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=436 DF PROTO=TCP SPT=47336 DPT=9101 SEQ=103032161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478AA9900000000001030307) 
Dec 02 09:18:22 np0005541913.localdomain podman[129630]: 2025-12-02 09:18:19.318044479 +0000 UTC m=+0.035934311 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 02 09:18:22 np0005541913.localdomain sudo[129614]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:23 np0005541913.localdomain sudo[129807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dafarbdicictohzljrgwxvtymxgbihtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667103.0446305-900-248674408309440/AnsiballZ_podman_image.py
Dec 02 09:18:23 np0005541913.localdomain sudo[129807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:23 np0005541913.localdomain python3.9[129809]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 09:18:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=438 DF PROTO=TCP SPT=47336 DPT=9101 SEQ=103032161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478AB5A50000000001030307) 
Dec 02 09:18:25 np0005541913.localdomain podman[129822]: 2025-12-02 09:18:23.656698488 +0000 UTC m=+0.044917658 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 02 09:18:25 np0005541913.localdomain sudo[129807]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:27 np0005541913.localdomain sshd[124446]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:18:27 np0005541913.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Dec 02 09:18:27 np0005541913.localdomain systemd[1]: session-40.scope: Consumed 1min 34.098s CPU time.
Dec 02 09:18:27 np0005541913.localdomain systemd-logind[757]: Session 40 logged out. Waiting for processes to exit.
Dec 02 09:18:27 np0005541913.localdomain systemd-logind[757]: Removed session 40.
Dec 02 09:18:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=439 DF PROTO=TCP SPT=47336 DPT=9101 SEQ=103032161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478AC5650000000001030307) 
Dec 02 09:18:33 np0005541913.localdomain sshd[130353]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:18:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2138 DF PROTO=TCP SPT=36282 DPT=9102 SEQ=3021099205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478AD74D0000000001030307) 
Dec 02 09:18:33 np0005541913.localdomain sshd[130353]: Accepted publickey for zuul from 192.168.122.30 port 35558 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:18:33 np0005541913.localdomain systemd-logind[757]: New session 41 of user zuul.
Dec 02 09:18:33 np0005541913.localdomain systemd[1]: Started Session 41 of User zuul.
Dec 02 09:18:34 np0005541913.localdomain sshd[130353]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:18:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19656 DF PROTO=TCP SPT=57260 DPT=9105 SEQ=2171853980 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478AD7CE0000000001030307) 
Dec 02 09:18:36 np0005541913.localdomain python3.9[130446]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:18:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2140 DF PROTO=TCP SPT=36282 DPT=9102 SEQ=3021099205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478AE3650000000001030307) 
Dec 02 09:18:37 np0005541913.localdomain sudo[130540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrfpfoygmjqbrwledicqjtlqbwvxfcob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667116.896834-69-38141353950977/AnsiballZ_getent.py
Dec 02 09:18:37 np0005541913.localdomain sudo[130540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:38 np0005541913.localdomain python3.9[130542]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 02 09:18:38 np0005541913.localdomain sudo[130540]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:38 np0005541913.localdomain sudo[130633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyaehzvsdjidjwzmrrrmbhkrbgqcrmtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667118.5868442-105-261653491844181/AnsiballZ_setup.py
Dec 02 09:18:38 np0005541913.localdomain sudo[130633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:39 np0005541913.localdomain python3.9[130635]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:18:39 np0005541913.localdomain sudo[130633]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:39 np0005541913.localdomain sudo[130687]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysgrojviydiskayemngshsanzkxjpaob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667118.5868442-105-261653491844181/AnsiballZ_dnf.py
Dec 02 09:18:39 np0005541913.localdomain sudo[130687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11567 DF PROTO=TCP SPT=53918 DPT=9100 SEQ=565826996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478AEF240000000001030307) 
Dec 02 09:18:40 np0005541913.localdomain python3.9[130689]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:18:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10262 DF PROTO=TCP SPT=45458 DPT=9882 SEQ=2592248336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478AFB640000000001030307) 
Dec 02 09:18:43 np0005541913.localdomain sudo[130687]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:45 np0005541913.localdomain sudo[130706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:18:45 np0005541913.localdomain sudo[130706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:18:45 np0005541913.localdomain sudo[130706]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:45 np0005541913.localdomain sudo[130734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:18:45 np0005541913.localdomain sudo[130734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:18:45 np0005541913.localdomain sudo[130811]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goqotmganigejxnyyknxyufjudeutzpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667125.0984256-147-165578851030793/AnsiballZ_dnf.py
Dec 02 09:18:45 np0005541913.localdomain sudo[130811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:45 np0005541913.localdomain python3.9[130813]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:18:45 np0005541913.localdomain sudo[130734]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11569 DF PROTO=TCP SPT=53918 DPT=9100 SEQ=565826996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B06E40000000001030307) 
Dec 02 09:18:48 np0005541913.localdomain sudo[130811]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:48 np0005541913.localdomain sudo[130861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:18:48 np0005541913.localdomain sudo[130861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:18:48 np0005541913.localdomain sudo[130861]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19660 DF PROTO=TCP SPT=57260 DPT=9105 SEQ=2171853980 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B13E40000000001030307) 
Dec 02 09:18:49 np0005541913.localdomain sudo[130951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maurtcnopknbbdswkhunjgtspsupbpwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667129.0950596-171-107134039386996/AnsiballZ_systemd.py
Dec 02 09:18:49 np0005541913.localdomain sudo[130951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:49 np0005541913.localdomain python3.9[130953]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:18:49 np0005541913.localdomain sudo[130951]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:51 np0005541913.localdomain python3.9[131046]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:18:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29055 DF PROTO=TCP SPT=58352 DPT=9101 SEQ=3716116661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B1EC00000000001030307) 
Dec 02 09:18:52 np0005541913.localdomain sudo[131136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgcnxgyencdogqpdirrzxjbmmjljjers ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667132.1472626-225-57800308843010/AnsiballZ_sefcontext.py
Dec 02 09:18:52 np0005541913.localdomain sudo[131136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:52 np0005541913.localdomain python3.9[131138]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 02 09:18:53 np0005541913.localdomain kernel: SELinux:  Converting 2756 SID table entries...
Dec 02 09:18:53 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:18:53 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:18:53 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:18:53 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:18:53 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:18:53 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:18:53 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:18:53 np0005541913.localdomain sudo[131136]: pam_unix(sudo:session): session closed for user root
Dec 02 09:18:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29057 DF PROTO=TCP SPT=58352 DPT=9101 SEQ=3716116661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B2AE40000000001030307) 
Dec 02 09:18:55 np0005541913.localdomain python3.9[131233]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:18:56 np0005541913.localdomain sudo[131329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhzftiukbniyaxsltslfqeocvgkefzjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667135.864994-279-49389776439008/AnsiballZ_dnf.py
Dec 02 09:18:56 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Dec 02 09:18:56 np0005541913.localdomain sudo[131329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:18:56 np0005541913.localdomain python3.9[131331]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:18:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29058 DF PROTO=TCP SPT=58352 DPT=9101 SEQ=3716116661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B3AA40000000001030307) 
Dec 02 09:18:59 np0005541913.localdomain sudo[131329]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:02 np0005541913.localdomain sudo[131423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zojrsipbjvchpsdasioqzprbycpdljgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667141.5941048-303-75149892265524/AnsiballZ_command.py
Dec 02 09:19:02 np0005541913.localdomain sudo[131423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:02 np0005541913.localdomain python3.9[131425]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:19:02 np0005541913.localdomain sudo[131423]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:03 np0005541913.localdomain sudo[131668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waavlevqxplhivdgfvzxmarakzdmpirr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667143.1416643-327-42392159630822/AnsiballZ_file.py
Dec 02 09:19:03 np0005541913.localdomain sudo[131668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:03 np0005541913.localdomain python3.9[131670]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 09:19:03 np0005541913.localdomain sudo[131668]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45912 DF PROTO=TCP SPT=36228 DPT=9102 SEQ=2758390445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B4C7E0000000001030307) 
Dec 02 09:19:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43444 DF PROTO=TCP SPT=40744 DPT=9105 SEQ=2968734845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B4CFE0000000001030307) 
Dec 02 09:19:04 np0005541913.localdomain python3.9[131760]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:19:05 np0005541913.localdomain sudo[131852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkryrhddugbcvcdiudzxzjabftsrwemj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667144.8015165-381-188959248918857/AnsiballZ_dnf.py
Dec 02 09:19:05 np0005541913.localdomain sudo[131852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:05 np0005541913.localdomain python3.9[131854]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:19:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45914 DF PROTO=TCP SPT=36228 DPT=9102 SEQ=2758390445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B58A40000000001030307) 
Dec 02 09:19:09 np0005541913.localdomain sudo[131852]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:09 np0005541913.localdomain sudo[131946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mszwkxwurrcuimwvpbbylpsvbapphcgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667149.2365897-405-151639583833140/AnsiballZ_dnf.py
Dec 02 09:19:09 np0005541913.localdomain sudo[131946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:09 np0005541913.localdomain python3.9[131948]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:19:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60266 DF PROTO=TCP SPT=53026 DPT=9100 SEQ=461185822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B64640000000001030307) 
Dec 02 09:19:12 np0005541913.localdomain sudo[131946]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:12 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37304 DF PROTO=TCP SPT=57024 DPT=9100 SEQ=2549518513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B6FE40000000001030307) 
Dec 02 09:19:13 np0005541913.localdomain sudo[132040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfoqwlrqpnfpihlirxhoeanskrvnipdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667153.114194-429-138047191474904/AnsiballZ_systemd.py
Dec 02 09:19:13 np0005541913.localdomain sudo[132040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:13 np0005541913.localdomain python3.9[132042]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 09:19:13 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:19:13 np0005541913.localdomain systemd-rc-local-generator[132068]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:19:13 np0005541913.localdomain systemd-sysv-generator[132072]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:19:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:19:14 np0005541913.localdomain sudo[132040]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:14 np0005541913.localdomain sudo[132172]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnjjribuewibkolrlvwtuidoeticholr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667154.3988373-459-79964082026475/AnsiballZ_stat.py
Dec 02 09:19:14 np0005541913.localdomain sudo[132172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:14 np0005541913.localdomain python3.9[132174]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:19:14 np0005541913.localdomain sudo[132172]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:15 np0005541913.localdomain sudo[132264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuoobjccviopwffucnizgzflybyadgpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667155.069594-486-133973607877712/AnsiballZ_ini_file.py
Dec 02 09:19:15 np0005541913.localdomain sudo[132264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:15 np0005541913.localdomain python3.9[132266]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:15 np0005541913.localdomain sudo[132264]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60268 DF PROTO=TCP SPT=53026 DPT=9100 SEQ=461185822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B7C240000000001030307) 
Dec 02 09:19:16 np0005541913.localdomain sudo[132358]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uktqwchzggfsbxetexhrsfwnrjjtvliv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667155.9230108-510-3731239220633/AnsiballZ_ini_file.py
Dec 02 09:19:16 np0005541913.localdomain sudo[132358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:16 np0005541913.localdomain python3.9[132360]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:16 np0005541913.localdomain sudo[132358]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:16 np0005541913.localdomain sudo[132450]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdsihxugotagbnbskiuefgcimpjniqtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667156.5846362-534-36101466877406/AnsiballZ_ini_file.py
Dec 02 09:19:16 np0005541913.localdomain sudo[132450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:17 np0005541913.localdomain python3.9[132452]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:17 np0005541913.localdomain sudo[132450]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:17 np0005541913.localdomain sudo[132542]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flqwetmwovhcygcmpjtuyfltdfemhwog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667157.455903-564-54375218309095/AnsiballZ_stat.py
Dec 02 09:19:17 np0005541913.localdomain sudo[132542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:17 np0005541913.localdomain python3.9[132544]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:19:17 np0005541913.localdomain sudo[132542]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:18 np0005541913.localdomain sudo[132615]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bacwxuwccqgfksjiuczhgxcrzwihwxsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667157.455903-564-54375218309095/AnsiballZ_copy.py
Dec 02 09:19:18 np0005541913.localdomain sudo[132615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:18 np0005541913.localdomain python3.9[132617]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667157.455903-564-54375218309095/.source _original_basename=.d0hy13k0 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:18 np0005541913.localdomain sudo[132615]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:18 np0005541913.localdomain sudo[132707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntvaauukbeidnhqzdyrolcteqpvgicdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667158.7450297-609-128882606083874/AnsiballZ_file.py
Dec 02 09:19:18 np0005541913.localdomain sudo[132707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45916 DF PROTO=TCP SPT=36228 DPT=9102 SEQ=2758390445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B87E40000000001030307) 
Dec 02 09:19:19 np0005541913.localdomain python3.9[132709]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:19 np0005541913.localdomain sudo[132707]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:19 np0005541913.localdomain sudo[132799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhfulapnnpxqwlnievqjsgtxgbpodiqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667159.388268-633-269025245424455/AnsiballZ_edpm_os_net_config_mappings.py
Dec 02 09:19:19 np0005541913.localdomain sudo[132799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:20 np0005541913.localdomain python3.9[132801]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 02 09:19:20 np0005541913.localdomain sudo[132799]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:20 np0005541913.localdomain sudo[132891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhquidabqcwthukqaflkxxcikadmrads ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667160.2359445-660-145016304352428/AnsiballZ_file.py
Dec 02 09:19:20 np0005541913.localdomain sudo[132891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:20 np0005541913.localdomain python3.9[132893]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:20 np0005541913.localdomain sudo[132891]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:21 np0005541913.localdomain sudo[132983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxrtrciejizawvdihdsumlhprbiwucag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667161.1020353-690-105109381684194/AnsiballZ_stat.py
Dec 02 09:19:21 np0005541913.localdomain sudo[132983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:21 np0005541913.localdomain python3.9[132985]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:19:21 np0005541913.localdomain sudo[132983]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13548 DF PROTO=TCP SPT=49356 DPT=9101 SEQ=67537735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B93F00000000001030307) 
Dec 02 09:19:22 np0005541913.localdomain sudo[133056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhzpjteouxmolkzcnsdoxhmymdflopxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667161.1020353-690-105109381684194/AnsiballZ_copy.py
Dec 02 09:19:22 np0005541913.localdomain sudo[133056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:22 np0005541913.localdomain python3.9[133058]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667161.1020353-690-105109381684194/.source.yaml _original_basename=.f3eku8ka follow=False checksum=06d744ebe702728c19f6d1a8f97158d086012058 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:22 np0005541913.localdomain sudo[133056]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:23 np0005541913.localdomain sudo[133148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ziamagogyzsxtcfkuoitowczvtybadic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667162.9332724-735-187972947627776/AnsiballZ_slurp.py
Dec 02 09:19:23 np0005541913.localdomain sudo[133148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:23 np0005541913.localdomain python3.9[133150]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 02 09:19:23 np0005541913.localdomain sudo[133148]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13550 DF PROTO=TCP SPT=49356 DPT=9101 SEQ=67537735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478B9FE40000000001030307) 
Dec 02 09:19:25 np0005541913.localdomain sudo[133253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjdnfopgbnsbovpopdotpmiivekvzgio ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667164.793688-762-239923763327776/async_wrapper.py j210078628763 300 /home/zuul/.ansible/tmp/ansible-tmp-1764667164.793688-762-239923763327776/AnsiballZ_edpm_os_net_config.py _
Dec 02 09:19:25 np0005541913.localdomain sudo[133253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:25 np0005541913.localdomain ansible-async_wrapper.py[133255]: Invoked with j210078628763 300 /home/zuul/.ansible/tmp/ansible-tmp-1764667164.793688-762-239923763327776/AnsiballZ_edpm_os_net_config.py _
Dec 02 09:19:25 np0005541913.localdomain ansible-async_wrapper.py[133258]: Starting module and watcher
Dec 02 09:19:25 np0005541913.localdomain ansible-async_wrapper.py[133258]: Start watching 133259 (300)
Dec 02 09:19:25 np0005541913.localdomain ansible-async_wrapper.py[133259]: Start module (133259)
Dec 02 09:19:25 np0005541913.localdomain ansible-async_wrapper.py[133255]: Return async_wrapper task started.
Dec 02 09:19:25 np0005541913.localdomain sudo[133253]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:25 np0005541913.localdomain python3.9[133260]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Dec 02 09:19:26 np0005541913.localdomain ansible-async_wrapper.py[133259]: Module complete (133259)
Dec 02 09:19:29 np0005541913.localdomain sudo[133350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhqdzeikdtoewlvfevxrghnurhfwopra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667168.653562-762-99949204048520/AnsiballZ_async_status.py
Dec 02 09:19:29 np0005541913.localdomain sudo[133350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:29 np0005541913.localdomain python3.9[133352]: ansible-ansible.legacy.async_status Invoked with jid=j210078628763.133255 mode=status _async_dir=/root/.ansible_async
Dec 02 09:19:29 np0005541913.localdomain sudo[133350]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13551 DF PROTO=TCP SPT=49356 DPT=9101 SEQ=67537735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478BAFA40000000001030307) 
Dec 02 09:19:29 np0005541913.localdomain sudo[133409]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pueatkxpkuwjvpkgcuwcavxdsjgxrvvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667168.653562-762-99949204048520/AnsiballZ_async_status.py
Dec 02 09:19:29 np0005541913.localdomain sudo[133409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:29 np0005541913.localdomain python3.9[133411]: ansible-ansible.legacy.async_status Invoked with jid=j210078628763.133255 mode=cleanup _async_dir=/root/.ansible_async
Dec 02 09:19:29 np0005541913.localdomain sudo[133409]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:30 np0005541913.localdomain sudo[133501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tulhdjunehyhqknoqkjqxpwodjroehdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667170.1871982-828-73674880052935/AnsiballZ_stat.py
Dec 02 09:19:30 np0005541913.localdomain sudo[133501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:30 np0005541913.localdomain ansible-async_wrapper.py[133258]: Done in kid B.
Dec 02 09:19:30 np0005541913.localdomain python3.9[133503]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:19:30 np0005541913.localdomain sudo[133501]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:30 np0005541913.localdomain sudo[133574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwamcyspiqriyqfxsbdmidyirlwncawe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667170.1871982-828-73674880052935/AnsiballZ_copy.py
Dec 02 09:19:30 np0005541913.localdomain sudo[133574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:31 np0005541913.localdomain python3.9[133576]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667170.1871982-828-73674880052935/.source.returncode _original_basename=.s_xqdy0a follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:31 np0005541913.localdomain sudo[133574]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:31 np0005541913.localdomain sudo[133666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybmeujusanavwhhapaqqxaehiqfmfclt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667171.4304514-876-138685012028521/AnsiballZ_stat.py
Dec 02 09:19:31 np0005541913.localdomain sudo[133666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:31 np0005541913.localdomain python3.9[133668]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:19:31 np0005541913.localdomain sudo[133666]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:32 np0005541913.localdomain sudo[133739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbatpebdljxieuoymsooxcbqgvnkosqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667171.4304514-876-138685012028521/AnsiballZ_copy.py
Dec 02 09:19:32 np0005541913.localdomain sudo[133739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:33 np0005541913.localdomain python3.9[133741]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667171.4304514-876-138685012028521/.source.cfg _original_basename=.1slauv99 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:19:33 np0005541913.localdomain sudo[133739]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:33 np0005541913.localdomain sudo[133831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yanspebucmawupssiyigkctsxiqvmqua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667173.2126865-921-280897430605318/AnsiballZ_systemd.py
Dec 02 09:19:33 np0005541913.localdomain sudo[133831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:33 np0005541913.localdomain python3.9[133833]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:19:33 np0005541913.localdomain systemd[1]: Reloading Network Manager...
Dec 02 09:19:33 np0005541913.localdomain NetworkManager[5965]: <info>  [1764667173.8678] audit: op="reload" arg="0" pid=133837 uid=0 result="success"
Dec 02 09:19:33 np0005541913.localdomain NetworkManager[5965]: <info>  [1764667173.8690] config: signal: SIGHUP (no changes from disk)
Dec 02 09:19:33 np0005541913.localdomain systemd[1]: Reloaded Network Manager.
Dec 02 09:19:33 np0005541913.localdomain sudo[133831]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51444 DF PROTO=TCP SPT=49228 DPT=9102 SEQ=368648226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478BC1AE0000000001030307) 
Dec 02 09:19:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12819 DF PROTO=TCP SPT=34848 DPT=9105 SEQ=1210016272 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478BC22E0000000001030307) 
Dec 02 09:19:34 np0005541913.localdomain sshd[130353]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:19:34 np0005541913.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Dec 02 09:19:34 np0005541913.localdomain systemd[1]: session-41.scope: Consumed 36.390s CPU time.
Dec 02 09:19:34 np0005541913.localdomain systemd-logind[757]: Session 41 logged out. Waiting for processes to exit.
Dec 02 09:19:34 np0005541913.localdomain systemd-logind[757]: Removed session 41.
Dec 02 09:19:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51446 DF PROTO=TCP SPT=49228 DPT=9102 SEQ=368648226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478BCDA40000000001030307) 
Dec 02 09:19:39 np0005541913.localdomain sshd[133853]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:19:39 np0005541913.localdomain sshd[133853]: Accepted publickey for zuul from 192.168.122.30 port 40238 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:19:39 np0005541913.localdomain systemd-logind[757]: New session 42 of user zuul.
Dec 02 09:19:39 np0005541913.localdomain systemd[1]: Started Session 42 of User zuul.
Dec 02 09:19:39 np0005541913.localdomain sshd[133853]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:19:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23825 DF PROTO=TCP SPT=42168 DPT=9100 SEQ=4064404138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478BD9A40000000001030307) 
Dec 02 09:19:40 np0005541913.localdomain python3.9[133946]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:19:41 np0005541913.localdomain python3.9[134040]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:19:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11931 DF PROTO=TCP SPT=34416 DPT=9882 SEQ=238813804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478BE5A40000000001030307) 
Dec 02 09:19:44 np0005541913.localdomain python3.9[134193]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:19:45 np0005541913.localdomain sshd[133853]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:19:45 np0005541913.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Dec 02 09:19:45 np0005541913.localdomain systemd[1]: session-42.scope: Consumed 2.227s CPU time.
Dec 02 09:19:45 np0005541913.localdomain systemd-logind[757]: Session 42 logged out. Waiting for processes to exit.
Dec 02 09:19:45 np0005541913.localdomain systemd-logind[757]: Removed session 42.
Dec 02 09:19:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23827 DF PROTO=TCP SPT=42168 DPT=9100 SEQ=4064404138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478BF1640000000001030307) 
Dec 02 09:19:49 np0005541913.localdomain sudo[134209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:19:49 np0005541913.localdomain sudo[134209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:19:49 np0005541913.localdomain sudo[134209]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:49 np0005541913.localdomain sudo[134224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 09:19:49 np0005541913.localdomain sudo[134224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:19:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12823 DF PROTO=TCP SPT=34848 DPT=9105 SEQ=1210016272 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478BFDE50000000001030307) 
Dec 02 09:19:49 np0005541913.localdomain sudo[134224]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:49 np0005541913.localdomain sudo[134260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:19:49 np0005541913.localdomain sudo[134260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:19:49 np0005541913.localdomain sudo[134260]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:49 np0005541913.localdomain sudo[134275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:19:49 np0005541913.localdomain sudo[134275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:19:50 np0005541913.localdomain sudo[134275]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:51 np0005541913.localdomain sudo[134322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:19:51 np0005541913.localdomain sudo[134322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:19:51 np0005541913.localdomain sudo[134322]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:51 np0005541913.localdomain sshd[134337]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:19:51 np0005541913.localdomain sshd[134337]: Accepted publickey for zuul from 192.168.122.30 port 41358 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:19:51 np0005541913.localdomain systemd-logind[757]: New session 43 of user zuul.
Dec 02 09:19:51 np0005541913.localdomain systemd[1]: Started Session 43 of User zuul.
Dec 02 09:19:51 np0005541913.localdomain sshd[134337]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:19:52 np0005541913.localdomain python3.9[134430]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:19:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8244 DF PROTO=TCP SPT=39278 DPT=9101 SEQ=3243245265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C09200000000001030307) 
Dec 02 09:19:53 np0005541913.localdomain python3.9[134524]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:19:54 np0005541913.localdomain sudo[134618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysgllpehrfwrtfpgeyvulbiepyrflomo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667194.3416157-81-216673120237324/AnsiballZ_setup.py
Dec 02 09:19:54 np0005541913.localdomain sudo[134618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:54 np0005541913.localdomain python3.9[134620]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:19:55 np0005541913.localdomain sudo[134618]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8246 DF PROTO=TCP SPT=39278 DPT=9101 SEQ=3243245265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C15250000000001030307) 
Dec 02 09:19:55 np0005541913.localdomain sudo[134672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qopvhwiaadcfzcsjmbxkvvnitnkwruvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667194.3416157-81-216673120237324/AnsiballZ_dnf.py
Dec 02 09:19:55 np0005541913.localdomain sudo[134672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:55 np0005541913.localdomain python3.9[134674]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:19:59 np0005541913.localdomain sudo[134672]: pam_unix(sudo:session): session closed for user root
Dec 02 09:19:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8247 DF PROTO=TCP SPT=39278 DPT=9101 SEQ=3243245265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C24E50000000001030307) 
Dec 02 09:19:59 np0005541913.localdomain sudo[134766]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oevbbmysiqsizilgnwhyulvbdcjrhalm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667199.4170988-117-242037147998223/AnsiballZ_setup.py
Dec 02 09:19:59 np0005541913.localdomain sudo[134766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:19:59 np0005541913.localdomain python3.9[134768]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:20:00 np0005541913.localdomain sudo[134766]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:00 np0005541913.localdomain sudo[134921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypeavuuuvyxlzlytnfzqshncqexmwyeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667200.5938623-150-46141580928222/AnsiballZ_file.py
Dec 02 09:20:01 np0005541913.localdomain sudo[134921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:01 np0005541913.localdomain python3.9[134923]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:01 np0005541913.localdomain sudo[134921]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:01 np0005541913.localdomain sudo[135013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inbnpaiypjkrbdaebmleudsvuhplwawg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667201.5611827-174-88682982459928/AnsiballZ_command.py
Dec 02 09:20:01 np0005541913.localdomain sudo[135013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:02 np0005541913.localdomain python3.9[135015]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:20:02 np0005541913.localdomain sudo[135013]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:02 np0005541913.localdomain sudo[135118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzsybhfwbotiurhppomregyuytxucrth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667202.4017065-198-268619818991059/AnsiballZ_stat.py
Dec 02 09:20:02 np0005541913.localdomain sudo[135118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:02 np0005541913.localdomain python3.9[135120]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:02 np0005541913.localdomain sudo[135118]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:03 np0005541913.localdomain sudo[135166]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epwzavdbzkstszednvhhsygltccsevoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667202.4017065-198-268619818991059/AnsiballZ_file.py
Dec 02 09:20:03 np0005541913.localdomain sudo[135166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:03 np0005541913.localdomain python3.9[135168]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:03 np0005541913.localdomain sudo[135166]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48692 DF PROTO=TCP SPT=48786 DPT=9102 SEQ=3515766592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C36DE0000000001030307) 
Dec 02 09:20:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53548 DF PROTO=TCP SPT=56554 DPT=9105 SEQ=3006067946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C375F0000000001030307) 
Dec 02 09:20:04 np0005541913.localdomain sudo[135258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnbdwzsyjjbtryewkjcvvaxgxpffvyoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667204.1163497-234-190667293273312/AnsiballZ_stat.py
Dec 02 09:20:04 np0005541913.localdomain sudo[135258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:04 np0005541913.localdomain python3.9[135260]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:04 np0005541913.localdomain sudo[135258]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:04 np0005541913.localdomain sudo[135306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udwbfpofjwxkyrzandybbjrwvoioimtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667204.1163497-234-190667293273312/AnsiballZ_file.py
Dec 02 09:20:04 np0005541913.localdomain sudo[135306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:05 np0005541913.localdomain python3.9[135308]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:20:05 np0005541913.localdomain sudo[135306]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:06 np0005541913.localdomain sudo[135398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dytxpebgageisnluicoflcdlscncbgrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667205.9141402-273-173257620911158/AnsiballZ_ini_file.py
Dec 02 09:20:06 np0005541913.localdomain sudo[135398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:06 np0005541913.localdomain python3.9[135400]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:20:06 np0005541913.localdomain sudo[135398]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:06 np0005541913.localdomain sudo[135490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctxtinzngynklfkipiaotigojuuhslkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667206.619459-273-25405869435967/AnsiballZ_ini_file.py
Dec 02 09:20:06 np0005541913.localdomain sudo[135490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48694 DF PROTO=TCP SPT=48786 DPT=9102 SEQ=3515766592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C42E40000000001030307) 
Dec 02 09:20:07 np0005541913.localdomain python3.9[135492]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:20:07 np0005541913.localdomain sudo[135490]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:07 np0005541913.localdomain sudo[135582]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwcwpydcwogbzjdngercrgkrsulsarld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667207.2232265-273-117160184156410/AnsiballZ_ini_file.py
Dec 02 09:20:07 np0005541913.localdomain sudo[135582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:07 np0005541913.localdomain python3.9[135584]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:20:07 np0005541913.localdomain sudo[135582]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:08 np0005541913.localdomain sudo[135674]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klthdnvtwrlhcmxwqcwswezqqeuojcxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667207.985854-273-62854721045831/AnsiballZ_ini_file.py
Dec 02 09:20:08 np0005541913.localdomain sudo[135674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:08 np0005541913.localdomain python3.9[135676]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:20:08 np0005541913.localdomain sudo[135674]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:09 np0005541913.localdomain sudo[135766]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjpntsyaajiloauvnxmbqpkutadhdteg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667208.8008285-366-180942138869590/AnsiballZ_dnf.py
Dec 02 09:20:09 np0005541913.localdomain sudo[135766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:09 np0005541913.localdomain python3.9[135768]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:20:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37480 DF PROTO=TCP SPT=53030 DPT=9100 SEQ=1194465109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C4EE40000000001030307) 
Dec 02 09:20:12 np0005541913.localdomain sudo[135766]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55643 DF PROTO=TCP SPT=35440 DPT=9882 SEQ=1911252482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C5AE40000000001030307) 
Dec 02 09:20:14 np0005541913.localdomain sudo[135860]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zncduumivnhcqhavwblsdhmtzzktnhhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667213.91794-399-2058349710599/AnsiballZ_setup.py
Dec 02 09:20:14 np0005541913.localdomain sudo[135860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:14 np0005541913.localdomain python3.9[135862]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:20:14 np0005541913.localdomain sudo[135860]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:14 np0005541913.localdomain sudo[135954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvbvlrzlvwppecegqtfnkruorerrgtrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667214.6841817-423-157061740804456/AnsiballZ_stat.py
Dec 02 09:20:14 np0005541913.localdomain sudo[135954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:15 np0005541913.localdomain python3.9[135956]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:20:15 np0005541913.localdomain sudo[135954]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:16 np0005541913.localdomain sudo[136046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcccbdkfsefnmlppcbyhalwooctahmbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667215.8972712-450-181922075746709/AnsiballZ_stat.py
Dec 02 09:20:16 np0005541913.localdomain sudo[136046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37482 DF PROTO=TCP SPT=53030 DPT=9100 SEQ=1194465109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C66A50000000001030307) 
Dec 02 09:20:16 np0005541913.localdomain python3.9[136048]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:20:16 np0005541913.localdomain sudo[136046]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:16 np0005541913.localdomain sudo[136138]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjrweihhwrjvkgzsvklsrrwzpzdubevb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667216.6693187-480-82748839499574/AnsiballZ_command.py
Dec 02 09:20:16 np0005541913.localdomain sudo[136138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:17 np0005541913.localdomain python3.9[136140]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:20:17 np0005541913.localdomain sudo[136138]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:17 np0005541913.localdomain sudo[136231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqixwuchmsmfesejubdunugpbxyyzuqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667217.3802166-510-130947298504269/AnsiballZ_service_facts.py
Dec 02 09:20:17 np0005541913.localdomain sudo[136231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:18 np0005541913.localdomain python3.9[136233]: ansible-service_facts Invoked
Dec 02 09:20:18 np0005541913.localdomain network[136250]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:20:18 np0005541913.localdomain network[136251]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:20:18 np0005541913.localdomain network[136252]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:20:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:20:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53552 DF PROTO=TCP SPT=56554 DPT=9105 SEQ=3006067946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C73E40000000001030307) 
Dec 02 09:20:21 np0005541913.localdomain sudo[136231]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8119 DF PROTO=TCP SPT=51554 DPT=9101 SEQ=373800939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C7E500000000001030307) 
Dec 02 09:20:23 np0005541913.localdomain sudo[136464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgltmrdubpyecigkjitlcorjkppvbapv ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764667223.4623263-555-43101736719754/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764667223.4623263-555-43101736719754/args
Dec 02 09:20:23 np0005541913.localdomain sudo[136464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:23 np0005541913.localdomain sudo[136464]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8121 DF PROTO=TCP SPT=51554 DPT=9101 SEQ=373800939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C8A640000000001030307) 
Dec 02 09:20:25 np0005541913.localdomain sudo[136571]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfmnlxmujmeenhdiwnnmvbtmvwziulki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667224.3971627-588-37902501705387/AnsiballZ_dnf.py
Dec 02 09:20:25 np0005541913.localdomain sudo[136571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:25 np0005541913.localdomain python3.9[136573]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:20:28 np0005541913.localdomain sudo[136571]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8122 DF PROTO=TCP SPT=51554 DPT=9101 SEQ=373800939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478C9A240000000001030307) 
Dec 02 09:20:30 np0005541913.localdomain sudo[136665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzjctaocrsjrvaotgdegcxocxtztituq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667229.7490127-627-212245712865795/AnsiballZ_package_facts.py
Dec 02 09:20:30 np0005541913.localdomain sudo[136665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:30 np0005541913.localdomain python3.9[136667]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 02 09:20:31 np0005541913.localdomain sudo[136665]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:32 np0005541913.localdomain sudo[136757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvvrydpeojtkzihwzpkuzevjrqdenceb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667231.8978214-658-100299384225247/AnsiballZ_stat.py
Dec 02 09:20:32 np0005541913.localdomain sudo[136757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:32 np0005541913.localdomain python3.9[136759]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:32 np0005541913.localdomain sudo[136757]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:32 np0005541913.localdomain sudo[136832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuehodaxzuxwnujapzhmzqfabxmhehcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667231.8978214-658-100299384225247/AnsiballZ_copy.py
Dec 02 09:20:32 np0005541913.localdomain sudo[136832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:33 np0005541913.localdomain python3.9[136834]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667231.8978214-658-100299384225247/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:33 np0005541913.localdomain sudo[136832]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42353 DF PROTO=TCP SPT=40878 DPT=9102 SEQ=1386749150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CAC0D0000000001030307) 
Dec 02 09:20:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1098 DF PROTO=TCP SPT=54984 DPT=9105 SEQ=3549678416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CAC8E0000000001030307) 
Dec 02 09:20:34 np0005541913.localdomain sudo[136926]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsjmoqdwlomzkvlnmzxormymludmqhlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667233.407703-703-18285816425938/AnsiballZ_stat.py
Dec 02 09:20:34 np0005541913.localdomain sudo[136926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:34 np0005541913.localdomain python3.9[136928]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:34 np0005541913.localdomain sudo[136926]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:34 np0005541913.localdomain sudo[137001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muodlgiflijylmymqfgkpkxyrewrmehk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667233.407703-703-18285816425938/AnsiballZ_copy.py
Dec 02 09:20:34 np0005541913.localdomain sudo[137001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:34 np0005541913.localdomain python3.9[137003]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667233.407703-703-18285816425938/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:35 np0005541913.localdomain sudo[137001]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:36 np0005541913.localdomain sudo[137095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yffqrvcijhycecylijrdwiklotsruabp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667236.111947-766-170413263085143/AnsiballZ_lineinfile.py
Dec 02 09:20:36 np0005541913.localdomain sudo[137095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:36 np0005541913.localdomain python3.9[137097]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:36 np0005541913.localdomain sudo[137095]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42355 DF PROTO=TCP SPT=40878 DPT=9102 SEQ=1386749150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CB8250000000001030307) 
Dec 02 09:20:38 np0005541913.localdomain sudo[137189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfepffnexusshpwzqviqbysgwttnyvxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667237.8389215-811-209246461994557/AnsiballZ_setup.py
Dec 02 09:20:38 np0005541913.localdomain sudo[137189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:38 np0005541913.localdomain python3.9[137191]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:20:38 np0005541913.localdomain sudo[137189]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:39 np0005541913.localdomain sudo[137243]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xydbpklihsvxvohrftmrqaavzpdojhwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667237.8389215-811-209246461994557/AnsiballZ_systemd.py
Dec 02 09:20:39 np0005541913.localdomain sudo[137243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:39 np0005541913.localdomain python3.9[137245]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:20:39 np0005541913.localdomain sudo[137243]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11934 DF PROTO=TCP SPT=34416 DPT=9882 SEQ=238813804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CC3E50000000001030307) 
Dec 02 09:20:40 np0005541913.localdomain sudo[137337]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjtboiikraktrbekbqlljlkeujtiwxia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667240.5849946-858-226579790495567/AnsiballZ_setup.py
Dec 02 09:20:40 np0005541913.localdomain sudo[137337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:41 np0005541913.localdomain python3.9[137339]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:20:41 np0005541913.localdomain sudo[137337]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:41 np0005541913.localdomain sudo[137391]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbamcehsdtgmobpfotvrzlirjcizwloi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667240.5849946-858-226579790495567/AnsiballZ_systemd.py
Dec 02 09:20:41 np0005541913.localdomain sudo[137391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:41 np0005541913.localdomain python3.9[137393]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:20:41 np0005541913.localdomain systemd[1]: Stopping NTP client/server...
Dec 02 09:20:41 np0005541913.localdomain chronyd[25712]: chronyd exiting
Dec 02 09:20:41 np0005541913.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 02 09:20:41 np0005541913.localdomain systemd[1]: Stopped NTP client/server.
Dec 02 09:20:41 np0005541913.localdomain systemd[1]: Starting NTP client/server...
Dec 02 09:20:41 np0005541913.localdomain chronyd[137402]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 02 09:20:41 np0005541913.localdomain chronyd[137402]: Frequency -26.465 +/- 0.266 ppm read from /var/lib/chrony/drift
Dec 02 09:20:41 np0005541913.localdomain chronyd[137402]: Loaded seccomp filter (level 2)
Dec 02 09:20:41 np0005541913.localdomain systemd[1]: Started NTP client/server.
Dec 02 09:20:42 np0005541913.localdomain sudo[137391]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:42 np0005541913.localdomain sshd[134337]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:20:42 np0005541913.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Dec 02 09:20:42 np0005541913.localdomain systemd[1]: session-43.scope: Consumed 28.564s CPU time.
Dec 02 09:20:42 np0005541913.localdomain systemd-logind[757]: Session 43 logged out. Waiting for processes to exit.
Dec 02 09:20:42 np0005541913.localdomain systemd-logind[757]: Removed session 43.
Dec 02 09:20:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23830 DF PROTO=TCP SPT=42168 DPT=9100 SEQ=4064404138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CCFE50000000001030307) 
Dec 02 09:20:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39345 DF PROTO=TCP SPT=41560 DPT=9100 SEQ=3376193355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CDBE40000000001030307) 
Dec 02 09:20:47 np0005541913.localdomain sshd[137418]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:20:48 np0005541913.localdomain sshd[137418]: Accepted publickey for zuul from 192.168.122.30 port 50268 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:20:48 np0005541913.localdomain systemd-logind[757]: New session 44 of user zuul.
Dec 02 09:20:48 np0005541913.localdomain systemd[1]: Started Session 44 of User zuul.
Dec 02 09:20:48 np0005541913.localdomain sshd[137418]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:20:49 np0005541913.localdomain python3.9[137511]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:20:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42357 DF PROTO=TCP SPT=40878 DPT=9102 SEQ=1386749150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CE7E40000000001030307) 
Dec 02 09:20:50 np0005541913.localdomain sudo[137605]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpwmamnspnqmkcvsowagodwxpofraqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667249.6609442-60-143460209189470/AnsiballZ_file.py
Dec 02 09:20:50 np0005541913.localdomain sudo[137605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:50 np0005541913.localdomain python3.9[137607]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:50 np0005541913.localdomain sudo[137605]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:50 np0005541913.localdomain sudo[137710]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzvhszqnwtmwkiuzgaxfrzkmlazxdzjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667250.5010984-84-39585887353363/AnsiballZ_stat.py
Dec 02 09:20:50 np0005541913.localdomain sudo[137710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:51 np0005541913.localdomain python3.9[137712]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:51 np0005541913.localdomain sudo[137713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:20:51 np0005541913.localdomain sudo[137713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:20:51 np0005541913.localdomain sudo[137713]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:51 np0005541913.localdomain sudo[137710]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:51 np0005541913.localdomain sudo[137730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:20:51 np0005541913.localdomain sudo[137730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:20:51 np0005541913.localdomain sudo[137788]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvztkcfykohkyqqraysynaelwrqczqyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667250.5010984-84-39585887353363/AnsiballZ_file.py
Dec 02 09:20:51 np0005541913.localdomain sudo[137788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:51 np0005541913.localdomain python3.9[137790]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.vnldosby recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:51 np0005541913.localdomain sudo[137788]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:51 np0005541913.localdomain sudo[137730]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25920 DF PROTO=TCP SPT=57488 DPT=9101 SEQ=205900510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CF3800000000001030307) 
Dec 02 09:20:52 np0005541913.localdomain sudo[137911]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qifzkvidqqwlogwgsgmgiqzcsvymiypu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667252.019576-144-253399547474831/AnsiballZ_stat.py
Dec 02 09:20:52 np0005541913.localdomain sudo[137911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:52 np0005541913.localdomain python3.9[137913]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:52 np0005541913.localdomain sudo[137914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:20:52 np0005541913.localdomain sudo[137914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:20:52 np0005541913.localdomain sudo[137914]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:52 np0005541913.localdomain sudo[137911]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:52 np0005541913.localdomain sudo[138001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbygfdncwvkwfaseednbapnjdoyraztk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667252.019576-144-253399547474831/AnsiballZ_copy.py
Dec 02 09:20:52 np0005541913.localdomain sudo[138001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:53 np0005541913.localdomain python3.9[138003]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667252.019576-144-253399547474831/.source _original_basename=.z74m1wd5 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:53 np0005541913.localdomain sudo[138001]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:53 np0005541913.localdomain sudo[138093]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzdmkjrfocrpanmmejkfkucmrucymaep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667253.4120908-192-165118529304585/AnsiballZ_file.py
Dec 02 09:20:53 np0005541913.localdomain sudo[138093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:53 np0005541913.localdomain python3.9[138095]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:20:53 np0005541913.localdomain sudo[138093]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:54 np0005541913.localdomain auditd[710]: Audit daemon rotating log files
Dec 02 09:20:54 np0005541913.localdomain sudo[138185]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdcxzgspnrdtudaoodbgkklplkwnfxfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667254.6289587-216-13661394177182/AnsiballZ_stat.py
Dec 02 09:20:54 np0005541913.localdomain sudo[138185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:55 np0005541913.localdomain python3.9[138187]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:55 np0005541913.localdomain sudo[138185]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:55 np0005541913.localdomain sudo[138258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cckspieqilqbponbrukctlazbrexegsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667254.6289587-216-13661394177182/AnsiballZ_copy.py
Dec 02 09:20:55 np0005541913.localdomain sudo[138258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25922 DF PROTO=TCP SPT=57488 DPT=9101 SEQ=205900510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478CFFA50000000001030307) 
Dec 02 09:20:55 np0005541913.localdomain python3.9[138260]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667254.6289587-216-13661394177182/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:20:55 np0005541913.localdomain sudo[138258]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:56 np0005541913.localdomain sudo[138350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvwwpxrczyjuxyeytvfspbbiqopkzkvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667255.623115-216-44223108629827/AnsiballZ_stat.py
Dec 02 09:20:56 np0005541913.localdomain sudo[138350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:56 np0005541913.localdomain python3.9[138352]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:56 np0005541913.localdomain sudo[138350]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:57 np0005541913.localdomain sudo[138423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsevlcyeuwxowyctpodazbkdynvqxnph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667255.623115-216-44223108629827/AnsiballZ_copy.py
Dec 02 09:20:57 np0005541913.localdomain sudo[138423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:57 np0005541913.localdomain python3.9[138425]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667255.623115-216-44223108629827/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:20:57 np0005541913.localdomain sudo[138423]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:57 np0005541913.localdomain sudo[138515]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-josffxhzupjexzwimglvifcetveusjnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667257.5683727-303-126563951383601/AnsiballZ_file.py
Dec 02 09:20:57 np0005541913.localdomain sudo[138515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:58 np0005541913.localdomain python3.9[138517]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:58 np0005541913.localdomain sudo[138515]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:58 np0005541913.localdomain sudo[138607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfrxilzvplotzxzieyjhqejncyftbgdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667258.2117448-327-102065873164020/AnsiballZ_stat.py
Dec 02 09:20:58 np0005541913.localdomain sudo[138607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:58 np0005541913.localdomain python3.9[138609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:58 np0005541913.localdomain sudo[138607]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:58 np0005541913.localdomain sudo[138680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdwhxhvzrttbyphbxowjcouefjpusbzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667258.2117448-327-102065873164020/AnsiballZ_copy.py
Dec 02 09:20:58 np0005541913.localdomain sudo[138680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:59 np0005541913.localdomain python3.9[138682]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667258.2117448-327-102065873164020/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:20:59 np0005541913.localdomain sudo[138680]: pam_unix(sudo:session): session closed for user root
Dec 02 09:20:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25923 DF PROTO=TCP SPT=57488 DPT=9101 SEQ=205900510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D0F640000000001030307) 
Dec 02 09:20:59 np0005541913.localdomain sudo[138772]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lslvyorbjgobptxryaxwxzbfoueotckn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667259.3473492-372-23090602061529/AnsiballZ_stat.py
Dec 02 09:20:59 np0005541913.localdomain sudo[138772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:20:59 np0005541913.localdomain python3.9[138774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:20:59 np0005541913.localdomain sudo[138772]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:00 np0005541913.localdomain sudo[138845]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uicjujeaxxuozefkgcggxayysbkmakur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667259.3473492-372-23090602061529/AnsiballZ_copy.py
Dec 02 09:21:00 np0005541913.localdomain sudo[138845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:00 np0005541913.localdomain python3.9[138847]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667259.3473492-372-23090602061529/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:00 np0005541913.localdomain sudo[138845]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:01 np0005541913.localdomain sudo[138937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vebwddjamkwrlzbznisgspksiocvwdmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667260.5809536-417-87074718426560/AnsiballZ_systemd.py
Dec 02 09:21:01 np0005541913.localdomain sudo[138937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:01 np0005541913.localdomain python3.9[138939]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:21:01 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:21:01 np0005541913.localdomain systemd-sysv-generator[138965]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:21:01 np0005541913.localdomain systemd-rc-local-generator[138962]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:21:01 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:21:01 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:21:01 np0005541913.localdomain systemd-rc-local-generator[139002]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:21:01 np0005541913.localdomain systemd-sysv-generator[139006]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:21:01 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:21:02 np0005541913.localdomain systemd[1]: Starting EDPM Container Shutdown...
Dec 02 09:21:02 np0005541913.localdomain systemd[1]: Finished EDPM Container Shutdown.
Dec 02 09:21:02 np0005541913.localdomain sudo[138937]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:02 np0005541913.localdomain sudo[139105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypinoqejeodjivnvnbapjdapldogspdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667262.2951865-441-231971350971928/AnsiballZ_stat.py
Dec 02 09:21:02 np0005541913.localdomain sudo[139105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:02 np0005541913.localdomain python3.9[139107]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:02 np0005541913.localdomain sudo[139105]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:03 np0005541913.localdomain sudo[139178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iboewfyofvdlrxxchogceleyvnwijvmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667262.2951865-441-231971350971928/AnsiballZ_copy.py
Dec 02 09:21:03 np0005541913.localdomain sudo[139178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:03 np0005541913.localdomain python3.9[139180]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667262.2951865-441-231971350971928/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:03 np0005541913.localdomain sudo[139178]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:03 np0005541913.localdomain sudo[139270]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmhvzsdwomgbpozfpivluoageqshrxvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667263.4492674-486-63915310449455/AnsiballZ_stat.py
Dec 02 09:21:03 np0005541913.localdomain sudo[139270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:03 np0005541913.localdomain python3.9[139272]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:03 np0005541913.localdomain sudo[139270]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27965 DF PROTO=TCP SPT=58890 DPT=9102 SEQ=1000295349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D213D0000000001030307) 
Dec 02 09:21:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24456 DF PROTO=TCP SPT=34966 DPT=9105 SEQ=697319510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D21BE0000000001030307) 
Dec 02 09:21:05 np0005541913.localdomain sudo[139343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuthnwiwvgzfieinakwqtoagyfajxpei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667263.4492674-486-63915310449455/AnsiballZ_copy.py
Dec 02 09:21:05 np0005541913.localdomain sudo[139343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:05 np0005541913.localdomain python3.9[139345]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667263.4492674-486-63915310449455/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:05 np0005541913.localdomain sudo[139343]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:05 np0005541913.localdomain sudo[139435]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwgzdrwvpjmzaareczysjquzyrcdpwip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667265.5606093-531-35391411554366/AnsiballZ_systemd.py
Dec 02 09:21:05 np0005541913.localdomain sudo[139435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:06 np0005541913.localdomain python3.9[139437]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:21:06 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:21:06 np0005541913.localdomain systemd-sysv-generator[139466]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:21:06 np0005541913.localdomain systemd-rc-local-generator[139463]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:21:06 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:21:06 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 09:21:06 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 09:21:06 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 09:21:06 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 09:21:06 np0005541913.localdomain sudo[139435]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27967 DF PROTO=TCP SPT=58890 DPT=9102 SEQ=1000295349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D2D640000000001030307) 
Dec 02 09:21:08 np0005541913.localdomain python3.9[139569]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:21:08 np0005541913.localdomain network[139586]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:21:08 np0005541913.localdomain network[139587]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:21:08 np0005541913.localdomain network[139588]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:21:09 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:21:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59818 DF PROTO=TCP SPT=58568 DPT=9100 SEQ=829029851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D39240000000001030307) 
Dec 02 09:21:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32512 DF PROTO=TCP SPT=35890 DPT=9882 SEQ=1419328620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D45640000000001030307) 
Dec 02 09:21:13 np0005541913.localdomain sudo[139787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruprvafvomyrjvzhwhwwesofwpdcifdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667273.498382-609-230974842590917/AnsiballZ_stat.py
Dec 02 09:21:13 np0005541913.localdomain sudo[139787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:13 np0005541913.localdomain python3.9[139789]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:13 np0005541913.localdomain sudo[139787]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:14 np0005541913.localdomain sudo[139862]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilcreskrgvfbhotjsjysfgpzoerscufh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667273.498382-609-230974842590917/AnsiballZ_copy.py
Dec 02 09:21:14 np0005541913.localdomain sudo[139862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:14 np0005541913.localdomain python3.9[139864]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667273.498382-609-230974842590917/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:14 np0005541913.localdomain sudo[139862]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:15 np0005541913.localdomain sudo[139955]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzcylhptjfhxoygcvkvoekudifdfuhwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667275.3259354-655-74390451440597/AnsiballZ_systemd.py
Dec 02 09:21:15 np0005541913.localdomain sudo[139955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:15 np0005541913.localdomain python3.9[139957]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:21:15 np0005541913.localdomain systemd[1]: Reloading OpenSSH server daemon...
Dec 02 09:21:15 np0005541913.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Dec 02 09:21:15 np0005541913.localdomain sshd[119826]: Received SIGHUP; restarting.
Dec 02 09:21:15 np0005541913.localdomain sshd[119826]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:21:15 np0005541913.localdomain sshd[119826]: Server listening on 0.0.0.0 port 22.
Dec 02 09:21:15 np0005541913.localdomain sshd[119826]: Server listening on :: port 22.
Dec 02 09:21:15 np0005541913.localdomain sudo[139955]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59820 DF PROTO=TCP SPT=58568 DPT=9100 SEQ=829029851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D50E50000000001030307) 
Dec 02 09:21:17 np0005541913.localdomain sudo[140051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkhjigamaebffzcqnumktbirswuyyzbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667277.427663-678-39368227368805/AnsiballZ_file.py
Dec 02 09:21:17 np0005541913.localdomain sudo[140051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:17 np0005541913.localdomain python3.9[140053]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:17 np0005541913.localdomain sudo[140051]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:18 np0005541913.localdomain sudo[140143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgjfaaxsdmqbdaxlzurmkkutoigmiuil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667278.128267-702-108910026521099/AnsiballZ_stat.py
Dec 02 09:21:18 np0005541913.localdomain sudo[140143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:18 np0005541913.localdomain python3.9[140145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:18 np0005541913.localdomain sudo[140143]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:18 np0005541913.localdomain sudo[140216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sghtfmjyvjtokovzfibkohvzmtwrbqum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667278.128267-702-108910026521099/AnsiballZ_copy.py
Dec 02 09:21:18 np0005541913.localdomain sudo[140216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:19 np0005541913.localdomain python3.9[140218]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667278.128267-702-108910026521099/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:19 np0005541913.localdomain sudo[140216]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24460 DF PROTO=TCP SPT=34966 DPT=9105 SEQ=697319510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D5DE40000000001030307) 
Dec 02 09:21:19 np0005541913.localdomain sudo[140308]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkjhyibwumcqhktrpyzfobssbcqlnyuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667279.5041437-756-264575475392190/AnsiballZ_timezone.py
Dec 02 09:21:19 np0005541913.localdomain sudo[140308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:20 np0005541913.localdomain python3.9[140310]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 02 09:21:20 np0005541913.localdomain systemd[1]: Starting Time & Date Service...
Dec 02 09:21:20 np0005541913.localdomain systemd[1]: Started Time & Date Service.
Dec 02 09:21:20 np0005541913.localdomain sudo[140308]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:20 np0005541913.localdomain sudo[140404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icfktldjsdqtwmkqqqutklpjhgqluhqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667280.5129995-783-254803891904205/AnsiballZ_file.py
Dec 02 09:21:20 np0005541913.localdomain sudo[140404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:20 np0005541913.localdomain python3.9[140406]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:20 np0005541913.localdomain sudo[140404]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:21 np0005541913.localdomain sudo[140497]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyrudduckpxglspowbdzqrqptvynpjxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667281.1775408-807-257800377320808/AnsiballZ_stat.py
Dec 02 09:21:21 np0005541913.localdomain sudo[140497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:21 np0005541913.localdomain python3.9[140499]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:21 np0005541913.localdomain sudo[140497]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:21 np0005541913.localdomain sudo[140570]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuaxyvhiqlmuteytgsxmzzybftxwhjsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667281.1775408-807-257800377320808/AnsiballZ_copy.py
Dec 02 09:21:21 np0005541913.localdomain sudo[140570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:22 np0005541913.localdomain python3.9[140572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667281.1775408-807-257800377320808/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:22 np0005541913.localdomain sudo[140570]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51573 DF PROTO=TCP SPT=40064 DPT=9101 SEQ=1377099721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D68B00000000001030307) 
Dec 02 09:21:22 np0005541913.localdomain sudo[140662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxsbzdrsykpqegdwxgtqddfywmotqvai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667282.3402638-852-80149997879267/AnsiballZ_stat.py
Dec 02 09:21:22 np0005541913.localdomain sudo[140662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:22 np0005541913.localdomain python3.9[140664]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:22 np0005541913.localdomain sudo[140662]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:23 np0005541913.localdomain sudo[140735]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbblzmruzzydwgnnocxbnnxqpoxmpzoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667282.3402638-852-80149997879267/AnsiballZ_copy.py
Dec 02 09:21:23 np0005541913.localdomain sudo[140735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:23 np0005541913.localdomain python3.9[140737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667282.3402638-852-80149997879267/.source.yaml _original_basename=.eggbihc9 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:23 np0005541913.localdomain sudo[140735]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:23 np0005541913.localdomain sudo[140827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqmkckdsyheficbljpgobkcttorrnmmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667283.5345683-897-35753284200341/AnsiballZ_stat.py
Dec 02 09:21:23 np0005541913.localdomain sudo[140827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:23 np0005541913.localdomain python3.9[140829]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:24 np0005541913.localdomain sudo[140827]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:24 np0005541913.localdomain sudo[140902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tawdefsrgcczgznlkxemykqbjymgibqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667283.5345683-897-35753284200341/AnsiballZ_copy.py
Dec 02 09:21:24 np0005541913.localdomain sudo[140902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:24 np0005541913.localdomain python3.9[140904]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667283.5345683-897-35753284200341/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:24 np0005541913.localdomain sudo[140902]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:25 np0005541913.localdomain sudo[140994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkwbkngfraxuzgcvpetpxylyztoleqzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667284.788637-942-19109714217690/AnsiballZ_command.py
Dec 02 09:21:25 np0005541913.localdomain sudo[140994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51575 DF PROTO=TCP SPT=40064 DPT=9101 SEQ=1377099721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D74A40000000001030307) 
Dec 02 09:21:25 np0005541913.localdomain python3.9[140996]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:21:25 np0005541913.localdomain sudo[140994]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:25 np0005541913.localdomain sudo[141087]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sodrgyzxnjeztzjlzlzusscmjqvikxon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667285.602621-966-144945427778709/AnsiballZ_command.py
Dec 02 09:21:25 np0005541913.localdomain sudo[141087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:26 np0005541913.localdomain python3.9[141089]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:21:26 np0005541913.localdomain sudo[141087]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:27 np0005541913.localdomain sudo[141180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxhcrwrbakxfnwuckloodpkpkoyrhvzm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764667286.2823193-990-186235307442515/AnsiballZ_edpm_nftables_from_files.py
Dec 02 09:21:27 np0005541913.localdomain sudo[141180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:27 np0005541913.localdomain python3[141182]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 09:21:27 np0005541913.localdomain sudo[141180]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:27 np0005541913.localdomain sudo[141272]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpuuiyiczamuuqknadvblalokeicjtnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667287.636131-1014-228115917145281/AnsiballZ_stat.py
Dec 02 09:21:27 np0005541913.localdomain sudo[141272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:28 np0005541913.localdomain python3.9[141274]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:28 np0005541913.localdomain sudo[141272]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:28 np0005541913.localdomain sudo[141345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yitvqjzdcatislvdfzbhpkbacxxfmbbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667287.636131-1014-228115917145281/AnsiballZ_copy.py
Dec 02 09:21:28 np0005541913.localdomain sudo[141345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:28 np0005541913.localdomain python3.9[141347]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667287.636131-1014-228115917145281/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:28 np0005541913.localdomain sudo[141345]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51576 DF PROTO=TCP SPT=40064 DPT=9101 SEQ=1377099721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D84650000000001030307) 
Dec 02 09:21:29 np0005541913.localdomain sudo[141437]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thayoigmyifqtbmwyhiglcbiiurwwiia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667289.0264647-1059-170007372066710/AnsiballZ_stat.py
Dec 02 09:21:29 np0005541913.localdomain sudo[141437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:29 np0005541913.localdomain python3.9[141439]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:29 np0005541913.localdomain sudo[141437]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:29 np0005541913.localdomain sudo[141510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvcvutzptyjrwprlamvqaqjdthysxolx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667289.0264647-1059-170007372066710/AnsiballZ_copy.py
Dec 02 09:21:29 np0005541913.localdomain sudo[141510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:30 np0005541913.localdomain python3.9[141512]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667289.0264647-1059-170007372066710/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:30 np0005541913.localdomain sudo[141510]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:30 np0005541913.localdomain sudo[141602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-praaccltlibucyxmtesanecfuvmmjhvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667290.297134-1104-232397804750560/AnsiballZ_stat.py
Dec 02 09:21:30 np0005541913.localdomain sudo[141602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:30 np0005541913.localdomain python3.9[141604]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:30 np0005541913.localdomain sudo[141602]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:31 np0005541913.localdomain sudo[141675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsrgujxirqngsxrqsucxhxssyqoqkshn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667290.297134-1104-232397804750560/AnsiballZ_copy.py
Dec 02 09:21:31 np0005541913.localdomain sudo[141675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:31 np0005541913.localdomain python3.9[141677]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667290.297134-1104-232397804750560/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:31 np0005541913.localdomain sudo[141675]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:31 np0005541913.localdomain sudo[141767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezwhbqfztebioblyroasceewrjxdbcjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667291.479275-1150-74123217406858/AnsiballZ_stat.py
Dec 02 09:21:31 np0005541913.localdomain sudo[141767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:31 np0005541913.localdomain python3.9[141769]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:31 np0005541913.localdomain sudo[141767]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:32 np0005541913.localdomain sudo[141840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzvrvuowhpsvypvrhldlfcrasfodibgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667291.479275-1150-74123217406858/AnsiballZ_copy.py
Dec 02 09:21:32 np0005541913.localdomain sudo[141840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:32 np0005541913.localdomain python3.9[141842]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667291.479275-1150-74123217406858/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:32 np0005541913.localdomain sudo[141840]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:33 np0005541913.localdomain sudo[141932]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jesiuxqbohemxjzgzfhpxfkqwveunvig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667292.6718388-1194-166597013808890/AnsiballZ_stat.py
Dec 02 09:21:33 np0005541913.localdomain sudo[141932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:33 np0005541913.localdomain python3.9[141934]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:33 np0005541913.localdomain sudo[141932]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:33 np0005541913.localdomain sudo[142005]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qujllqpvycxjeamkicwoselwbqgrheen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667292.6718388-1194-166597013808890/AnsiballZ_copy.py
Dec 02 09:21:33 np0005541913.localdomain sudo[142005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15219 DF PROTO=TCP SPT=59858 DPT=9102 SEQ=515532362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D966D0000000001030307) 
Dec 02 09:21:33 np0005541913.localdomain python3.9[142007]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667292.6718388-1194-166597013808890/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:33 np0005541913.localdomain sudo[142005]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52300 DF PROTO=TCP SPT=39974 DPT=9105 SEQ=1851322314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478D96EF0000000001030307) 
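The kernel `DROPPING:` lines above are nftables/iptables LOG-target output: a prefix followed by space-separated `KEY=VALUE` fields (plus bare flag tokens such as `DF` and `SYN`). A minimal sketch of pulling the fields out for triage — the helper name is made up, and only `KEY=VALUE` tokens are kept:

```python
# Hypothetical helper: parse the KEY=VALUE fields of a kernel
# "DROPPING:" log line like the ones captured above.
def parse_drop_line(line: str) -> dict:
    # Discard everything up to and including the "DROPPING:" prefix.
    _, _, fields = line.partition("DROPPING:")
    out = {}
    for token in fields.split():
        key, sep, value = token.partition("=")
        if sep:  # bare flag tokens (DF, SYN, ...) carry no "=", skip them
            out[key] = value
    return out

line = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 "
        "SRC=192.168.122.10 DST=192.168.122.107 PROTO=TCP SPT=40064 DPT=9101")
parsed = parse_drop_line(line)
print(parsed["DST"], parsed["DPT"])  # -> 192.168.122.107 9101
```

Grouping the parsed records by `DPT` quickly shows which listener ports (9100-9105, 9882 in this capture) are being probed and dropped.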
Dec 02 09:21:34 np0005541913.localdomain sudo[142097]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qatmdfymgunopmqyftmjeykhtbaqsfsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667294.1449943-1239-281028883056471/AnsiballZ_file.py
Dec 02 09:21:34 np0005541913.localdomain sudo[142097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:34 np0005541913.localdomain python3.9[142099]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:34 np0005541913.localdomain sudo[142097]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:35 np0005541913.localdomain sudo[142189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqdngcuonykoyqksllnsangkrjvxfzzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667294.7638187-1263-164490234587709/AnsiballZ_command.py
Dec 02 09:21:35 np0005541913.localdomain sudo[142189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:35 np0005541913.localdomain python3.9[142191]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:21:35 np0005541913.localdomain sudo[142189]: pam_unix(sudo:session): session closed for user root
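The command task above concatenates the five edpm snippet files in a fixed order and pipes them to `nft -c -f -`, a check-only parse that applies nothing. A sketch of the same check, assuming the `nft` binary is on PATH (the function names are illustrative, not from the playbook):

```python
import subprocess
from pathlib import Path

# Snippet paths and order taken from the logged command.
NFT_FILES = [
    "/etc/nftables/edpm-chains.nft",
    "/etc/nftables/edpm-flushes.nft",
    "/etc/nftables/edpm-rules.nft",
    "/etc/nftables/edpm-update-jumps.nft",
    "/etc/nftables/edpm-jumps.nft",
]

def combined_ruleset(files=NFT_FILES) -> str:
    """Join the snippet files into one ruleset, in include order."""
    return "".join(Path(f).read_text() for f in files)

def check_ruleset(text: str) -> bool:
    """Feed the combined text to `nft -c -f -`; True if it parses cleanly."""
    proc = subprocess.run(["nft", "-c", "-f", "-"], input=text,
                          text=True, capture_output=True)
    return proc.returncode == 0
```

Order matters: chains must be defined before the flush, rule, and jump snippets that reference them, which is why `edpm-chains.nft` leads the list both here and in the logged pipeline.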
Dec 02 09:21:35 np0005541913.localdomain sudo[142284]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iejhmeoovvlokwacfndiatovjjahmske ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667295.4326694-1287-44582572975670/AnsiballZ_blockinfile.py
Dec 02 09:21:35 np0005541913.localdomain sudo[142284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:36 np0005541913.localdomain python3.9[142286]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:36 np0005541913.localdomain sudo[142284]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:36 np0005541913.localdomain sudo[142377]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiedviotoooxwnlfaqnvvxxjjcoewist ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667296.3229804-1314-145781884817757/AnsiballZ_file.py
Dec 02 09:21:36 np0005541913.localdomain sudo[142377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:36 np0005541913.localdomain python3.9[142379]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:36 np0005541913.localdomain sudo[142377]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:37 np0005541913.localdomain sudo[142469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hixzrpmckbgsmympbhxtkekkpoeokosw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667296.8661902-1314-185980356779871/AnsiballZ_file.py
Dec 02 09:21:37 np0005541913.localdomain sudo[142469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:37 np0005541913.localdomain python3.9[142471]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:37 np0005541913.localdomain sudo[142469]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1104 DF PROTO=TCP SPT=54984 DPT=9105 SEQ=3549678416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478DA5E40000000001030307) 
Dec 02 09:21:38 np0005541913.localdomain sudo[142561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znnllcurwezyogtkoogfswbyoysaabik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667297.9545445-1359-227765115196792/AnsiballZ_mount.py
Dec 02 09:21:38 np0005541913.localdomain sudo[142561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:38 np0005541913.localdomain python3.9[142563]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 02 09:21:38 np0005541913.localdomain sudo[142561]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:39 np0005541913.localdomain sudo[142654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djxkfkexrxbbspavyyvcaalknewjhrgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667298.841649-1359-111989543231922/AnsiballZ_mount.py
Dec 02 09:21:39 np0005541913.localdomain sudo[142654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:39 np0005541913.localdomain python3.9[142656]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 02 09:21:39 np0005541913.localdomain sudo[142654]: pam_unix(sudo:session): session closed for user root
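The two `ansible.posix.mount` tasks above (with `boot=True`) both mount hugetlbfs and persist an fstab entry per page size. A sketch of the line each task writes — the helper only formats the entry and does not touch `/etc/fstab` or call mount(8):

```python
# Format the fstab entry implied by the logged mount parameters:
# src=none, fstype=hugetlbfs, opts=pagesize=..., dump=0, passno=0.
def hugetlbfs_fstab_line(path: str, pagesize: str,
                         dump: int = 0, passno: int = 0) -> str:
    return f"none {path} hugetlbfs pagesize={pagesize} {dump} {passno}"

for path, size in (("/dev/hugepages1G", "1G"), ("/dev/hugepages2M", "2M")):
    print(hugetlbfs_fstab_line(path, size))
# -> none /dev/hugepages1G hugetlbfs pagesize=1G 0 0
#    none /dev/hugepages2M hugetlbfs pagesize=2M 0 0
```

Separate mount points per page size let workloads (e.g. DPDK-backed VMs) pick 1G or 2M pages explicitly via the `pagesize` mount option.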
Dec 02 09:21:39 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14171 DF PROTO=TCP SPT=45958 DPT=9882 SEQ=3463406250 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478DADE40000000001030307) 
Dec 02 09:21:40 np0005541913.localdomain sshd[137418]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:21:40 np0005541913.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Dec 02 09:21:40 np0005541913.localdomain systemd[1]: session-44.scope: Consumed 28.197s CPU time.
Dec 02 09:21:40 np0005541913.localdomain systemd-logind[757]: Session 44 logged out. Waiting for processes to exit.
Dec 02 09:21:40 np0005541913.localdomain systemd-logind[757]: Removed session 44.
Dec 02 09:21:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39348 DF PROTO=TCP SPT=41560 DPT=9100 SEQ=3376193355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478DB9E50000000001030307) 
Dec 02 09:21:45 np0005541913.localdomain sshd[142672]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:21:46 np0005541913.localdomain sshd[142672]: Accepted publickey for zuul from 192.168.122.30 port 36310 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:21:46 np0005541913.localdomain systemd-logind[757]: New session 45 of user zuul.
Dec 02 09:21:46 np0005541913.localdomain systemd[1]: Started Session 45 of User zuul.
Dec 02 09:21:46 np0005541913.localdomain sshd[142672]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:21:46 np0005541913.localdomain sudo[142765]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qujccxjrujjxzjjydrmoxapcfobribeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667306.169681-22-184828984280759/AnsiballZ_tempfile.py
Dec 02 09:21:46 np0005541913.localdomain sudo[142765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:46 np0005541913.localdomain python3.9[142767]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 02 09:21:46 np0005541913.localdomain sudo[142765]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:48 np0005541913.localdomain sudo[142857]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syrdxnqtzoagleklmfbddzqrzxhkjydo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667308.1823747-94-265490506430424/AnsiballZ_stat.py
Dec 02 09:21:48 np0005541913.localdomain sudo[142857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:48 np0005541913.localdomain python3.9[142859]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:21:48 np0005541913.localdomain sudo[142857]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:50 np0005541913.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 02 09:21:50 np0005541913.localdomain sudo[142953]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuizuypjrhhxsnposwumgphbskbolfhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667310.2025635-142-82217952862067/AnsiballZ_slurp.py
Dec 02 09:21:50 np0005541913.localdomain sudo[142953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:50 np0005541913.localdomain python3.9[142955]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec 02 09:21:50 np0005541913.localdomain sudo[142953]: pam_unix(sudo:session): session closed for user root
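The `ansible.builtin.slurp` task above returns the file content base64-encoded in its result dict; the controller side decodes it before use. A minimal sketch with a made-up sample payload (the real task reads `/etc/ssh/ssh_known_hosts`):

```python
import base64

# slurp results carry {"content": <base64>, "source": <path>, ...};
# decode "content" to recover the file text.
def decode_slurp(result: dict) -> str:
    return base64.b64decode(result["content"]).decode("utf-8")

# Illustrative payload only; not taken from the log.
sample = {"content": base64.b64encode(b"host1 ssh-ed25519 AAAA...\n").decode(),
          "source": "/etc/ssh/ssh_known_hosts"}
print(decode_slurp(sample))  # -> host1 ssh-ed25519 AAAA...
```

The decoded text is what the later `blockinfile` task merges host keys into, between its ANSIBLE MANAGED BLOCK markers.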
Dec 02 09:21:51 np0005541913.localdomain sudo[143045]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdkukclubghzpxlilhekgrzttkfqoyfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667311.584219-190-247129849797279/AnsiballZ_stat.py
Dec 02 09:21:51 np0005541913.localdomain sudo[143045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:52 np0005541913.localdomain python3.9[143047]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.cabvjj22 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:21:52 np0005541913.localdomain sudo[143045]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12255 DF PROTO=TCP SPT=40070 DPT=9101 SEQ=1751620103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478DDDE10000000001030307) 
Dec 02 09:21:52 np0005541913.localdomain sudo[143120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klskwlrcfifjdsomzivfmyxsytqajius ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667311.584219-190-247129849797279/AnsiballZ_copy.py
Dec 02 09:21:52 np0005541913.localdomain sudo[143120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:52 np0005541913.localdomain sudo[143123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:21:52 np0005541913.localdomain sudo[143123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:21:52 np0005541913.localdomain sudo[143123]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:52 np0005541913.localdomain sudo[143138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:21:52 np0005541913.localdomain sudo[143138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:21:52 np0005541913.localdomain python3.9[143122]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.cabvjj22 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667311.584219-190-247129849797279/.source.cabvjj22 _original_basename=.rd_st6mg follow=False checksum=9674ae9a797ab88dd38896b99c4666372998fea7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:52 np0005541913.localdomain sudo[143120]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:53 np0005541913.localdomain podman[143240]: 2025-12-02 09:21:53.692925914 +0000 UTC m=+0.111924453 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, name=rhceph, CEPH_POINT_RELEASE=, release=1763362218)
Dec 02 09:21:53 np0005541913.localdomain podman[143240]: 2025-12-02 09:21:53.806147271 +0000 UTC m=+0.225145770 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, io.openshift.expose-services=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph)
Dec 02 09:21:54 np0005541913.localdomain sudo[143138]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:54 np0005541913.localdomain sudo[143339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:21:54 np0005541913.localdomain sudo[143339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:21:54 np0005541913.localdomain sudo[143339]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:54 np0005541913.localdomain sudo[143354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:21:54 np0005541913.localdomain sudo[143354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:21:54 np0005541913.localdomain sudo[143421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtubgdurvokxtxbusxivgiumubwzhgjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667314.0584793-280-2046501252154/AnsiballZ_setup.py
Dec 02 09:21:54 np0005541913.localdomain sudo[143421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:54 np0005541913.localdomain python3.9[143429]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:21:54 np0005541913.localdomain sudo[143421]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:54 np0005541913.localdomain sudo[143354]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:55 np0005541913.localdomain sudo[143461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:21:55 np0005541913.localdomain sudo[143461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:21:55 np0005541913.localdomain sudo[143461]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:56 np0005541913.localdomain sudo[143551]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umxoadbpislunbikxtacowinxvzcknxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667315.8773005-329-2859779613183/AnsiballZ_blockinfile.py
Dec 02 09:21:56 np0005541913.localdomain sudo[143551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:56 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25926 DF PROTO=TCP SPT=57488 DPT=9101 SEQ=205900510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478DEDE40000000001030307) 
Dec 02 09:21:56 np0005541913.localdomain python3.9[143553]: ansible-ansible.builtin.blockinfile Invoked with block=np0005541914.localdomain,192.168.122.108,np0005541914* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCHh7115UF/t7QzqWY1fk2wHPOuHuMPRhaYTC/yfMWr+nqJ5/TNZTuFxq0aW/1gHanB2usmC0wpWf4c1KsPZ71Ehs/j5nV1wfGtNVEq5Zj7uhs0ea/SQToF2RS406RoIzJW6ogv4Kl3nxGEK6c44WCu8+Ki98dCQ4wesh5kSBkqgiSq2IZkL2gjoAKeXdracGRJ596gTB0yfsMl/qdJDneVHMq/rptlFhabLeiEN+7C0o0gsZwYsxCd2oSB+DD9KfXhWIBeXRr1B7mFcMZpGNG7pG0d1IjYOUmqjvVpECHrLvjiitS3800ZEFwygU4sbM/DWHelobjtJB/fxxPTtGNlbH4MK/OGFh2mm5jB1LMqWSsifA/ZAHASAAffWDwKtF+xJ06OHRDT6gjzOd7VJpc8kR9Jn9pT7UnjypnrM12GtrO0CH8Lf3rin71kf9iZRIphqWXhiLN3G/mdJC2XPIxJp7NQ1Mqc5IhHciCv80bvsGrzLCtAr16/b+cPYo7vIGU=
                                                            np0005541914.localdomain,192.168.122.108,np0005541914* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGGWCSLJV2aPwMTfOaIZ+xjv1QFJPyldmo6H+V71SAll
                                                            np0005541914.localdomain,192.168.122.108,np0005541914* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFoWDrioobP7nWM6onZB+AZBuk/AQQ7zXxT58XHHnNVCXAZxKDdYUpn8CqfQBodfVNr1sWDyzBr0D5lMGYZypzo=
                                                            np0005541909.localdomain,192.168.122.103,np0005541909* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0b4xecJ9cZa0s7FCPYSs6kLrfHyBh8YL/KS+tj3DrfUU03KCcmbHQesHBBcRxB6PDYjueAsvx5rGXzjMojO5Jz2DlZoSPaBM9tm/HAKWhaiL+seTfrRsNLFvxfWyxU/x0FUSOTf01ZThrT/IJ5WkfJD4UgZQSzUPucffImwFt4y2oERfa96sAwSwE4o5RuLzRdKuWB3npxcApj2/3+pyWR59yubokMiU506MI37Hbg8xCaC5qn4ISKB8WBJObICoNQoatrbcqSOrrUEFv/vcWANDYUEw6XzTTwkuIu6dJPJiJh8j5TzDnnvKSK+f3eEG7OCiz814F+o82tDo7U6k5ERO0xmElXdOlPYsiuM5+CTQmmm6xmFN2L3HIvZlyPn3oF26oV+INAd3XsF5MIFcfpGUXH5b04gE7LhpdVLVfLGGYSVWjZhzxl/Wa0OiHoMaDUYoN2bPG0h5SPUDIyDv2jW3FDxhOWANR/9ITUCQpz3gSwl/1AVN3HCWf+RUeLuE=
                                                            np0005541909.localdomain,192.168.122.103,np0005541909* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIA7RcuDge6wF/g+qZxY6m8WG6IEuMAvvdJQnnCjLs+Z1
                                                            np0005541909.localdomain,192.168.122.103,np0005541909* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP5sNXub2DBEGdchrrXonnWitouBamsCHQlfu1Eq48/u/VA5EJmoCHsMI/KSOMxMnSS+uUeGceHpl9AyeHtY2NU=
                                                            np0005541910.localdomain,192.168.122.104,np0005541910* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDOmh2HMG9Y5+9VA8Ap3pHIOQhG/GfAsIqnmfJJuGwKb8N2T9r1Yd+kmoP7Xs41cto4h6Fw1f4Pa6Tw050y3LmwpXvDN+2Qq1qYI0rT4pqOiYBkyMbOQhqLF5tA+MNYGdibQj/fWkG+gKa8wwzkTgCEAn6PgEZiqR9LFJrqr4RfQDxaWCLmXM96+AVGG5/SXWx5u6T3lanUnpcfISvB2yx4HifsINAHPgLR4weEzra/b7e0QNyxItxvlDseasPyeYHD3Hdi2PNuUmoZC+zWEoWoU3BMAQeXR7lmEcdtyK5wr0pIBmf0CKFdvGrdVWrzAUbDc8ZHXmWyKlWHHZvHch1V2r/S4J2983UsG3sJwM8954Tj325LgS1nldIYBSjwMGfhZFYzmy9obAN7ZSV5qwD0h+rxt/I9RNdXS3SRu9tOZI+AN59De44cF23OJS5MfrfnB7JUnBOv4ScVML4rPjPx9L4/omOlfbBVJx42b1RlboXEk52J7Aa3xRseA4Elvuk=
                                                            np0005541910.localdomain,192.168.122.104,np0005541910* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIx+QMGsIWmPvyCeFcRzy+Z3KrW6oIHjAujq2mTiluKE
                                                            np0005541910.localdomain,192.168.122.104,np0005541910* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPiujdvwsNBrUjQMVBj6TBCEcpbfZIgHcCBzjuRUWPac2ltR7NNO2aF0KEDTH4F4qoWK7fw0fn0UFKuTrY4INV8=
                                                            np0005541913.localdomain,192.168.122.107,np0005541913* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDYXeXWwxJkeR9i2V9hYiVGqEGSbkwFIKUbTm3m8em9m5o380jUORSYXOITLm0CAl/waSYEc4fiPu2sAYDISig1zqAItfAODEdayFoKK63ui7vq92ZPKayhmjahj2jNo3KMAZ5aFzNBcowsRooRqLNJ7R9BAQ4H8kdqL9xdRjy5bvfWJHGrm8PvWcUaRYebCQ35j+7nHq4RFRYsd964NKjrq+FxkjyOSs2AxE+SHYOVgAAd8Jp2uyr3dR56IzWy8WqQzPj6tlsER8+/Kt1lASATcuMFeteA0M7tbjZxEIAPyfktPVQOq9mgeFOFmTf8oTbt94Rk2QmyNI4oE7sQHFWo9UWrvZd9LpDDartUls5uHunn4SzvgvtRimO3e1hNXn0VQLGNfSUwGij0R3iOYJpACHgly3J7sbX3tROvwRpawZlGIGZY46vaYRMXGClXz+lUCa6ZZO+f6BX6bEt0VfYWX8IVmnH2oJXEJBYJPVXZML+OcczJc8zEfHxBylpZn4k=
                                                            np0005541913.localdomain,192.168.122.107,np0005541913* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEGKyrd1x8JIpNEVeXNPog2z4+Z1Gyh32lFLn9uh2H3I
                                                            np0005541913.localdomain,192.168.122.107,np0005541913* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAGOHjEHyYQ71qgLjQWD4LGL0rAKniN6cBK/Yx+b+dGqDveVXKGlkaXQOOfCp4GEX5fDI6bqBjCB02Ool/6wTT8=
                                                            np0005541911.localdomain,192.168.122.105,np0005541911* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzI5YTDMvj8zBlKqeNplIMBQQJ43gcDfB5cRE7DwwpHBRcqOuhSoIm7r0C3h5ABQJYkTXEGRY0i5HC5eMErD7SKRJJ3q9aZ+uv4VvUGagr7M9S/JGUjZej2+ACXZ7L+d9MLt389xVtIuuNh5Cy3U8muIBEAS1b4mXOJ95eiW3M5b2hxmol0DTjUMX/bLtJU/MQ09wE72pj6Uqz/CCFsUwDBZlQ3jcVK74fYwgItCNkLJ+D2E4wTl4Ei8XOlEY9cV8B1E+aK6iUKesiya0Vfi/Ant77ONQDeCsI21AJDbi5wtUXg4qXBu3Z/zObZiEmedzqWj7K46Nv8lDlQoeoKuxzTCwxgn0PaorQgkUvUdAyk5Qo4BaUOv8ojICiZvRy9QZ3jblr1dCM/Jy3g4Sz6Hz4QHxtV21nUw//sBN2X6jCHQVGTJeZrbVvgGNcGiqcCzQTW/4NoiOB0ho7RVNtD+oYb5UE+Lh+Ibua3bv7zfnLjsw1GiyclsCgrQTKBl8Netc=
                                                            np0005541911.localdomain,192.168.122.105,np0005541911* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILT7VjxC/vKVj4DmZTIjCQwrK+UN5wih4A5ddEFb5wLX
                                                            np0005541911.localdomain,192.168.122.105,np0005541911* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEJ5o8j1+/xDc8zMV2yChXY+U6nf1GT6sS3GGAkd+aR/6mUWuiQzjkFESsidYGPHaqz55q4REeXXQtW6T8mmqzU=
                                                            np0005541912.localdomain,192.168.122.106,np0005541912* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDKgyHtHHKWFdaOqx5AsvOJPmNsbjVxvzh05A7Hy02rgbdg4zBUd/E0mqG+tYVGg12fIdbRNgjUfM+PEGJznZdEQnZCtLgMhbpRC33IbCXMw7Ev/tRfkffpP+H8VdyGL83zCFFnMIMD2IDWU+MjTf/ais63Zv/UiBL24pkZ18u3nypjN3uN2FdeDF4JNtnSVK6i1a+wE6wLmdSAfX8ovFbLhZMgAAPU3I3Fu5D/pSa6OjKshEcNy0m6KCKwQoT6cbDGsnMjd2sdE1Vc+KgkrBN3fMmrChdgi2Ig7CpkdGvQF0G/t53cwNatjp78FrNCHjpLcIAFw3QgfepiTiXQbXQ/jC5xkdM+5wIcSmB3rf3GKaUgaxnjk55GAXxrHwAFwOi+ltxSNPszH9vfIBLluThUdmQmvtCOCvEFZ5uuVuu94A5frS9BzOIzz7ylrqau3nHGaPjbT80XubnqZsHlOahsovbk1mu3ewvoitAVb0E+BBroNWeHT9BbA8Igh+sxwGM=
                                                            np0005541912.localdomain,192.168.122.106,np0005541912* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJZZ0KsiMflqlnr0GTYoucjExbwZ18yPSOiSsfRMt90v
                                                            np0005541912.localdomain,192.168.122.106,np0005541912* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGm4CXNWO0ZHMO4eJHc4n6NO7LQlY2+Ctp7F81Y3AEXQl3GIl2c/UCuL0O5ZJj6nEB654FSLAuOOifViFW8rlDc=
                                                             create=True mode=0644 path=/tmp/ansible.cabvjj22 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:21:56 np0005541913.localdomain sudo[143551]: pam_unix(sudo:session): session closed for user root
Dec 02 09:21:58 np0005541913.localdomain sudo[143643]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiuhhtioxtglqqtdftidphuqmwjdgsex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667317.8153365-377-27500996278026/AnsiballZ_command.py
Dec 02 09:21:58 np0005541913.localdomain sudo[143643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:21:58 np0005541913.localdomain python3.9[143645]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.cabvjj22' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:21:58 np0005541913.localdomain sudo[143643]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:00 np0005541913.localdomain sudo[143737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpcxpaaywltbxbqyhbgjtaqcmiwzakgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667319.1732063-425-25372338500630/AnsiballZ_file.py
Dec 02 09:22:00 np0005541913.localdomain sudo[143737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:00 np0005541913.localdomain python3.9[143739]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.cabvjj22 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:00 np0005541913.localdomain sudo[143737]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:01 np0005541913.localdomain sshd[142672]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:22:01 np0005541913.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Dec 02 09:22:01 np0005541913.localdomain systemd[1]: session-45.scope: Consumed 4.216s CPU time.
Dec 02 09:22:01 np0005541913.localdomain systemd-logind[757]: Session 45 logged out. Waiting for processes to exit.
Dec 02 09:22:01 np0005541913.localdomain systemd-logind[757]: Removed session 45.
Dec 02 09:22:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31968 DF PROTO=TCP SPT=42600 DPT=9102 SEQ=1946441030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E0BAD0000000001030307) 
Dec 02 09:22:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4054 DF PROTO=TCP SPT=42590 DPT=9105 SEQ=3265164041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E0C1F0000000001030307) 
Dec 02 09:22:06 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8300 DF PROTO=TCP SPT=50462 DPT=9882 SEQ=1476744479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E13DC0000000001030307) 
Dec 02 09:22:07 np0005541913.localdomain sshd[143754]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:22:07 np0005541913.localdomain sshd[143754]: Accepted publickey for zuul from 192.168.122.30 port 60892 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:22:07 np0005541913.localdomain systemd-logind[757]: New session 46 of user zuul.
Dec 02 09:22:07 np0005541913.localdomain systemd[1]: Started Session 46 of User zuul.
Dec 02 09:22:07 np0005541913.localdomain sshd[143754]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:22:09 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50099 DF PROTO=TCP SPT=34458 DPT=9100 SEQ=3341923684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E1F8B0000000001030307) 
Dec 02 09:22:09 np0005541913.localdomain python3.9[143847]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:22:10 np0005541913.localdomain sudo[143941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idjkttwnvjsditonlnjpdpdyjmaptwtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667330.0968938-57-50380269921669/AnsiballZ_systemd.py
Dec 02 09:22:10 np0005541913.localdomain sudo[143941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:10 np0005541913.localdomain python3.9[143943]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 09:22:11 np0005541913.localdomain sudo[143941]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:11 np0005541913.localdomain sudo[144035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sufhzcjvkwxmgqqkmaoprsleqgisjsua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667331.1954126-81-112788740698623/AnsiballZ_systemd.py
Dec 02 09:22:11 np0005541913.localdomain sudo[144035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:11 np0005541913.localdomain python3.9[144037]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:22:11 np0005541913.localdomain sudo[144035]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:12 np0005541913.localdomain sudo[144128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhrcygcgvmpqznpweqaskqnphtlicjfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667331.9980762-108-263518026499046/AnsiballZ_command.py
Dec 02 09:22:12 np0005541913.localdomain sudo[144128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:12 np0005541913.localdomain python3.9[144130]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:22:12 np0005541913.localdomain sudo[144128]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:13 np0005541913.localdomain sudo[144221]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-carldmayntyvyexdsdfeqzuohmirafvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667332.7296185-132-167116616769097/AnsiballZ_stat.py
Dec 02 09:22:13 np0005541913.localdomain sudo[144221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:13 np0005541913.localdomain python3.9[144223]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:22:13 np0005541913.localdomain sudo[144221]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:13 np0005541913.localdomain sudo[144315]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-todkxoptlkaxbjouuofndxkxqypnkvos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667333.507437-156-200482133285320/AnsiballZ_command.py
Dec 02 09:22:13 np0005541913.localdomain sudo[144315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:13 np0005541913.localdomain python3.9[144317]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:22:14 np0005541913.localdomain sudo[144315]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:14 np0005541913.localdomain sudo[144410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukqmxpwtolxtbhlbcaoychjnqwpgqnxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667334.1534195-180-261202409666923/AnsiballZ_file.py
Dec 02 09:22:14 np0005541913.localdomain sudo[144410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:14 np0005541913.localdomain python3.9[144412]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:14 np0005541913.localdomain sudo[144410]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:15 np0005541913.localdomain sshd[143754]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:22:15 np0005541913.localdomain systemd-logind[757]: Session 46 logged out. Waiting for processes to exit.
Dec 02 09:22:15 np0005541913.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Dec 02 09:22:15 np0005541913.localdomain systemd[1]: session-46.scope: Consumed 3.973s CPU time.
Dec 02 09:22:15 np0005541913.localdomain systemd-logind[757]: Removed session 46.
Dec 02 09:22:21 np0005541913.localdomain sshd[144427]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:22:21 np0005541913.localdomain sshd[144427]: Accepted publickey for zuul from 192.168.122.30 port 43562 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:22:21 np0005541913.localdomain systemd-logind[757]: New session 47 of user zuul.
Dec 02 09:22:21 np0005541913.localdomain systemd[1]: Started Session 47 of User zuul.
Dec 02 09:22:21 np0005541913.localdomain sshd[144427]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:22:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12459 DF PROTO=TCP SPT=60278 DPT=9101 SEQ=3332007714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E530F0000000001030307) 
Dec 02 09:22:22 np0005541913.localdomain python3.9[144520]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:22:23 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12460 DF PROTO=TCP SPT=60278 DPT=9101 SEQ=3332007714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E57240000000001030307) 
Dec 02 09:22:23 np0005541913.localdomain sudo[144614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kukuhinfddwlmzwwwtfxzaucecguadju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667343.0865822-63-81911651931586/AnsiballZ_setup.py
Dec 02 09:22:23 np0005541913.localdomain sudo[144614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:23 np0005541913.localdomain python3.9[144616]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:22:24 np0005541913.localdomain sudo[144614]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:24 np0005541913.localdomain sudo[144668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hknplpfvxubzzuwcttjygtuwjdqzytdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667343.0865822-63-81911651931586/AnsiballZ_dnf.py
Dec 02 09:22:24 np0005541913.localdomain sudo[144668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:24 np0005541913.localdomain python3.9[144670]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 09:22:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12461 DF PROTO=TCP SPT=60278 DPT=9101 SEQ=3332007714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E5F240000000001030307) 
Dec 02 09:22:28 np0005541913.localdomain sudo[144668]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:28 np0005541913.localdomain python3.9[144762]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:22:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12462 DF PROTO=TCP SPT=60278 DPT=9101 SEQ=3332007714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E6EE40000000001030307) 
Dec 02 09:22:30 np0005541913.localdomain sudo[144853]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpovascjwfxxpsrfjdpdifzkevlkuxwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667350.1748142-126-63142375040062/AnsiballZ_file.py
Dec 02 09:22:30 np0005541913.localdomain sudo[144853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:30 np0005541913.localdomain python3.9[144855]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:30 np0005541913.localdomain sudo[144853]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:31 np0005541913.localdomain sudo[144945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osvixflijryefkivgnxngaavskcpieve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667350.9675307-150-217927820979879/AnsiballZ_file.py
Dec 02 09:22:31 np0005541913.localdomain sudo[144945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:31 np0005541913.localdomain python3.9[144947]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:31 np0005541913.localdomain sudo[144945]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:32 np0005541913.localdomain sudo[145037]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayocwiusyjughitmqiiqwxlyauhupzhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667351.9398718-174-142414688971500/AnsiballZ_lineinfile.py
Dec 02 09:22:32 np0005541913.localdomain sudo[145037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:32 np0005541913.localdomain python3.9[145039]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:32 np0005541913.localdomain sudo[145037]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:33 np0005541913.localdomain python3.9[145129]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:22:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62073 DF PROTO=TCP SPT=43312 DPT=9102 SEQ=1471303209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E80CE0000000001030307) 
Dec 02 09:22:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19204 DF PROTO=TCP SPT=52262 DPT=9105 SEQ=1934263459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E814F0000000001030307) 
Dec 02 09:22:34 np0005541913.localdomain python3.9[145219]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:22:34 np0005541913.localdomain python3.9[145311]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:22:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62074 DF PROTO=TCP SPT=43312 DPT=9102 SEQ=1471303209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E84E40000000001030307) 
Dec 02 09:22:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19205 DF PROTO=TCP SPT=52262 DPT=9105 SEQ=1934263459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E85640000000001030307) 
Dec 02 09:22:35 np0005541913.localdomain sshd[144427]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:22:35 np0005541913.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Dec 02 09:22:35 np0005541913.localdomain systemd[1]: session-47.scope: Consumed 9.854s CPU time.
Dec 02 09:22:35 np0005541913.localdomain systemd-logind[757]: Session 47 logged out. Waiting for processes to exit.
Dec 02 09:22:35 np0005541913.localdomain systemd-logind[757]: Removed session 47.
Dec 02 09:22:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64815 DF PROTO=TCP SPT=51430 DPT=9882 SEQ=2114928465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E890C0000000001030307) 
Dec 02 09:22:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62075 DF PROTO=TCP SPT=43312 DPT=9102 SEQ=1471303209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E8CE40000000001030307) 
Dec 02 09:22:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64816 DF PROTO=TCP SPT=51430 DPT=9882 SEQ=2114928465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E8D240000000001030307) 
Dec 02 09:22:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19206 DF PROTO=TCP SPT=52262 DPT=9105 SEQ=1934263459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E8D650000000001030307) 
Dec 02 09:22:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24045 DF PROTO=TCP SPT=46790 DPT=9100 SEQ=46980730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478E98E40000000001030307) 
Dec 02 09:22:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64818 DF PROTO=TCP SPT=51430 DPT=9882 SEQ=2114928465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478EA4E50000000001030307) 
Dec 02 09:22:43 np0005541913.localdomain sshd[145326]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:22:43 np0005541913.localdomain sshd[145326]: Accepted publickey for zuul from 192.168.122.30 port 41704 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:22:43 np0005541913.localdomain systemd-logind[757]: New session 48 of user zuul.
Dec 02 09:22:43 np0005541913.localdomain systemd[1]: Started Session 48 of User zuul.
Dec 02 09:22:43 np0005541913.localdomain sshd[145326]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:22:44 np0005541913.localdomain python3.9[145419]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:22:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24047 DF PROTO=TCP SPT=46790 DPT=9100 SEQ=46980730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478EB0A40000000001030307) 
Dec 02 09:22:46 np0005541913.localdomain sudo[145513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vobjvtfcdtmmkoozdxsxasontminknbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667366.0600305-161-70719726935240/AnsiballZ_file.py
Dec 02 09:22:46 np0005541913.localdomain sudo[145513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:46 np0005541913.localdomain python3.9[145515]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:22:46 np0005541913.localdomain sudo[145513]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:47 np0005541913.localdomain sudo[145605]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixdfohshofincmfkrlmvarrbafaoldci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667366.7862518-182-237833545214715/AnsiballZ_stat.py
Dec 02 09:22:47 np0005541913.localdomain sudo[145605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:47 np0005541913.localdomain python3.9[145607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:22:47 np0005541913.localdomain sudo[145605]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:48 np0005541913.localdomain sudo[145678]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kegikebbipwundooedbxdvewmankdbvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667366.7862518-182-237833545214715/AnsiballZ_copy.py
Dec 02 09:22:48 np0005541913.localdomain sudo[145678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:48 np0005541913.localdomain python3.9[145680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667366.7862518-182-237833545214715/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:48 np0005541913.localdomain sudo[145678]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:48 np0005541913.localdomain sudo[145770]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aquemiufolvntxrmwthswxdepuavwljb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667368.4951851-232-140754725437905/AnsiballZ_file.py
Dec 02 09:22:48 np0005541913.localdomain sudo[145770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:48 np0005541913.localdomain python3.9[145772]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:22:48 np0005541913.localdomain sudo[145770]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:49 np0005541913.localdomain sudo[145862]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oriaiekrycyfpzvaeirpicormwrrsxat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667369.0751545-253-39627067167492/AnsiballZ_stat.py
Dec 02 09:22:49 np0005541913.localdomain sudo[145862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62077 DF PROTO=TCP SPT=43312 DPT=9102 SEQ=1471303209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478EBDE40000000001030307) 
Dec 02 09:22:49 np0005541913.localdomain python3.9[145864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:22:49 np0005541913.localdomain sudo[145862]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:50 np0005541913.localdomain sudo[145935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-przcegzrfrcktpfdvbvraiahqhvpxcnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667369.0751545-253-39627067167492/AnsiballZ_copy.py
Dec 02 09:22:50 np0005541913.localdomain sudo[145935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:50 np0005541913.localdomain python3.9[145937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667369.0751545-253-39627067167492/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:50 np0005541913.localdomain sudo[145935]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:50 np0005541913.localdomain sudo[146027]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvqfogdxaltxedhbxigisxkzxitodeoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667370.6606874-300-52328201432466/AnsiballZ_file.py
Dec 02 09:22:50 np0005541913.localdomain sudo[146027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:51 np0005541913.localdomain python3.9[146029]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:22:51 np0005541913.localdomain sudo[146027]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:51 np0005541913.localdomain chronyd[137402]: Selected source 162.159.200.123 (pool.ntp.org)
Dec 02 09:22:51 np0005541913.localdomain sudo[146119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbhgmwcnqsrtuhxaoxnqvpnxlgjzcsvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667371.2581842-324-253539572682250/AnsiballZ_stat.py
Dec 02 09:22:51 np0005541913.localdomain sudo[146119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:51 np0005541913.localdomain python3.9[146121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:22:51 np0005541913.localdomain sudo[146119]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:52 np0005541913.localdomain sudo[146192]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwzatpuadarajsktigypykrdzcbalniz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667371.2581842-324-253539572682250/AnsiballZ_copy.py
Dec 02 09:22:52 np0005541913.localdomain sudo[146192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36387 DF PROTO=TCP SPT=59992 DPT=9101 SEQ=1923884833 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478EC8400000000001030307) 
Dec 02 09:22:52 np0005541913.localdomain python3.9[146194]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667371.2581842-324-253539572682250/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:52 np0005541913.localdomain sudo[146192]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:52 np0005541913.localdomain sudo[146284]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgfogyaosnjdbybxpixdurqedkejnqkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667372.6001482-365-49453322537608/AnsiballZ_file.py
Dec 02 09:22:52 np0005541913.localdomain sudo[146284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:53 np0005541913.localdomain python3.9[146286]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:22:53 np0005541913.localdomain sudo[146284]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:53 np0005541913.localdomain sudo[146376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uapdycgpjwtvuzabjghziidrklrwzjqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667373.2502956-392-32427731805147/AnsiballZ_stat.py
Dec 02 09:22:53 np0005541913.localdomain sudo[146376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:53 np0005541913.localdomain python3.9[146378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:22:53 np0005541913.localdomain sudo[146376]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:54 np0005541913.localdomain sudo[146449]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaxwpuoxlqedkagrhdhdfvnofpfqsjwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667373.2502956-392-32427731805147/AnsiballZ_copy.py
Dec 02 09:22:54 np0005541913.localdomain sudo[146449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:54 np0005541913.localdomain python3.9[146451]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667373.2502956-392-32427731805147/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:54 np0005541913.localdomain sudo[146449]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:54 np0005541913.localdomain sudo[146541]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlpvrsrqhvloetldzvpljtfeqodpsafr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667374.6199422-440-168115344824082/AnsiballZ_file.py
Dec 02 09:22:54 np0005541913.localdomain sudo[146541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:55 np0005541913.localdomain python3.9[146543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:22:55 np0005541913.localdomain sudo[146541]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36389 DF PROTO=TCP SPT=59992 DPT=9101 SEQ=1923884833 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478ED4640000000001030307) 
Dec 02 09:22:55 np0005541913.localdomain sudo[146633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrhfbyemmyfugyvdjdcacjltwsosgjns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667375.2379694-464-190194588618507/AnsiballZ_stat.py
Dec 02 09:22:55 np0005541913.localdomain sudo[146633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:55 np0005541913.localdomain sudo[146636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:22:55 np0005541913.localdomain sudo[146636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:22:55 np0005541913.localdomain sudo[146636]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:55 np0005541913.localdomain python3.9[146635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:22:55 np0005541913.localdomain sudo[146633]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:55 np0005541913.localdomain sudo[146651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:22:55 np0005541913.localdomain sudo[146651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:22:56 np0005541913.localdomain sudo[146736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpwkrbjjkqygcxqtyxbphxoqwkzphdxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667375.2379694-464-190194588618507/AnsiballZ_copy.py
Dec 02 09:22:56 np0005541913.localdomain sudo[146736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:56 np0005541913.localdomain python3.9[146739]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667375.2379694-464-190194588618507/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:56 np0005541913.localdomain sudo[146736]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:56 np0005541913.localdomain sudo[146651]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:56 np0005541913.localdomain sudo[146859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgihbnasmrskegbpqlakyxceqcjxogmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667376.458752-509-233812079372316/AnsiballZ_file.py
Dec 02 09:22:56 np0005541913.localdomain sudo[146859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:56 np0005541913.localdomain python3.9[146861]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:22:56 np0005541913.localdomain sudo[146859]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:57 np0005541913.localdomain sudo[146876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:22:57 np0005541913.localdomain sudo[146876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:22:57 np0005541913.localdomain sudo[146876]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:57 np0005541913.localdomain sudo[146966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zoxqsnpmtgzkzxepzegwaykuyywxahvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667377.0993915-534-211431127967123/AnsiballZ_stat.py
Dec 02 09:22:57 np0005541913.localdomain sudo[146966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:57 np0005541913.localdomain python3.9[146968]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:22:57 np0005541913.localdomain sudo[146966]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:57 np0005541913.localdomain sudo[147039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlsgqfupnvwivspdmtypfcsfrrmjmord ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667377.0993915-534-211431127967123/AnsiballZ_copy.py
Dec 02 09:22:57 np0005541913.localdomain sudo[147039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:58 np0005541913.localdomain python3.9[147041]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667377.0993915-534-211431127967123/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:58 np0005541913.localdomain sudo[147039]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:58 np0005541913.localdomain sudo[147131]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-renshnolzcyypkoihpbrdmdfizpadlcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667378.3123384-579-101949115196749/AnsiballZ_file.py
Dec 02 09:22:58 np0005541913.localdomain sudo[147131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:58 np0005541913.localdomain python3.9[147133]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:22:58 np0005541913.localdomain sudo[147131]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:59 np0005541913.localdomain sudo[147223]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfijpykqaaikqouhtqtmmuanduxhmwiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667378.9759178-604-188095820318620/AnsiballZ_stat.py
Dec 02 09:22:59 np0005541913.localdomain sudo[147223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36390 DF PROTO=TCP SPT=59992 DPT=9101 SEQ=1923884833 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478EE4240000000001030307) 
Dec 02 09:22:59 np0005541913.localdomain python3.9[147225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:22:59 np0005541913.localdomain sudo[147223]: pam_unix(sudo:session): session closed for user root
Dec 02 09:22:59 np0005541913.localdomain sudo[147296]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oagunppkgzodecvfqezdnzjyxquecsel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667378.9759178-604-188095820318620/AnsiballZ_copy.py
Dec 02 09:22:59 np0005541913.localdomain sudo[147296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:22:59 np0005541913.localdomain python3.9[147298]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667378.9759178-604-188095820318620/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:22:59 np0005541913.localdomain sudo[147296]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:00 np0005541913.localdomain sudo[147388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jykuisgxjmrxccmgsnokkmzsjbcgugwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667380.4525235-649-103762045278651/AnsiballZ_file.py
Dec 02 09:23:00 np0005541913.localdomain sudo[147388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:00 np0005541913.localdomain python3.9[147390]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:23:00 np0005541913.localdomain sudo[147388]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:01 np0005541913.localdomain sudo[147480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfhxybdwmqyijdmnvjnagnljwwzrcwgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667381.0581806-670-115853626865740/AnsiballZ_stat.py
Dec 02 09:23:01 np0005541913.localdomain sudo[147480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:01 np0005541913.localdomain python3.9[147482]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:23:01 np0005541913.localdomain sudo[147480]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:02 np0005541913.localdomain sudo[147553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjkrgeeltuckescldmdoiaknbwvlnydd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667381.0581806-670-115853626865740/AnsiballZ_copy.py
Dec 02 09:23:02 np0005541913.localdomain sudo[147553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:02 np0005541913.localdomain python3.9[147555]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667381.0581806-670-115853626865740/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=73226dd0fbcefd6bca2e777d65fae037e6bf10fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:23:02 np0005541913.localdomain sudo[147553]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44924 DF PROTO=TCP SPT=51692 DPT=9102 SEQ=2714315369 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478EF5FD0000000001030307) 
Dec 02 09:23:03 np0005541913.localdomain sshd[145326]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:23:03 np0005541913.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Dec 02 09:23:03 np0005541913.localdomain systemd[1]: session-48.scope: Consumed 12.104s CPU time.
Dec 02 09:23:03 np0005541913.localdomain systemd-logind[757]: Session 48 logged out. Waiting for processes to exit.
Dec 02 09:23:03 np0005541913.localdomain systemd-logind[757]: Removed session 48.
Dec 02 09:23:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6054 DF PROTO=TCP SPT=34474 DPT=9105 SEQ=2347424361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478EF67F0000000001030307) 
Dec 02 09:23:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44926 DF PROTO=TCP SPT=51692 DPT=9102 SEQ=2714315369 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F02240000000001030307) 
Dec 02 09:23:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11815 DF PROTO=TCP SPT=38638 DPT=9100 SEQ=3358605897 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F0DE40000000001030307) 
Dec 02 09:23:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27888 DF PROTO=TCP SPT=57682 DPT=9882 SEQ=1675163659 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F1A240000000001030307) 
Dec 02 09:23:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11817 DF PROTO=TCP SPT=38638 DPT=9100 SEQ=3358605897 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F25A50000000001030307) 
Dec 02 09:23:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6058 DF PROTO=TCP SPT=34474 DPT=9105 SEQ=2347424361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F31E40000000001030307) 
Dec 02 09:23:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32648 DF PROTO=TCP SPT=58712 DPT=9101 SEQ=3898644437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F3D700000000001030307) 
Dec 02 09:23:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32650 DF PROTO=TCP SPT=58712 DPT=9101 SEQ=3898644437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F49640000000001030307) 
Dec 02 09:23:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32651 DF PROTO=TCP SPT=58712 DPT=9101 SEQ=3898644437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F59240000000001030307) 
Dec 02 09:23:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11928 DF PROTO=TCP SPT=57130 DPT=9102 SEQ=1338226220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F6B2D0000000001030307) 
Dec 02 09:23:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2690 DF PROTO=TCP SPT=59382 DPT=9105 SEQ=1675744302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F6BAF0000000001030307) 
Dec 02 09:23:35 np0005541913.localdomain sshd[147571]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:23:35 np0005541913.localdomain sshd[147571]: Accepted publickey for zuul from 192.168.122.30 port 47202 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:23:35 np0005541913.localdomain systemd-logind[757]: New session 49 of user zuul.
Dec 02 09:23:35 np0005541913.localdomain systemd[1]: Started Session 49 of User zuul.
Dec 02 09:23:35 np0005541913.localdomain sshd[147571]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:23:36 np0005541913.localdomain sudo[147664]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdwjfbvcyzqjanzvbbybbggtgqawbnjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667416.0300996-27-231630281612730/AnsiballZ_file.py
Dec 02 09:23:36 np0005541913.localdomain sudo[147664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:36 np0005541913.localdomain python3.9[147666]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:23:36 np0005541913.localdomain sudo[147664]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11930 DF PROTO=TCP SPT=57130 DPT=9102 SEQ=1338226220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F77240000000001030307) 
Dec 02 09:23:37 np0005541913.localdomain sudo[147756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbrtcxprxmkgxmezybztbvwbeutepqfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667416.8278933-63-246619730869614/AnsiballZ_stat.py
Dec 02 09:23:37 np0005541913.localdomain sudo[147756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:37 np0005541913.localdomain python3.9[147758]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:23:37 np0005541913.localdomain sudo[147756]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:37 np0005541913.localdomain sudo[147829]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcxfoatqgubnlnolrfgxzigailhoocpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667416.8278933-63-246619730869614/AnsiballZ_copy.py
Dec 02 09:23:37 np0005541913.localdomain sudo[147829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:38 np0005541913.localdomain python3.9[147831]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667416.8278933-63-246619730869614/.source.conf _original_basename=ceph.conf follow=False checksum=bb050c8012c4b6ce73dbd1d555a91a361a703a4d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:23:38 np0005541913.localdomain sudo[147829]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:38 np0005541913.localdomain sudo[147921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojglkozlupsuxwlbdavmwtayhtpbqiui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667418.1812358-63-26626235553072/AnsiballZ_stat.py
Dec 02 09:23:38 np0005541913.localdomain sudo[147921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:38 np0005541913.localdomain python3.9[147923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:23:38 np0005541913.localdomain sudo[147921]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:38 np0005541913.localdomain sudo[147994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjxzouuhkcrpctdzmzpkqivwlwltaahd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667418.1812358-63-26626235553072/AnsiballZ_copy.py
Dec 02 09:23:38 np0005541913.localdomain sudo[147994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:39 np0005541913.localdomain python3.9[147996]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667418.1812358-63-26626235553072/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=55e6802793866e8195bd7dc6c06395cc4184e741 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:23:39 np0005541913.localdomain sudo[147994]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:39 np0005541913.localdomain sshd[147571]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:23:39 np0005541913.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Dec 02 09:23:39 np0005541913.localdomain systemd[1]: session-49.scope: Consumed 2.272s CPU time.
Dec 02 09:23:39 np0005541913.localdomain systemd-logind[757]: Session 49 logged out. Waiting for processes to exit.
Dec 02 09:23:39 np0005541913.localdomain systemd-logind[757]: Removed session 49.
Dec 02 09:23:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53855 DF PROTO=TCP SPT=37604 DPT=9100 SEQ=2804230835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F83240000000001030307) 
Dec 02 09:23:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52376 DF PROTO=TCP SPT=40320 DPT=9882 SEQ=2329003095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F8F240000000001030307) 
Dec 02 09:23:45 np0005541913.localdomain sshd[148012]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:23:45 np0005541913.localdomain sshd[148012]: Accepted publickey for zuul from 192.168.122.30 port 34318 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:23:45 np0005541913.localdomain systemd-logind[757]: New session 50 of user zuul.
Dec 02 09:23:45 np0005541913.localdomain systemd[1]: Started Session 50 of User zuul.
Dec 02 09:23:45 np0005541913.localdomain sshd[148012]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:23:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53857 DF PROTO=TCP SPT=37604 DPT=9100 SEQ=2804230835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478F9AE50000000001030307) 
Dec 02 09:23:46 np0005541913.localdomain python3.9[148105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:23:47 np0005541913.localdomain sudo[148199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yikqeqvwjhmrzhslpvfjhbujsqbpksgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667427.0385284-63-85713560890181/AnsiballZ_file.py
Dec 02 09:23:47 np0005541913.localdomain sudo[148199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:47 np0005541913.localdomain python3.9[148201]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:23:47 np0005541913.localdomain sudo[148199]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:47 np0005541913.localdomain sudo[148291]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vugxjaohtdonovyxznufdyfdkdfamomj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667427.6843252-63-210066025660566/AnsiballZ_file.py
Dec 02 09:23:47 np0005541913.localdomain sudo[148291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:48 np0005541913.localdomain python3.9[148293]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:23:48 np0005541913.localdomain sudo[148291]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:48 np0005541913.localdomain python3.9[148383]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:23:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2694 DF PROTO=TCP SPT=59382 DPT=9105 SEQ=1675744302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478FA7E50000000001030307) 
Dec 02 09:23:50 np0005541913.localdomain sudo[148473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyjdhhmditparpcleorrmweyrtmzbkcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667429.8331394-132-68187981962165/AnsiballZ_seboolean.py
Dec 02 09:23:50 np0005541913.localdomain sudo[148473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:50 np0005541913.localdomain python3.9[148475]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 02 09:23:50 np0005541913.localdomain sudo[148473]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:51 np0005541913.localdomain sudo[148565]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qunqgqgvwblgoxyfgwfqcvvnenmsoplj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667431.5280526-162-248469637574602/AnsiballZ_setup.py
Dec 02 09:23:51 np0005541913.localdomain sudo[148565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:52 np0005541913.localdomain python3.9[148567]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:23:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43256 DF PROTO=TCP SPT=59866 DPT=9101 SEQ=553843952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478FB29F0000000001030307) 
Dec 02 09:23:52 np0005541913.localdomain sudo[148565]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:52 np0005541913.localdomain sudo[148619]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdzjtrshjfgiutzkxpspwzpjkihubxnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667431.5280526-162-248469637574602/AnsiballZ_dnf.py
Dec 02 09:23:52 np0005541913.localdomain sudo[148619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:53 np0005541913.localdomain python3.9[148621]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:23:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43258 DF PROTO=TCP SPT=59866 DPT=9101 SEQ=553843952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478FBEA40000000001030307) 
Dec 02 09:23:56 np0005541913.localdomain sudo[148619]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:57 np0005541913.localdomain sudo[148713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfuqgwjpzdpottheadcrnjsgdtgvndep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667436.5610719-198-256606936523220/AnsiballZ_systemd.py
Dec 02 09:23:57 np0005541913.localdomain sudo[148713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:57 np0005541913.localdomain sudo[148716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:23:57 np0005541913.localdomain sudo[148716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:23:57 np0005541913.localdomain sudo[148716]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:57 np0005541913.localdomain python3.9[148715]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:23:57 np0005541913.localdomain sudo[148731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:23:57 np0005541913.localdomain sudo[148731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:23:57 np0005541913.localdomain sudo[148713]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:58 np0005541913.localdomain sudo[148867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfhvchbnrsmhybfogyvdeqinwiftfvrq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764667437.6714933-222-19539811129356/AnsiballZ_edpm_nftables_snippet.py
Dec 02 09:23:58 np0005541913.localdomain sudo[148867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:58 np0005541913.localdomain sudo[148731]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:58 np0005541913.localdomain python3[148871]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 02 09:23:58 np0005541913.localdomain sudo[148867]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:58 np0005541913.localdomain sudo[148918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:23:58 np0005541913.localdomain sudo[148918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:23:58 np0005541913.localdomain sudo[148918]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:58 np0005541913.localdomain sudo[148976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrdogesimdtxbyzgrfocadzjljqaxlww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667438.564255-249-41785718341299/AnsiballZ_file.py
Dec 02 09:23:58 np0005541913.localdomain sudo[148976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:58 np0005541913.localdomain python3.9[148978]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:23:59 np0005541913.localdomain sudo[148976]: pam_unix(sudo:session): session closed for user root
Dec 02 09:23:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43259 DF PROTO=TCP SPT=59866 DPT=9101 SEQ=553843952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478FCE650000000001030307) 
Dec 02 09:23:59 np0005541913.localdomain sudo[149068]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uloltublzpenzwwlriootqxlqeqhwdei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667439.182615-273-221713751626454/AnsiballZ_stat.py
Dec 02 09:23:59 np0005541913.localdomain sudo[149068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:23:59 np0005541913.localdomain python3.9[149070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:23:59 np0005541913.localdomain sudo[149068]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:00 np0005541913.localdomain sudo[149116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pozcjlahtgtbgjxylvhwsxepfcajhgtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667439.182615-273-221713751626454/AnsiballZ_file.py
Dec 02 09:24:00 np0005541913.localdomain sudo[149116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:00 np0005541913.localdomain python3.9[149118]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:00 np0005541913.localdomain sudo[149116]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:01 np0005541913.localdomain sudo[149208]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmrfkzpgwzslgwgskgbxjhekpmmnvbsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667440.8443136-309-219098357259159/AnsiballZ_stat.py
Dec 02 09:24:01 np0005541913.localdomain sudo[149208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:01 np0005541913.localdomain python3.9[149210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:01 np0005541913.localdomain sudo[149208]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:01 np0005541913.localdomain sudo[149256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqtagyjihsxfwofoytmnqlhswrhesaou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667440.8443136-309-219098357259159/AnsiballZ_file.py
Dec 02 09:24:01 np0005541913.localdomain sudo[149256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:01 np0005541913.localdomain python3.9[149258]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rtbs0qex recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:01 np0005541913.localdomain sudo[149256]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:02 np0005541913.localdomain sudo[149348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvczeolvhpnlksiaisesmwjyfitjgsgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667442.2007906-345-209073016759543/AnsiballZ_stat.py
Dec 02 09:24:02 np0005541913.localdomain sudo[149348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:02 np0005541913.localdomain python3.9[149350]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:02 np0005541913.localdomain sudo[149348]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:02 np0005541913.localdomain sudo[149396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hayvrdsyhzvonnorpmzetdakhzuxzveo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667442.2007906-345-209073016759543/AnsiballZ_file.py
Dec 02 09:24:02 np0005541913.localdomain sudo[149396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:03 np0005541913.localdomain python3.9[149398]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:03 np0005541913.localdomain sudo[149396]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:03 np0005541913.localdomain sudo[149488]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctqeqteeblfieckplnghpowlkdbbdifj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667443.347314-384-31900455550515/AnsiballZ_command.py
Dec 02 09:24:03 np0005541913.localdomain sudo[149488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43689 DF PROTO=TCP SPT=42948 DPT=9102 SEQ=538564102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478FE05E0000000001030307) 
Dec 02 09:24:03 np0005541913.localdomain python3.9[149490]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:03 np0005541913.localdomain sudo[149488]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46099 DF PROTO=TCP SPT=33754 DPT=9105 SEQ=3098192455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478FE0DF0000000001030307) 
Dec 02 09:24:05 np0005541913.localdomain sudo[149581]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsikvhayxcrvdmioipueuknfeffbaplw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764667445.0741873-408-237376784383615/AnsiballZ_edpm_nftables_from_files.py
Dec 02 09:24:05 np0005541913.localdomain sudo[149581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:05 np0005541913.localdomain python3[149583]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 09:24:05 np0005541913.localdomain sudo[149581]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:06 np0005541913.localdomain sudo[149673]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iykjemisvfgbjyibseridtgvujnsjxvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667445.824281-432-109213117566183/AnsiballZ_stat.py
Dec 02 09:24:06 np0005541913.localdomain sudo[149673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:06 np0005541913.localdomain python3.9[149675]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:06 np0005541913.localdomain sudo[149673]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:06 np0005541913.localdomain sudo[149748]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vravkwewshwvukytojynxvveqshieebe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667445.824281-432-109213117566183/AnsiballZ_copy.py
Dec 02 09:24:06 np0005541913.localdomain sudo[149748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:06 np0005541913.localdomain python3.9[149750]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667445.824281-432-109213117566183/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:06 np0005541913.localdomain sudo[149748]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43691 DF PROTO=TCP SPT=42948 DPT=9102 SEQ=538564102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478FEC650000000001030307) 
Dec 02 09:24:07 np0005541913.localdomain sudo[149840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgdwcekdczjgyiocdlieretwqmrcmtzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667447.1232858-477-260009033598817/AnsiballZ_stat.py
Dec 02 09:24:07 np0005541913.localdomain sudo[149840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:07 np0005541913.localdomain python3.9[149842]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:07 np0005541913.localdomain sudo[149840]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:07 np0005541913.localdomain sudo[149915]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkogdopxeporwmswdabgepxaihwndeml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667447.1232858-477-260009033598817/AnsiballZ_copy.py
Dec 02 09:24:07 np0005541913.localdomain sudo[149915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:08 np0005541913.localdomain python3.9[149917]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667447.1232858-477-260009033598817/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:08 np0005541913.localdomain sudo[149915]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:09 np0005541913.localdomain sudo[150007]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsxsnrmdgfblbaluttfkawaonyfgpxmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667448.3171823-522-100340655239830/AnsiballZ_stat.py
Dec 02 09:24:09 np0005541913.localdomain sudo[150007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:09 np0005541913.localdomain python3.9[150009]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:09 np0005541913.localdomain sudo[150007]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:09 np0005541913.localdomain sudo[150082]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htylbiamlqjenogezubrsghhyzgplelq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667448.3171823-522-100340655239830/AnsiballZ_copy.py
Dec 02 09:24:09 np0005541913.localdomain sudo[150082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:09 np0005541913.localdomain python3.9[150084]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667448.3171823-522-100340655239830/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:09 np0005541913.localdomain sudo[150082]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:09 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27891 DF PROTO=TCP SPT=57682 DPT=9882 SEQ=1675163659 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A478FF7E40000000001030307) 
Dec 02 09:24:10 np0005541913.localdomain sudo[150174]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzramzrspikpvqyumfioarudmnsslbnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667449.932977-567-153691141035749/AnsiballZ_stat.py
Dec 02 09:24:10 np0005541913.localdomain sudo[150174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:11 np0005541913.localdomain python3.9[150176]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:11 np0005541913.localdomain sudo[150174]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:11 np0005541913.localdomain sudo[150249]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqwexbqaxjtjeazgyynpcxcnirgceovn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667449.932977-567-153691141035749/AnsiballZ_copy.py
Dec 02 09:24:11 np0005541913.localdomain sudo[150249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:11 np0005541913.localdomain python3.9[150251]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667449.932977-567-153691141035749/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:11 np0005541913.localdomain sudo[150249]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:12 np0005541913.localdomain sudo[150341]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpmqcvmaiuckrylghtajxrsijhbmcfzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667451.8318982-612-132805547251105/AnsiballZ_stat.py
Dec 02 09:24:12 np0005541913.localdomain sudo[150341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:12 np0005541913.localdomain python3.9[150343]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:12 np0005541913.localdomain sudo[150341]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11820 DF PROTO=TCP SPT=38638 DPT=9100 SEQ=3358605897 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479003E40000000001030307) 
Dec 02 09:24:13 np0005541913.localdomain sudo[150416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbwozrwfmofosnjrogxpllhpsvoafazi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667451.8318982-612-132805547251105/AnsiballZ_copy.py
Dec 02 09:24:13 np0005541913.localdomain sudo[150416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:13 np0005541913.localdomain python3.9[150418]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667451.8318982-612-132805547251105/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:13 np0005541913.localdomain sudo[150416]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:13 np0005541913.localdomain sudo[150508]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqzhbkkhhupeutcjbcmoojlwavpdpdsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667453.6746829-657-139465171600019/AnsiballZ_file.py
Dec 02 09:24:13 np0005541913.localdomain sudo[150508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:14 np0005541913.localdomain python3.9[150510]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:14 np0005541913.localdomain sudo[150508]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:14 np0005541913.localdomain sudo[150600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlxviupyyhmibbumokmzbjevhnatnygw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667454.2737992-681-175040453127333/AnsiballZ_command.py
Dec 02 09:24:14 np0005541913.localdomain sudo[150600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:14 np0005541913.localdomain python3.9[150602]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:14 np0005541913.localdomain sudo[150600]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:15 np0005541913.localdomain sudo[150695]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfzxrvmookoxyxogqfwgvjzsqsmfhndl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667454.929055-705-252822325693245/AnsiballZ_blockinfile.py
Dec 02 09:24:15 np0005541913.localdomain sudo[150695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:15 np0005541913.localdomain python3.9[150697]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:15 np0005541913.localdomain sudo[150695]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:15 np0005541913.localdomain sudo[150787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxfqxsrlycjgpyvvmhjxdnlijvkhjllu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667455.801773-732-53452859349270/AnsiballZ_command.py
Dec 02 09:24:15 np0005541913.localdomain sudo[150787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:16 np0005541913.localdomain python3.9[150789]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53873 DF PROTO=TCP SPT=55440 DPT=9100 SEQ=2759797587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479010240000000001030307) 
Dec 02 09:24:16 np0005541913.localdomain sudo[150787]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:16 np0005541913.localdomain sudo[150880]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxzgbyskuaeifttwlllxjleemqdpnvrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667456.4498875-756-200834159988130/AnsiballZ_stat.py
Dec 02 09:24:16 np0005541913.localdomain sudo[150880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:16 np0005541913.localdomain python3.9[150882]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:24:16 np0005541913.localdomain sudo[150880]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:17 np0005541913.localdomain sudo[150974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaxoctntupfdjcdhauaynawnswtfsgtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667457.0916748-780-66641989103468/AnsiballZ_command.py
Dec 02 09:24:17 np0005541913.localdomain sudo[150974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:17 np0005541913.localdomain python3.9[150976]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:17 np0005541913.localdomain sudo[150974]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:17 np0005541913.localdomain sudo[151069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqrzaufgurvlwmamsvlnjoredenkgbqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667457.6946862-804-187651009187656/AnsiballZ_file.py
Dec 02 09:24:17 np0005541913.localdomain sudo[151069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:18 np0005541913.localdomain python3.9[151071]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:18 np0005541913.localdomain sudo[151069]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43693 DF PROTO=TCP SPT=42948 DPT=9102 SEQ=538564102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47901BE50000000001030307) 
Dec 02 09:24:19 np0005541913.localdomain python3.9[151161]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:24:20 np0005541913.localdomain sudo[151252]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-miaadizvqonbnpdjogkkgsrztiysskrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667459.926815-924-17546676496198/AnsiballZ_command.py
Dec 02 09:24:20 np0005541913.localdomain sudo[151252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:20 np0005541913.localdomain python3.9[151254]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005541913.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:80:ac:27:10" external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:20 np0005541913.localdomain ovs-vsctl[151255]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005541913.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:80:ac:27:10 external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 02 09:24:20 np0005541913.localdomain sudo[151252]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:21 np0005541913.localdomain sudo[151345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzzsutrqgfyvaargcdejfessdwjpmagv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667460.630393-951-215655126657155/AnsiballZ_command.py
Dec 02 09:24:21 np0005541913.localdomain sudo[151345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:21 np0005541913.localdomain python3.9[151347]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:22 np0005541913.localdomain sudo[151345]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49273 DF PROTO=TCP SPT=32970 DPT=9101 SEQ=228231276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479027D00000000001030307) 
Dec 02 09:24:22 np0005541913.localdomain python3.9[151440]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:24:24 np0005541913.localdomain sudo[151532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxiyjepnzaxswtojyjndqvheipwdkwog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667463.8271744-1005-123336829605858/AnsiballZ_file.py
Dec 02 09:24:24 np0005541913.localdomain sudo[151532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:24 np0005541913.localdomain python3.9[151534]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:24:24 np0005541913.localdomain sudo[151532]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:24 np0005541913.localdomain sudo[151624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgfjfkrpckzybdkasdhyykqjcsilzzlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667464.4331546-1029-225509203060572/AnsiballZ_stat.py
Dec 02 09:24:24 np0005541913.localdomain sudo[151624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:24 np0005541913.localdomain python3.9[151626]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:24 np0005541913.localdomain sudo[151624]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:25 np0005541913.localdomain sudo[151672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moycxpfbrtpevquicgpgsfofujbburgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667464.4331546-1029-225509203060572/AnsiballZ_file.py
Dec 02 09:24:25 np0005541913.localdomain sudo[151672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49275 DF PROTO=TCP SPT=32970 DPT=9101 SEQ=228231276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479033E40000000001030307) 
Dec 02 09:24:25 np0005541913.localdomain python3.9[151674]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:24:25 np0005541913.localdomain sudo[151672]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:25 np0005541913.localdomain sudo[151764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-looadviueoxqbszpfsuzarctoehiawfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667465.494678-1029-62637872197780/AnsiballZ_stat.py
Dec 02 09:24:25 np0005541913.localdomain sudo[151764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:26 np0005541913.localdomain python3.9[151766]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:26 np0005541913.localdomain sudo[151764]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:26 np0005541913.localdomain sudo[151812]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujjjdhlplnevqfuipibeyfibywenntph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667465.494678-1029-62637872197780/AnsiballZ_file.py
Dec 02 09:24:26 np0005541913.localdomain sudo[151812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:26 np0005541913.localdomain python3.9[151814]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:24:26 np0005541913.localdomain sudo[151812]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:26 np0005541913.localdomain sudo[151904]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnrqxjaomovjwcsxzvypotgrozktobdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667466.6310422-1098-178298051358931/AnsiballZ_file.py
Dec 02 09:24:26 np0005541913.localdomain sudo[151904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:27 np0005541913.localdomain python3.9[151906]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:27 np0005541913.localdomain sudo[151904]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:27 np0005541913.localdomain sudo[151996]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rltksxszckvahczgjwhrnomjpoxrvmri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667467.2844377-1122-218238094257133/AnsiballZ_stat.py
Dec 02 09:24:27 np0005541913.localdomain sudo[151996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:27 np0005541913.localdomain python3.9[151998]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:27 np0005541913.localdomain sudo[151996]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:27 np0005541913.localdomain sudo[152044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isgvemzuzgnmjquujjpqoxdyanuzfqqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667467.2844377-1122-218238094257133/AnsiballZ_file.py
Dec 02 09:24:27 np0005541913.localdomain sudo[152044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:28 np0005541913.localdomain python3.9[152046]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:28 np0005541913.localdomain sudo[152044]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:28 np0005541913.localdomain sudo[152136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frdfndspgjptrfhengwnckgxtdeoulaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667468.3459501-1158-686540560437/AnsiballZ_stat.py
Dec 02 09:24:28 np0005541913.localdomain sudo[152136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:28 np0005541913.localdomain python3.9[152138]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:28 np0005541913.localdomain sudo[152136]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:29 np0005541913.localdomain sudo[152184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bucuqghnusdfgbcmntjxqaidfspqrino ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667468.3459501-1158-686540560437/AnsiballZ_file.py
Dec 02 09:24:29 np0005541913.localdomain sudo[152184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:29 np0005541913.localdomain python3.9[152186]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:29 np0005541913.localdomain sudo[152184]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49276 DF PROTO=TCP SPT=32970 DPT=9101 SEQ=228231276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479043A40000000001030307) 
Dec 02 09:24:29 np0005541913.localdomain sudo[152276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-expfuasrebkhawwmroanidaymxamnenv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667469.450159-1194-96540704817771/AnsiballZ_systemd.py
Dec 02 09:24:29 np0005541913.localdomain sudo[152276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:29 np0005541913.localdomain python3.9[152278]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:24:29 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:24:30 np0005541913.localdomain systemd-rc-local-generator[152299]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:24:30 np0005541913.localdomain systemd-sysv-generator[152302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:24:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:24:30 np0005541913.localdomain sudo[152276]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:31 np0005541913.localdomain sudo[152405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvwzwcjnluejvamdfrgvuzvywdvivlme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667471.0715914-1218-113550663328880/AnsiballZ_stat.py
Dec 02 09:24:31 np0005541913.localdomain sudo[152405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:31 np0005541913.localdomain python3.9[152407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:31 np0005541913.localdomain sudo[152405]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:32 np0005541913.localdomain sudo[152453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhidbwnrbuibluxnnzrtgqyeomxwzlvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667471.0715914-1218-113550663328880/AnsiballZ_file.py
Dec 02 09:24:32 np0005541913.localdomain sudo[152453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:32 np0005541913.localdomain python3.9[152455]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:32 np0005541913.localdomain sudo[152453]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:33 np0005541913.localdomain sudo[152545]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmcqaammabpexonywdjwrbubzqteqyxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667472.8643947-1254-246783530854147/AnsiballZ_stat.py
Dec 02 09:24:33 np0005541913.localdomain sudo[152545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:33 np0005541913.localdomain python3.9[152547]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:33 np0005541913.localdomain sudo[152545]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:33 np0005541913.localdomain sudo[152593]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojszujlkfvpbevchmlpofzfnoyvbyfuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667472.8643947-1254-246783530854147/AnsiballZ_file.py
Dec 02 09:24:33 np0005541913.localdomain sudo[152593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:33 np0005541913.localdomain python3.9[152595]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:33 np0005541913.localdomain sudo[152593]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28061 DF PROTO=TCP SPT=42486 DPT=9102 SEQ=1993625355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790558D0000000001030307) 
Dec 02 09:24:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5165 DF PROTO=TCP SPT=45900 DPT=9105 SEQ=587892334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790560F0000000001030307) 
Dec 02 09:24:34 np0005541913.localdomain sudo[152685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggtjbltrpnxjoflqkyneewodsvqrosyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667474.3244832-1290-127160573268198/AnsiballZ_systemd.py
Dec 02 09:24:34 np0005541913.localdomain sudo[152685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:34 np0005541913.localdomain python3.9[152687]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:24:34 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:24:34 np0005541913.localdomain systemd-sysv-generator[152713]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:24:34 np0005541913.localdomain systemd-rc-local-generator[152709]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:24:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:24:35 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 09:24:35 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 09:24:35 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 09:24:35 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 09:24:35 np0005541913.localdomain sudo[152685]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:35 np0005541913.localdomain sudo[152818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqhyrwcajpxqejuwtiytisjgeicndjax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667475.5949183-1320-213245658873837/AnsiballZ_file.py
Dec 02 09:24:35 np0005541913.localdomain sudo[152818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:36 np0005541913.localdomain python3.9[152820]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:24:36 np0005541913.localdomain sudo[152818]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:36 np0005541913.localdomain sudo[152910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuzvyxlfdqvevdjdvwyeskqzjuufzrtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667476.268329-1344-229882260860214/AnsiballZ_stat.py
Dec 02 09:24:36 np0005541913.localdomain sudo[152910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:36 np0005541913.localdomain python3.9[152912]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:36 np0005541913.localdomain sudo[152910]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28063 DF PROTO=TCP SPT=42486 DPT=9102 SEQ=1993625355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479061A40000000001030307) 
Dec 02 09:24:37 np0005541913.localdomain sudo[152983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqpdlazwbctczrgddgagetozaqlupzza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667476.268329-1344-229882260860214/AnsiballZ_copy.py
Dec 02 09:24:37 np0005541913.localdomain sudo[152983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:37 np0005541913.localdomain python3.9[152985]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667476.268329-1344-229882260860214/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:24:37 np0005541913.localdomain sudo[152983]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:38 np0005541913.localdomain sudo[153075]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgverrusrldcoxhmkohldwzgouxegspm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667477.886071-1395-128875259903336/AnsiballZ_file.py
Dec 02 09:24:38 np0005541913.localdomain sudo[153075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:38 np0005541913.localdomain python3.9[153077]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:24:38 np0005541913.localdomain sudo[153075]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:38 np0005541913.localdomain sudo[153167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avsptkgetqmitlvclchipoynrdoqkrvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667478.5515296-1419-120214077552658/AnsiballZ_stat.py
Dec 02 09:24:38 np0005541913.localdomain sudo[153167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:38 np0005541913.localdomain python3.9[153169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:24:38 np0005541913.localdomain sudo[153167]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:39 np0005541913.localdomain sudo[153242]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahskvsqjcvwebgqbmlcclsmvbbpksaje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667478.5515296-1419-120214077552658/AnsiballZ_copy.py
Dec 02 09:24:39 np0005541913.localdomain sudo[153242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:39 np0005541913.localdomain python3.9[153244]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667478.5515296-1419-120214077552658/.source.json _original_basename=.9j20tbq7 follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:39 np0005541913.localdomain sudo[153242]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:39 np0005541913.localdomain sudo[153334]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uryddinfzlrwdxeifegugjmzluxkjjaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667479.7151902-1464-37564531734276/AnsiballZ_file.py
Dec 02 09:24:39 np0005541913.localdomain sudo[153334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47126 DF PROTO=TCP SPT=59134 DPT=9100 SEQ=540302236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47906DA40000000001030307) 
Dec 02 09:24:40 np0005541913.localdomain python3.9[153336]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:40 np0005541913.localdomain sudo[153334]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:40 np0005541913.localdomain sudo[153426]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axddwmrrnhfvaonauhcwdtpzugbvpqiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667480.4708145-1488-120276658808/AnsiballZ_stat.py
Dec 02 09:24:40 np0005541913.localdomain sudo[153426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:40 np0005541913.localdomain sudo[153426]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:41 np0005541913.localdomain sudo[153499]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gumuxarwicwybjsmqfvojsygselxvoke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667480.4708145-1488-120276658808/AnsiballZ_copy.py
Dec 02 09:24:41 np0005541913.localdomain sudo[153499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:41 np0005541913.localdomain sudo[153499]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:42 np0005541913.localdomain sudo[153591]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omtnxhkvyqvzfkltikhuukzrypznaxlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667481.7346647-1539-12590929944094/AnsiballZ_container_config_data.py
Dec 02 09:24:42 np0005541913.localdomain sudo[153591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:42 np0005541913.localdomain python3.9[153593]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 02 09:24:42 np0005541913.localdomain sudo[153591]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:42 np0005541913.localdomain sudo[153683]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yopjdaqvvbfpyzqqbkkbomjyviwkoobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667482.5518537-1566-139844994289061/AnsiballZ_container_config_hash.py
Dec 02 09:24:42 np0005541913.localdomain sudo[153683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:43 np0005541913.localdomain python3.9[153685]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:24:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13043 DF PROTO=TCP SPT=60766 DPT=9882 SEQ=722403974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479079A40000000001030307) 
Dec 02 09:24:43 np0005541913.localdomain sudo[153683]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:43 np0005541913.localdomain sudo[153775]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzitzowechdpcmmidigzxhkzvepplxll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667483.427336-1593-64376983744739/AnsiballZ_podman_container_info.py
Dec 02 09:24:43 np0005541913.localdomain sudo[153775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:44 np0005541913.localdomain python3.9[153777]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 09:24:44 np0005541913.localdomain sudo[153775]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47128 DF PROTO=TCP SPT=59134 DPT=9100 SEQ=540302236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479085640000000001030307) 
Dec 02 09:24:47 np0005541913.localdomain sudo[153894]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iujpneequphdtmfmtlvdinqbwgpxxfxf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764667487.403857-1632-129741290250829/AnsiballZ_edpm_container_manage.py
Dec 02 09:24:47 np0005541913.localdomain sudo[153894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:48 np0005541913.localdomain python3[153896]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:24:48 np0005541913.localdomain python3[153896]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c",
                                                                    "Digest": "sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:38:47.246477714Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 345722821,
                                                                    "VirtualSize": 345722821,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:ba9362d2aeb297e34b0679b2fc8168350c70a5b0ec414daf293bf2bc013e9088",
                                                                              "sha256:aae3b8a85314314b9db80a043fdf3f3b1d0b69927faca0303c73969a23dddd0f"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:22.759131427Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:13:25.258260855Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:13:28.025145079Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:13.535675197Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:47.244104142Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:48.759416475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 02 09:24:48 np0005541913.localdomain podman[153947]: 2025-12-02 09:24:48.486539178 +0000 UTC m=+0.061695866 container remove e7eecdc150df71e60e827f323f5c39356b385a1381d6f79abb35c9f2026b626b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, container_name=ovn_controller, 
name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 02 09:24:48 np0005541913.localdomain python3[153896]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Dec 02 09:24:48 np0005541913.localdomain podman[153961]: 
Dec 02 09:24:48 np0005541913.localdomain podman[153961]: 2025-12-02 09:24:48.550597866 +0000 UTC m=+0.050724052 container create cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:24:48 np0005541913.localdomain podman[153961]: 2025-12-02 09:24:48.526322435 +0000 UTC m=+0.026448651 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 02 09:24:48 np0005541913.localdomain python3[153896]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 02 09:24:48 np0005541913.localdomain sudo[153894]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:49 np0005541913.localdomain sudo[154089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzcynouyysykvytablsofjrncuuajkdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667488.9836493-1656-274093172960197/AnsiballZ_stat.py
Dec 02 09:24:49 np0005541913.localdomain sudo[154089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5169 DF PROTO=TCP SPT=45900 DPT=9105 SEQ=587892334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479091E40000000001030307) 
Dec 02 09:24:49 np0005541913.localdomain python3.9[154091]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:24:49 np0005541913.localdomain sudo[154089]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:49 np0005541913.localdomain sudo[154183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqmnyrlhfeiykvmwkeyyiumvtkkxjxlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667489.658328-1683-143455372405504/AnsiballZ_file.py
Dec 02 09:24:49 np0005541913.localdomain sudo[154183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:50 np0005541913.localdomain python3.9[154185]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:50 np0005541913.localdomain sudo[154183]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:50 np0005541913.localdomain sudo[154229]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwgvzagjglztnrnhatrenmhuupfydoyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667489.658328-1683-143455372405504/AnsiballZ_stat.py
Dec 02 09:24:50 np0005541913.localdomain sudo[154229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:50 np0005541913.localdomain python3.9[154231]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:24:50 np0005541913.localdomain sudo[154229]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:51 np0005541913.localdomain sudo[154320]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxkktzbdziznxpxylaogsauedwquhlij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667490.5944312-1683-175000602378591/AnsiballZ_copy.py
Dec 02 09:24:51 np0005541913.localdomain sudo[154320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:51 np0005541913.localdomain python3.9[154322]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764667490.5944312-1683-175000602378591/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:24:51 np0005541913.localdomain sudo[154320]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:51 np0005541913.localdomain sudo[154366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvbnuljfltftnkyckmucicbgieofcriw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667490.5944312-1683-175000602378591/AnsiballZ_systemd.py
Dec 02 09:24:51 np0005541913.localdomain sudo[154366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:51 np0005541913.localdomain python3.9[154368]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:24:51 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:24:51 np0005541913.localdomain systemd-rc-local-generator[154390]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:24:51 np0005541913.localdomain systemd-sysv-generator[154393]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:24:51 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:24:52 np0005541913.localdomain sudo[154366]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54235 DF PROTO=TCP SPT=39826 DPT=9101 SEQ=595113565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47909D000000000001030307) 
Dec 02 09:24:52 np0005541913.localdomain sudo[154448]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmmwoeqmbtqpkmdsyrvpfqtbfbatqzwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667490.5944312-1683-175000602378591/AnsiballZ_systemd.py
Dec 02 09:24:52 np0005541913.localdomain sudo[154448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:52 np0005541913.localdomain python3.9[154450]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:24:52 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:24:52 np0005541913.localdomain systemd-sysv-generator[154480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:24:52 np0005541913.localdomain systemd-rc-local-generator[154475]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:24:52 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Starting ovn_controller container...
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:24:53 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34407dfb17e4d44a7094dfc01c3723ad0f2347db77e802073af38c7ff4fca0cd/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:24:53 np0005541913.localdomain podman[154491]: 2025-12-02 09:24:53.22714928 +0000 UTC m=+0.159252883 container init cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + sudo -E kolla_set_configs
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:24:53 np0005541913.localdomain podman[154491]: 2025-12-02 09:24:53.271769757 +0000 UTC m=+0.203873310 container start cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:24:53 np0005541913.localdomain edpm-start-podman-container[154491]: ovn_controller
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 02 09:24:53 np0005541913.localdomain edpm-start-podman-container[154490]: Creating additional drop-in dependency for "ovn_controller" (cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782)
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:24:53 np0005541913.localdomain podman[154512]: 2025-12-02 09:24:53.415149692 +0000 UTC m=+0.140036087 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:24:53 np0005541913.localdomain podman[154512]: 2025-12-02 09:24:53.432546339 +0000 UTC m=+0.157432754 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:24:53 np0005541913.localdomain podman[154512]: unhealthy
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Queued start job for default target Main User Target.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Created slice User Application Slice.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Reached target Paths.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Reached target Timers.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Starting D-Bus User Message Bus Socket...
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Starting Create User's Volatile Files and Directories...
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Finished Create User's Volatile Files and Directories.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Listening on D-Bus User Message Bus Socket.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Reached target Sockets.
Dec 02 09:24:53 np0005541913.localdomain systemd-rc-local-generator[154594]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Reached target Basic System.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Reached target Main User Target.
Dec 02 09:24:53 np0005541913.localdomain systemd[154535]: Startup finished in 113ms.
Dec 02 09:24:53 np0005541913.localdomain systemd-sysv-generator[154598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: tmp-crun.cddUDf.mount: Deactivated successfully.
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Started User Manager for UID 0.
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Started ovn_controller container.
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Failed with result 'exit-code'.
Dec 02 09:24:53 np0005541913.localdomain systemd-journald[47611]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation.
Dec 02 09:24:53 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 09:24:53 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:24:53 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:24:53 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Started Session c12 of User root.
Dec 02 09:24:53 np0005541913.localdomain sudo[154448]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: INFO:__main__:Validating config file
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: INFO:__main__:Writing out command to execute
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: ++ cat /run_command
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + ARGS=
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + sudo kolla_copy_cacerts
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: Started Session c13 of User root.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + [[ ! -n '' ]]
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + . kolla_extend_start
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + umask 0022
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Dec 02 09:24:53 np0005541913.localdomain systemd[1]: session-c13.scope: Deactivated successfully.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00013|main|INFO|OVS feature set changed, force recompute.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00018|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-be95dc-0
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-2587fe-0
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00025|ovn_bfd|INFO|Disabled BFD on interface ovn-4d166c-0
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00026|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00027|binding|INFO|Claiming lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a for this chassis.
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00028|binding|INFO|4a318f6a-b3c1-4690-8246-f7d046ccd64a: Claiming fa:16:3e:26:b2:03 192.168.0.102
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00029|binding|INFO|Removing lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a ovn-installed in OVS
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-be95dc-0
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-2587fe-0
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-4d166c-0
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00033|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00034|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00035|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00036|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:53Z|00037|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:54Z|00038|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:54 np0005541913.localdomain sudo[154706]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqdxzkaezzkdlnupgrdwgtlgxqvlghwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667494.0910702-1767-22582121594810/AnsiballZ_command.py
Dec 02 09:24:54 np0005541913.localdomain sudo[154706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:54 np0005541913.localdomain python3.9[154708]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:54 np0005541913.localdomain ovs-vsctl[154709]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 02 09:24:54 np0005541913.localdomain sudo[154706]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:54Z|00039|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:55 np0005541913.localdomain sudo[154799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyhywxjgueoubxdhfitqrvvvkbhrtznn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667494.7794013-1791-37698591716100/AnsiballZ_command.py
Dec 02 09:24:55 np0005541913.localdomain sudo[154799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:55Z|00040|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:55 np0005541913.localdomain python3.9[154801]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:55 np0005541913.localdomain ovs-vsctl[154803]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 02 09:24:55 np0005541913.localdomain sudo[154799]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54237 DF PROTO=TCP SPT=39826 DPT=9101 SEQ=595113565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790A9250000000001030307) 
Dec 02 09:24:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:24:55Z|00041|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:24:56 np0005541913.localdomain sudo[154894]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiwdfpzvwatoarziqezooxymkgrhsati ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667495.8002207-1833-102219522376413/AnsiballZ_command.py
Dec 02 09:24:56 np0005541913.localdomain sudo[154894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:24:56 np0005541913.localdomain python3.9[154896]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:24:57 np0005541913.localdomain ovs-vsctl[154897]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 02 09:24:57 np0005541913.localdomain sudo[154894]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:57 np0005541913.localdomain sshd[148012]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:24:57 np0005541913.localdomain systemd[1]: session-50.scope: Deactivated successfully.
Dec 02 09:24:57 np0005541913.localdomain systemd[1]: session-50.scope: Consumed 40.687s CPU time.
Dec 02 09:24:57 np0005541913.localdomain systemd-logind[757]: Session 50 logged out. Waiting for processes to exit.
Dec 02 09:24:57 np0005541913.localdomain systemd-logind[757]: Removed session 50.
Dec 02 09:24:58 np0005541913.localdomain sudo[154912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:24:58 np0005541913.localdomain sudo[154912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:24:58 np0005541913.localdomain sudo[154912]: pam_unix(sudo:session): session closed for user root
Dec 02 09:24:58 np0005541913.localdomain sudo[154927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:24:58 np0005541913.localdomain sudo[154927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:24:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54238 DF PROTO=TCP SPT=39826 DPT=9101 SEQ=595113565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790B8E40000000001030307) 
Dec 02 09:24:59 np0005541913.localdomain sudo[154927]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:00 np0005541913.localdomain sudo[154976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:25:00 np0005541913.localdomain sudo[154976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:25:00 np0005541913.localdomain sudo[154976]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:25:01Z|00042|binding|INFO|Setting lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a ovn-installed in OVS
Dec 02 09:25:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:25:01Z|00043|binding|INFO|Setting lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a up in Southbound
Dec 02 09:25:03 np0005541913.localdomain sshd[154991]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:25:03 np0005541913.localdomain sshd[154991]: Accepted publickey for zuul from 192.168.122.30 port 50740 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:25:03 np0005541913.localdomain systemd-logind[757]: New session 52 of user zuul.
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: Started Session 52 of User zuul.
Dec 02 09:25:03 np0005541913.localdomain sshd[154991]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Activating special unit Exit the Session...
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Stopped target Main User Target.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Stopped target Basic System.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Stopped target Paths.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Stopped target Sockets.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Stopped target Timers.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Closed D-Bus User Message Bus Socket.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Stopped Create User's Volatile Files and Directories.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Removed slice User Application Slice.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Reached target Shutdown.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Finished Exit the Session.
Dec 02 09:25:03 np0005541913.localdomain systemd[154535]: Reached target Exit the Session.
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 02 09:25:03 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 02 09:25:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12676 DF PROTO=TCP SPT=48914 DPT=9102 SEQ=1810178562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790CABE0000000001030307) 
Dec 02 09:25:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38771 DF PROTO=TCP SPT=45276 DPT=9105 SEQ=683573260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790CB3E0000000001030307) 
Dec 02 09:25:04 np0005541913.localdomain python3.9[155086]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:25:05 np0005541913.localdomain sudo[155180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngfqblyjixdzqjletpghckmkgcdzikrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667505.1105115-63-167629979499566/AnsiballZ_file.py
Dec 02 09:25:05 np0005541913.localdomain sudo[155180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:05 np0005541913.localdomain python3.9[155182]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:05 np0005541913.localdomain sudo[155180]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:06 np0005541913.localdomain sudo[155272]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzqkzxshkvoaldueeyyfrxzamluxhuxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667505.846153-63-206885677273615/AnsiballZ_file.py
Dec 02 09:25:06 np0005541913.localdomain sudo[155272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:06 np0005541913.localdomain python3.9[155274]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:06 np0005541913.localdomain sudo[155272]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:06 np0005541913.localdomain sudo[155364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqovsxebvkftlvzatiancqsfdbuhovyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667506.4271834-63-262788416048035/AnsiballZ_file.py
Dec 02 09:25:06 np0005541913.localdomain sudo[155364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:06 np0005541913.localdomain python3.9[155366]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:06 np0005541913.localdomain sudo[155364]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12678 DF PROTO=TCP SPT=48914 DPT=9102 SEQ=1810178562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790D6E50000000001030307) 
Dec 02 09:25:10 np0005541913.localdomain sudo[155456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaonzzozulwxaotdptwimdrienkifnzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667508.7846859-63-68142598047510/AnsiballZ_file.py
Dec 02 09:25:10 np0005541913.localdomain sudo[155456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42918 DF PROTO=TCP SPT=34084 DPT=9100 SEQ=4267583192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790E2A40000000001030307) 
Dec 02 09:25:10 np0005541913.localdomain python3.9[155458]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:10 np0005541913.localdomain sudo[155456]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:10 np0005541913.localdomain sudo[155548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqqsrszvfrwxcgyzoifuzzrbsirszyfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667510.3578124-63-200274023857098/AnsiballZ_file.py
Dec 02 09:25:10 np0005541913.localdomain sudo[155548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:10 np0005541913.localdomain python3.9[155550]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:10 np0005541913.localdomain sudo[155548]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:11 np0005541913.localdomain python3.9[155641]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:25:12 np0005541913.localdomain sudo[155732]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehsqibpbrqkheizosfbtinmmyueqiznq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667511.7068295-195-192586077011349/AnsiballZ_seboolean.py
Dec 02 09:25:12 np0005541913.localdomain sudo[155732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:12 np0005541913.localdomain python3.9[155734]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 02 09:25:12 np0005541913.localdomain sudo[155732]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5607 DF PROTO=TCP SPT=33202 DPT=9882 SEQ=1109634363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790EEE40000000001030307) 
Dec 02 09:25:13 np0005541913.localdomain python3.9[155824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:13 np0005541913.localdomain python3.9[155897]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667512.672773-219-149392302382990/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:14 np0005541913.localdomain python3.9[155987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:15 np0005541913.localdomain python3.9[156060]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667514.000656-264-193791946797672/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:15 np0005541913.localdomain sudo[156150]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qceyagdybahgmkzzakifvfajlciostnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667515.4778283-315-185884426641209/AnsiballZ_setup.py
Dec 02 09:25:15 np0005541913.localdomain sudo[156150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:16 np0005541913.localdomain python3.9[156152]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:25:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42920 DF PROTO=TCP SPT=34084 DPT=9100 SEQ=4267583192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4790FA650000000001030307) 
Dec 02 09:25:16 np0005541913.localdomain sudo[156150]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:16 np0005541913.localdomain sudo[156204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dizxfzfuvipmrcfepwoabifnwrfetseb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667515.4778283-315-185884426641209/AnsiballZ_dnf.py
Dec 02 09:25:16 np0005541913.localdomain sudo[156204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:16 np0005541913.localdomain python3.9[156206]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:25:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12680 DF PROTO=TCP SPT=48914 DPT=9102 SEQ=1810178562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479107E40000000001030307) 
Dec 02 09:25:20 np0005541913.localdomain sudo[156204]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:21 np0005541913.localdomain sudo[156298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-feorsifhwnrrdhsovdhbqftfersvxkxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667520.4579349-351-185617376074615/AnsiballZ_systemd.py
Dec 02 09:25:21 np0005541913.localdomain sudo[156298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:21 np0005541913.localdomain python3.9[156300]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:25:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19090 DF PROTO=TCP SPT=42826 DPT=9101 SEQ=3732848012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479112300000000001030307) 
Dec 02 09:25:22 np0005541913.localdomain sudo[156298]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:23 np0005541913.localdomain python3.9[156393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:23 np0005541913.localdomain python3.9[156464]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667522.6247692-375-266210652669539/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:24 np0005541913.localdomain python3.9[156554]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:25:24 np0005541913.localdomain podman[156623]: 2025-12-02 09:25:24.455173358 +0000 UTC m=+0.091788707 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:25:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:25:24Z|00044|memory|INFO|18748 kB peak resident set size after 30.7 seconds
Dec 02 09:25:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:25:24Z|00045|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:289 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:154 ofctrl_installed_flow_usage-KB:111 ofctrl_sb_flow_ref_usage-KB:67
Dec 02 09:25:24 np0005541913.localdomain podman[156623]: 2025-12-02 09:25:24.495115584 +0000 UTC m=+0.131730953 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:25:24 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:25:24 np0005541913.localdomain python3.9[156626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667523.6740825-375-101572207563676/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19092 DF PROTO=TCP SPT=42826 DPT=9101 SEQ=3732848012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47911E250000000001030307) 
Dec 02 09:25:26 np0005541913.localdomain python3.9[156740]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:26 np0005541913.localdomain python3.9[156811]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667525.7406945-507-85302101226911/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:27 np0005541913.localdomain python3.9[156901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:28 np0005541913.localdomain python3.9[156972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667526.8203483-507-214270713255369/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:29 np0005541913.localdomain python3.9[157062]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:25:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19093 DF PROTO=TCP SPT=42826 DPT=9101 SEQ=3732848012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47912DE50000000001030307) 
Dec 02 09:25:30 np0005541913.localdomain sudo[157154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhjwnrptuzcekmvcombgptlleplbymhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667529.78857-621-45338831587180/AnsiballZ_file.py
Dec 02 09:25:30 np0005541913.localdomain sudo[157154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:30 np0005541913.localdomain python3.9[157156]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:30 np0005541913.localdomain sudo[157154]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:30 np0005541913.localdomain sudo[157246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-criplnyiilnulsjcnhxzzlebmemprblf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667530.4448113-645-142033001892889/AnsiballZ_stat.py
Dec 02 09:25:30 np0005541913.localdomain sudo[157246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:30 np0005541913.localdomain python3.9[157248]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:30 np0005541913.localdomain sudo[157246]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:31 np0005541913.localdomain sudo[157294]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgqgnbouzjiwcrjgztbalowycztjpvsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667530.4448113-645-142033001892889/AnsiballZ_file.py
Dec 02 09:25:31 np0005541913.localdomain sudo[157294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:31 np0005541913.localdomain python3.9[157296]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:31 np0005541913.localdomain sudo[157294]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:31 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:25:31Z|00046|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Dec 02 09:25:32 np0005541913.localdomain sudo[157386]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guaznjrjaqfxbyhsgomijxngtgumbqjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667531.7655604-645-73557158241147/AnsiballZ_stat.py
Dec 02 09:25:32 np0005541913.localdomain sudo[157386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:32 np0005541913.localdomain python3.9[157388]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:32 np0005541913.localdomain sudo[157386]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:32 np0005541913.localdomain sudo[157434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxoyooovzagvdbwcgjpfpknhsmpiildn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667531.7655604-645-73557158241147/AnsiballZ_file.py
Dec 02 09:25:32 np0005541913.localdomain sudo[157434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:32 np0005541913.localdomain python3.9[157436]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:32 np0005541913.localdomain sudo[157434]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:33 np0005541913.localdomain sudo[157526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxnjmrvfxyhempklxwxctkundezpmjgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667532.8092594-714-80682825646361/AnsiballZ_file.py
Dec 02 09:25:33 np0005541913.localdomain sudo[157526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:33 np0005541913.localdomain python3.9[157528]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:33 np0005541913.localdomain sudo[157526]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:33 np0005541913.localdomain sudo[157618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uskjodyrimeqzucnemxldzlosjbrswlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667533.515783-738-198106666828817/AnsiballZ_stat.py
Dec 02 09:25:33 np0005541913.localdomain sudo[157618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24345 DF PROTO=TCP SPT=46936 DPT=9102 SEQ=4063656171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47913FED0000000001030307) 
Dec 02 09:25:34 np0005541913.localdomain python3.9[157620]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:34 np0005541913.localdomain sudo[157618]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54783 DF PROTO=TCP SPT=42942 DPT=9105 SEQ=1689485826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791406E0000000001030307) 
Dec 02 09:25:34 np0005541913.localdomain sudo[157666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peobzdhlwdjfqpxgacswnlhebuysnvwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667533.515783-738-198106666828817/AnsiballZ_file.py
Dec 02 09:25:34 np0005541913.localdomain sudo[157666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:34 np0005541913.localdomain python3.9[157668]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:34 np0005541913.localdomain sudo[157666]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:34 np0005541913.localdomain sudo[157758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edvvfctmkjnwctodmtnqikkjhzpohfqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667534.619712-774-1838578985620/AnsiballZ_stat.py
Dec 02 09:25:34 np0005541913.localdomain sudo[157758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:35 np0005541913.localdomain python3.9[157760]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:35 np0005541913.localdomain sudo[157758]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:35 np0005541913.localdomain sudo[157806]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbmajlboqqcltpizyjpptbztikfbwgtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667534.619712-774-1838578985620/AnsiballZ_file.py
Dec 02 09:25:35 np0005541913.localdomain sudo[157806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:35 np0005541913.localdomain python3.9[157808]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:35 np0005541913.localdomain sudo[157806]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:35 np0005541913.localdomain sudo[157898]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boucnfykocltwfrxpjfjdltrolwfktrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667535.6964676-810-178725832136046/AnsiballZ_systemd.py
Dec 02 09:25:35 np0005541913.localdomain sudo[157898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:36 np0005541913.localdomain python3.9[157900]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:25:36 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:25:36 np0005541913.localdomain systemd-rc-local-generator[157925]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:25:36 np0005541913.localdomain systemd-sysv-generator[157930]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:25:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:25:36 np0005541913.localdomain sudo[157898]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24347 DF PROTO=TCP SPT=46936 DPT=9102 SEQ=4063656171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47914BE40000000001030307) 
Dec 02 09:25:37 np0005541913.localdomain sudo[158028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhzcdtfrydijqlpevdscvgfphkepqscn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667536.7778182-834-153886633119257/AnsiballZ_stat.py
Dec 02 09:25:37 np0005541913.localdomain sudo[158028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:37 np0005541913.localdomain python3.9[158030]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:37 np0005541913.localdomain sudo[158028]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:37 np0005541913.localdomain sudo[158076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnxkpzwhrbwmlrrhskwxhnenlcyicurs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667536.7778182-834-153886633119257/AnsiballZ_file.py
Dec 02 09:25:37 np0005541913.localdomain sudo[158076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:38 np0005541913.localdomain python3.9[158078]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:38 np0005541913.localdomain sudo[158076]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:38 np0005541913.localdomain sudo[158168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzbefpnilpwkuwbabrhomphdmlaydumm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667538.4471166-870-94770379321755/AnsiballZ_stat.py
Dec 02 09:25:38 np0005541913.localdomain sudo[158168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:38 np0005541913.localdomain python3.9[158170]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:38 np0005541913.localdomain sudo[158168]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:39 np0005541913.localdomain sudo[158216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iacfaoobrhrgufkzqgdnwyzkbxeqyjoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667538.4471166-870-94770379321755/AnsiballZ_file.py
Dec 02 09:25:39 np0005541913.localdomain sudo[158216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:39 np0005541913.localdomain python3.9[158218]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:39 np0005541913.localdomain sudo[158216]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61653 DF PROTO=TCP SPT=55884 DPT=9100 SEQ=1331993231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479157E50000000001030307) 
Dec 02 09:25:40 np0005541913.localdomain sudo[158308]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unamawbktbxlbmehjeuqnitpopylxhrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667539.9382548-906-238059487817339/AnsiballZ_systemd.py
Dec 02 09:25:40 np0005541913.localdomain sudo[158308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:40 np0005541913.localdomain python3.9[158310]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:25:40 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:25:40 np0005541913.localdomain systemd-rc-local-generator[158337]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:25:40 np0005541913.localdomain systemd-sysv-generator[158341]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:25:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:25:40 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 09:25:40 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 09:25:40 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 09:25:40 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 09:25:40 np0005541913.localdomain sudo[158308]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:42 np0005541913.localdomain sudo[158443]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uemnrqulxuxpzlghlohmyzqpohenhwdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667542.1649578-936-232925768646295/AnsiballZ_file.py
Dec 02 09:25:42 np0005541913.localdomain sudo[158443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:42 np0005541913.localdomain python3.9[158445]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:42 np0005541913.localdomain sudo[158443]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:43 np0005541913.localdomain sudo[158535]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azoejougdojsvbkiqifipouazfainljf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667542.8706791-960-222331736408878/AnsiballZ_stat.py
Dec 02 09:25:43 np0005541913.localdomain sudo[158535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28816 DF PROTO=TCP SPT=58718 DPT=9882 SEQ=1619514852 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479163E40000000001030307) 
Dec 02 09:25:43 np0005541913.localdomain python3.9[158537]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:43 np0005541913.localdomain sudo[158535]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:43 np0005541913.localdomain sudo[158608]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbmkfmvyunvvvszbloyntbziiojgzoms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667542.8706791-960-222331736408878/AnsiballZ_copy.py
Dec 02 09:25:43 np0005541913.localdomain sudo[158608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:43 np0005541913.localdomain python3.9[158610]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667542.8706791-960-222331736408878/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:43 np0005541913.localdomain sudo[158608]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:44 np0005541913.localdomain sudo[158700]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzgsvtkmszmdgjnjerqnvaoyicavurzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667544.2425761-1011-173796486938867/AnsiballZ_file.py
Dec 02 09:25:44 np0005541913.localdomain sudo[158700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:44 np0005541913.localdomain python3.9[158702]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:25:44 np0005541913.localdomain sudo[158700]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:45 np0005541913.localdomain sudo[158792]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xklyruslmqysmdtfnhgtyqxkaudsnzvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667544.956208-1035-274216101953328/AnsiballZ_stat.py
Dec 02 09:25:45 np0005541913.localdomain sudo[158792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:45 np0005541913.localdomain python3.9[158794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:25:45 np0005541913.localdomain sudo[158792]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:45 np0005541913.localdomain sudo[158867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sctfsnyqsbbcloqhfvvfacicnitwdcgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667544.956208-1035-274216101953328/AnsiballZ_copy.py
Dec 02 09:25:45 np0005541913.localdomain sudo[158867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:45 np0005541913.localdomain python3.9[158869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667544.956208-1035-274216101953328/.source.json _original_basename=.tbd4n10c follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:46 np0005541913.localdomain sudo[158867]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61655 DF PROTO=TCP SPT=55884 DPT=9100 SEQ=1331993231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47916FA40000000001030307) 
Dec 02 09:25:46 np0005541913.localdomain sudo[158959]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wquvyudgkuddfbexkgimnjuskvyxqfxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667546.2173085-1080-162943200111075/AnsiballZ_file.py
Dec 02 09:25:46 np0005541913.localdomain sudo[158959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:46 np0005541913.localdomain python3.9[158961]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:46 np0005541913.localdomain sudo[158959]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:47 np0005541913.localdomain sudo[159051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmksfxbwilgmqkajutmgmlqxluirinxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667546.9194493-1104-209235547937346/AnsiballZ_stat.py
Dec 02 09:25:47 np0005541913.localdomain sudo[159051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:47 np0005541913.localdomain sudo[159051]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:47 np0005541913.localdomain sudo[159124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlaillvvlgmosvfqilnnstowgrhjsujf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667546.9194493-1104-209235547937346/AnsiballZ_copy.py
Dec 02 09:25:47 np0005541913.localdomain sudo[159124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:47 np0005541913.localdomain sudo[159124]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54787 DF PROTO=TCP SPT=42942 DPT=9105 SEQ=1689485826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47917BE40000000001030307) 
Dec 02 09:25:49 np0005541913.localdomain sudo[159216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrkrfpouavtabszbstwsntvfsoukbrha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667548.8905747-1155-31134736315610/AnsiballZ_container_config_data.py
Dec 02 09:25:49 np0005541913.localdomain sudo[159216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:49 np0005541913.localdomain python3.9[159218]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 02 09:25:49 np0005541913.localdomain sudo[159216]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:50 np0005541913.localdomain sudo[159308]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iapbtodcmzygbqcxgzyupevkgxbywwwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667549.6988375-1182-48983792432128/AnsiballZ_container_config_hash.py
Dec 02 09:25:50 np0005541913.localdomain sudo[159308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:50 np0005541913.localdomain python3.9[159310]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:25:50 np0005541913.localdomain sudo[159308]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:51 np0005541913.localdomain sudo[159400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dweveynwkuhaycwghicemezszwegdokc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667551.0306184-1209-7276657553941/AnsiballZ_podman_container_info.py
Dec 02 09:25:51 np0005541913.localdomain sudo[159400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:51 np0005541913.localdomain python3.9[159402]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 09:25:52 np0005541913.localdomain sudo[159400]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47284 DF PROTO=TCP SPT=59402 DPT=9101 SEQ=3481145413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791875F0000000001030307) 
Dec 02 09:25:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47286 DF PROTO=TCP SPT=59402 DPT=9101 SEQ=3481145413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479193650000000001030307) 
Dec 02 09:25:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:25:55 np0005541913.localdomain podman[159479]: 2025-12-02 09:25:55.435349595 +0000 UTC m=+0.074772593 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Dec 02 09:25:55 np0005541913.localdomain sudo[159541]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qewbkjjchdrcdcokcdwnsebmqaeugote ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764667555.0456533-1248-30093033419415/AnsiballZ_edpm_container_manage.py
Dec 02 09:25:55 np0005541913.localdomain sudo[159541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:55 np0005541913.localdomain podman[159479]: 2025-12-02 09:25:55.559117674 +0000 UTC m=+0.198540652 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:25:55 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:25:55 np0005541913.localdomain python3[159544]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:25:56 np0005541913.localdomain python3[159544]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9",
                                                                    "Digest": "sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:29:20.327314945Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 784141054,
                                                                    "VirtualSize": 784141054,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53/diff:/var/lib/containers/storage/overlay/2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:75abaaa40a93c0e2bba524b6f8d4eb5f1c4c9a33db70c892c7582ec5b0827e5e",
                                                                              "sha256:01f43f620d1ea2a9e584abe0cc14c336bedcf55765127c000d743f536dd36f25",
                                                                              "sha256:0bf5bd378602f28be423f5e84abddff3b103396fae3c167031b6e3fcfcf6f120"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:50.18897737Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:50.762138914Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:13.720608935Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:27.636630318Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:40.546186661Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:52.875291445Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:27:22.608862134Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:35.764559413Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:40.983506098Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:44.803537768Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:20.324920691Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:20.324983383Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:24.215761584Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 09:25:56 np0005541913.localdomain podman[159597]: 2025-12-02 09:25:56.255154238 +0000 UTC m=+0.081471723 container remove 1843037d120f89137a6810a44de8833dcc9687cfe6e1d91311f20b712d090d85 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd1544001d5773d0045aaf61439ef5e02'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 02 09:25:56 np0005541913.localdomain python3[159544]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Dec 02 09:25:56 np0005541913.localdomain podman[159611]: 
Dec 02 09:25:56 np0005541913.localdomain podman[159611]: 2025-12-02 09:25:56.367969044 +0000 UTC m=+0.092505836 container create 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true)
Dec 02 09:25:56 np0005541913.localdomain podman[159611]: 2025-12-02 09:25:56.321605189 +0000 UTC m=+0.046142001 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 09:25:56 np0005541913.localdomain python3[159544]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume 
/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 09:25:56 np0005541913.localdomain sudo[159541]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:56 np0005541913.localdomain sudo[159739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnipgnbrhtixwsifuzbjopgoubgifvpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667556.720452-1272-21384461600997/AnsiballZ_stat.py
Dec 02 09:25:56 np0005541913.localdomain sudo[159739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:57 np0005541913.localdomain python3.9[159741]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:25:57 np0005541913.localdomain sudo[159739]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:57 np0005541913.localdomain sudo[159833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lywhayffpfbsnonofbgfpfmtmwtabybl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667557.472794-1299-157382518984541/AnsiballZ_file.py
Dec 02 09:25:57 np0005541913.localdomain sudo[159833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:57 np0005541913.localdomain python3.9[159835]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:57 np0005541913.localdomain sudo[159833]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:58 np0005541913.localdomain sudo[159879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dylsvnlnjdtuykbpywshmxlmdgrjtxjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667557.472794-1299-157382518984541/AnsiballZ_stat.py
Dec 02 09:25:58 np0005541913.localdomain sudo[159879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:58 np0005541913.localdomain python3.9[159881]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:25:58 np0005541913.localdomain sudo[159879]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:58 np0005541913.localdomain sudo[159970]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxjiuvpnmmhcycmgbtsjurktsyjziaqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667558.395957-1299-261375459293728/AnsiballZ_copy.py
Dec 02 09:25:58 np0005541913.localdomain sudo[159970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:59 np0005541913.localdomain python3.9[159972]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764667558.395957-1299-261375459293728/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:25:59 np0005541913.localdomain sudo[159970]: pam_unix(sudo:session): session closed for user root
Dec 02 09:25:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47287 DF PROTO=TCP SPT=59402 DPT=9101 SEQ=3481145413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791A3240000000001030307) 
Dec 02 09:25:59 np0005541913.localdomain sudo[160016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmlngrsjqumwgxzitpcljnmlhxijypfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667558.395957-1299-261375459293728/AnsiballZ_systemd.py
Dec 02 09:25:59 np0005541913.localdomain sudo[160016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:25:59 np0005541913.localdomain python3.9[160018]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:25:59 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:26:00 np0005541913.localdomain systemd-rc-local-generator[160040]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:26:00 np0005541913.localdomain systemd-sysv-generator[160046]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:26:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:26:00 np0005541913.localdomain sudo[160016]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:00 np0005541913.localdomain sudo[160055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:26:00 np0005541913.localdomain sudo[160055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:26:00 np0005541913.localdomain sudo[160055]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:00 np0005541913.localdomain sudo[160083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:26:00 np0005541913.localdomain sudo[160083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:26:00 np0005541913.localdomain sudo[160128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-farmdtsqsucppitgcjmnybaqbihhubck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667558.395957-1299-261375459293728/AnsiballZ_systemd.py
Dec 02 09:26:00 np0005541913.localdomain sudo[160128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:00 np0005541913.localdomain python3.9[160130]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:26:00 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:26:00 np0005541913.localdomain systemd-rc-local-generator[160173]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:26:00 np0005541913.localdomain systemd-sysv-generator[160178]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:26:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:26:01 np0005541913.localdomain sudo[160083]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: Starting ovn_metadata_agent container...
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: tmp-crun.6cWxLT.mount: Deactivated successfully.
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:26:01 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f360a981842424a8567fc7a0067d84cd0b544fe5f86f8a9d8455b05b782d3b1b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 02 09:26:01 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f360a981842424a8567fc7a0067d84cd0b544fe5f86f8a9d8455b05b782d3b1b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:26:01 np0005541913.localdomain podman[160202]: 2025-12-02 09:26:01.27821411 +0000 UTC m=+0.159534024 container init 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + sudo -E kolla_set_configs
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:26:01 np0005541913.localdomain podman[160202]: 2025-12-02 09:26:01.317544298 +0000 UTC m=+0.198864222 container start 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 09:26:01 np0005541913.localdomain edpm-start-podman-container[160202]: ovn_metadata_agent
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Validating config file
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Copying service configuration files
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Writing out command to execute
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/595e1c9b-709c-41d2-9212-0b18b13291a8.pid.haproxy
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/595e1c9b-709c-41d2-9212-0b18b13291a8.conf
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: ++ cat /run_command
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + CMD=neutron-ovn-metadata-agent
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + ARGS=
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + sudo kolla_copy_cacerts
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: Running command: 'neutron-ovn-metadata-agent'
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + [[ ! -n '' ]]
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + . kolla_extend_start
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + umask 0022
Dec 02 09:26:01 np0005541913.localdomain ovn_metadata_agent[160216]: + exec neutron-ovn-metadata-agent
Dec 02 09:26:01 np0005541913.localdomain podman[160224]: 2025-12-02 09:26:01.405322298 +0000 UTC m=+0.081973616 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 09:26:01 np0005541913.localdomain edpm-start-podman-container[160201]: Creating additional drop-in dependency for "ovn_metadata_agent" (34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb)
Dec 02 09:26:01 np0005541913.localdomain podman[160224]: 2025-12-02 09:26:01.487095057 +0000 UTC m=+0.163746405 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:26:01 np0005541913.localdomain systemd-rc-local-generator[160286]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:26:01 np0005541913.localdomain systemd-sysv-generator[160292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:26:01 np0005541913.localdomain systemd[1]: Started ovn_metadata_agent container.
Dec 02 09:26:01 np0005541913.localdomain sudo[160128]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:02 np0005541913.localdomain sudo[160320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:26:02 np0005541913.localdomain sudo[160320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:26:02 np0005541913.localdomain sudo[160320]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.973 160221 INFO neutron.common.config [-] Logging enabled!
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.973 160221 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.973 160221 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.974 160221 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.975 160221 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.976 160221 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.977 160221 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.978 160221 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.979 160221 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.980 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.981 160221 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.982 160221 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.983 160221 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.984 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.985 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.986 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.987 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.988 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.989 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.990 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.991 160221 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.992 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.993 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.994 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.995 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.996 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.997 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.998 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:02.999 160221 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.000 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.001 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.002 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.003 160221 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.011 160221 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.012 160221 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.012 160221 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.012 160221 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.012 160221 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.034 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd (UUID: cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.049 160221 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.049 160221 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.049 160221 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.049 160221 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.051 160221 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.056 160221 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.063 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b2:03 192.168.0.102'], port_security=['fa:16:3e:26:b2:03 192.168.0.102'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.102/24', 'neutron:device_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005541913.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-595e1c9b-709c-41d2-9212-0b18b13291a8', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1725f11b-f73c-4c4f-b3d3-772d68fcc09e 23293c48-39ca-43a0-a462-ebc8626a7f6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23d69817-a35d-4528-880f-f329bfbd969c, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4a318f6a-b3c1-4690-8246-f7d046ccd64a) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.064 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], external_ids={'neutron:ovn-metadata-id': '6e7f49c3-b0f2-5de8-9eab-f67d22eadf7d', 'neutron:ovn-metadata-sb-cfg': '1'}, name=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, nb_cfg_timestamp=1764667502569, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.065 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4a318f6a-b3c1-4690-8246-f7d046ccd64a in datapath 595e1c9b-709c-41d2-9212-0b18b13291a8 bound to our chassis on insert
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.066 160221 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f40ecff2b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.066 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.067 160221 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.067 160221 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.067 160221 INFO oslo_service.service [-] Starting 1 workers
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.069 160221 DEBUG oslo_service.service [-] Started child 160335 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.072 160221 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 595e1c9b-709c-41d2-9212-0b18b13291a8
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.073 160221 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpfc5bqu1f/privsep.sock']
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.074 160335 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1946781'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.111 160335 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.112 160335 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.112 160335 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.115 160335 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.116 160335 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.129 160335 INFO eventlet.wsgi.server [-] (160335) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 02 09:26:03 np0005541913.localdomain sshd[154991]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:26:03 np0005541913.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Dec 02 09:26:03 np0005541913.localdomain systemd[1]: session-52.scope: Consumed 32.054s CPU time.
Dec 02 09:26:03 np0005541913.localdomain systemd-logind[757]: Session 52 logged out. Waiting for processes to exit.
Dec 02 09:26:03 np0005541913.localdomain systemd-logind[757]: Removed session 52.
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.702 160221 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.703 160221 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpfc5bqu1f/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.600 160340 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.606 160340 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.609 160340 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.610 160340 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160340
Dec 02 09:26:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:03.706 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[7f51690a-465a-4109-b7a5-6a9dd0c10206]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4895 DF PROTO=TCP SPT=38356 DPT=9102 SEQ=2803013807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791B51D0000000001030307) 
Dec 02 09:26:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61541 DF PROTO=TCP SPT=48654 DPT=9105 SEQ=1276867292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791B59E0000000001030307) 
Dec 02 09:26:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:04.140 160340 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:26:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:04.140 160340 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:26:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:04.140 160340 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:26:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:04.668 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[37691d4f-3e6e-466e-92ff-faef31c45e9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:04.670 160221 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp6quvgmqh/privsep.sock']
Dec 02 09:26:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:05.586 160221 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 09:26:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:05.587 160221 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6quvgmqh/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 02 09:26:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:05.364 160351 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 09:26:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:05.385 160351 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 09:26:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:05.400 160351 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 02 09:26:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:05.401 160351 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160351
Dec 02 09:26:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:05.590 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[82d7f118-e244-4378-92e2-39e53a3bcc00]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.072 160351 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.072 160351 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.072 160351 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.704 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[40b6147b-e62d-450f-9233-492ec02f4932]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.708 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[192a42ee-982d-415b-b3f6-1623bbffd307]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.746 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[a7107976-20c4-48a3-be58-49ada6436a9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.765 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[949eaf20-7b71-4041-9804-2aa6d1e747a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap595e1c9b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e8:5a:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647718, 'reachable_time': 39342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 160361, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.783 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[3d892970-efdf-442b-aab3-fff7c20bd661]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap595e1c9b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 647724, 'tstamp': 647724}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160362, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap595e1c9b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 647728, 'tstamp': 647728}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160362, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 647727, 'tstamp': 647727}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160362, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:5a19'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 647718, 'tstamp': 647718}], 
['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160362, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.844 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[01c01976-ce1e-4337-9de6-aa3c438883a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.846 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap595e1c9b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.851 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap595e1c9b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.852 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.852 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap595e1c9b-70, col_values=(('external_ids', {'iface-id': 'd6e7da3f-8574-49e0-8ba1-2f642b3cec92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.853 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:26:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:06.857 160221 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp1mspp3cm/privsep.sock']
Dec 02 09:26:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4897 DF PROTO=TCP SPT=38356 DPT=9102 SEQ=2803013807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791C1250000000001030307) 
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:08.339 160221 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:08.340 160221 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1mspp3cm/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:07.459 160371 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:07.487 160371 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:07.489 160371 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:07.489 160371 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160371
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:08.344 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[8d6082e2-2eee-40b3-83fa-3840a6de9e75]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:08.803 160371 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:08.803 160371 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:26:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:08.803 160371 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.315 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[d731f87c-b725-4236-a0a2-3ae00f1dc5ea]: (4, ['ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.318 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, column=external_ids, values=({'neutron:ovn-metadata-id': '6e7f49c3-b0f2-5de8-9eab-f67d22eadf7d'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.319 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.320 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.472 160221 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.473 160221 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.474 160221 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.474 160221 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.474 160221 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.475 160221 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.475 160221 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.476 160221 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.476 160221 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.477 160221 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.477 160221 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.477 160221 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.478 160221 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.478 160221 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.479 160221 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.479 160221 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.480 160221 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.480 160221 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.480 160221 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.481 160221 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.481 160221 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.482 160221 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.482 160221 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.483 160221 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.483 160221 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.484 160221 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.484 160221 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.485 160221 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.485 160221 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.486 160221 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.486 160221 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.486 160221 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.487 160221 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.487 160221 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.487 160221 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.488 160221 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.488 160221 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.489 160221 DEBUG oslo_service.service [-] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.490 160221 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.490 160221 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.490 160221 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.490 160221 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.491 160221 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.491 160221 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.491 160221 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.492 160221 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.492 160221 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.492 160221 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.492 160221 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.493 160221 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.493 160221 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.493 160221 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.494 160221 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.494 160221 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.494 160221 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.495 160221 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.495 160221 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.495 160221 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.495 160221 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.496 160221 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.496 160221 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.496 160221 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.497 160221 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.497 160221 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.497 160221 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.498 160221 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.498 160221 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.498 160221 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.498 160221 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.499 160221 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.499 160221 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.499 160221 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.500 160221 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.500 160221 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.500 160221 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.501 160221 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.501 160221 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.501 160221 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.501 160221 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.502 160221 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.502 160221 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.502 160221 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.503 160221 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.503 160221 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.503 160221 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.504 160221 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.504 160221 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.504 160221 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.504 160221 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.505 160221 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.505 160221 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.505 160221 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.506 160221 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.506 160221 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.506 160221 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.506 160221 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.507 160221 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.507 160221 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.507 160221 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.508 160221 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.508 160221 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.508 160221 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.509 160221 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.509 160221 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.509 160221 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.509 160221 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.510 160221 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.510 160221 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.510 160221 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.511 160221 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.511 160221 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.511 160221 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.512 160221 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.512 160221 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.512 160221 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.513 160221 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.513 160221 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.513 160221 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.513 160221 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.514 160221 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.514 160221 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.514 160221 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.515 160221 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.515 160221 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.515 160221 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.516 160221 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.516 160221 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.516 160221 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.517 160221 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.517 160221 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.517 160221 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.517 160221 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.518 160221 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.518 160221 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.518 160221 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.519 160221 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.519 160221 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.519 160221 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.520 160221 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.520 160221 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.520 160221 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.520 160221 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.521 160221 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.521 160221 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.521 160221 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.522 160221 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.522 160221 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.522 160221 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.522 160221 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.523 160221 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.523 160221 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.523 160221 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.524 160221 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.524 160221 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.524 160221 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.524 160221 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.525 160221 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.525 160221 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.525 160221 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.525 160221 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.526 160221 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.526 160221 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.526 160221 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.527 160221 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.527 160221 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.527 160221 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.528 160221 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.528 160221 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.528 160221 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.528 160221 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.529 160221 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.529 160221 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.529 160221 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.529 160221 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.530 160221 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.530 160221 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.530 160221 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.531 160221 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.531 160221 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.531 160221 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.531 160221 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.532 160221 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.532 160221 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.532 160221 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.533 160221 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.533 160221 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.533 160221 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.534 160221 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.534 160221 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.534 160221 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.534 160221 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.535 160221 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.535 160221 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.535 160221 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.536 160221 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.536 160221 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.536 160221 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.537 160221 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.537 160221 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.537 160221 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.537 160221 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.538 160221 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.538 160221 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.538 160221 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.539 160221 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.539 160221 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.539 160221 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.539 160221 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.540 160221 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.540 160221 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.540 160221 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.540 160221 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.540 160221 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.541 160221 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.541 160221 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.541 160221 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.541 160221 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.541 160221 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.541 160221 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.542 160221 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.542 160221 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.542 160221 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.542 160221 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.542 160221 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.543 160221 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.543 160221 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.543 160221 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.543 160221 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.543 160221 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.543 160221 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.544 160221 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.544 160221 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.544 160221 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.544 160221 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.544 160221 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.545 160221 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.545 160221 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.545 160221 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.545 160221 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.545 160221 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.546 160221 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.546 160221 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.546 160221 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.546 160221 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.546 160221 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.546 160221 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.547 160221 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.547 160221 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.547 160221 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.547 160221 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.547 160221 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.548 160221 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.548 160221 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.548 160221 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.548 160221 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.548 160221 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.549 160221 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.549 160221 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.549 160221 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.549 160221 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.549 160221 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.550 160221 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.550 160221 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.550 160221 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.550 160221 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.550 160221 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.550 160221 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.551 160221 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.551 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.551 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.551 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.551 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.552 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.552 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.552 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.552 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.552 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.553 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.553 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.553 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.553 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.553 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.554 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.554 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.554 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.554 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.554 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.555 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.555 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.555 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.555 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.555 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.555 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.556 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.556 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.556 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.556 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.556 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.557 160221 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.557 160221 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.557 160221 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.557 160221 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.557 160221 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:26:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:26:09.558 160221 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:26:09 np0005541913.localdomain sshd[160376]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:26:09 np0005541913.localdomain sshd[160376]: Accepted publickey for zuul from 192.168.122.30 port 35100 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:26:09 np0005541913.localdomain systemd-logind[757]: New session 53 of user zuul.
Dec 02 09:26:09 np0005541913.localdomain systemd[1]: Started Session 53 of User zuul.
Dec 02 09:26:09 np0005541913.localdomain sshd[160376]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:26:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44628 DF PROTO=TCP SPT=45144 DPT=9100 SEQ=2318657959 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791CD240000000001030307) 
Dec 02 09:26:10 np0005541913.localdomain python3.9[160469]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:26:12 np0005541913.localdomain sudo[160563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usrhawbhysczzjnzmwfalmdjwijskwfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667571.8895195-63-51146413826445/AnsiballZ_command.py
Dec 02 09:26:12 np0005541913.localdomain sudo[160563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:12 np0005541913.localdomain python3.9[160565]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:12 np0005541913.localdomain sudo[160563]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:13 np0005541913.localdomain sudo[160667]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvqahovjlfgdoqomaxhsnzbubbhigbih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667572.7718234-87-173776081724414/AnsiballZ_command.py
Dec 02 09:26:13 np0005541913.localdomain sudo[160667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57191 DF PROTO=TCP SPT=50190 DPT=9882 SEQ=2419112481 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791D9240000000001030307) 
Dec 02 09:26:13 np0005541913.localdomain python3.9[160669]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:13 np0005541913.localdomain systemd[1]: libpod-9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1.scope: Deactivated successfully.
Dec 02 09:26:13 np0005541913.localdomain podman[160670]: 2025-12-02 09:26:13.277411132 +0000 UTC m=+0.058042261 container died 9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 02 09:26:13 np0005541913.localdomain podman[160670]: 2025-12-02 09:26:13.32432964 +0000 UTC m=+0.104960759 container cleanup 9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, release=1761123044)
Dec 02 09:26:13 np0005541913.localdomain sudo[160667]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:13 np0005541913.localdomain podman[160684]: 2025-12-02 09:26:13.420729852 +0000 UTC m=+0.099679139 container remove 9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, distribution-scope=public, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team)
Dec 02 09:26:13 np0005541913.localdomain systemd[1]: libpod-conmon-9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1.scope: Deactivated successfully.
Dec 02 09:26:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-93ad9083e7cc3e7616303b5d13e7a101d6cbdaa325d96e32c757f24ef781f581-merged.mount: Deactivated successfully.
Dec 02 09:26:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9eb9f01827a63f0bcbf5f5e5d764a5c07546957865a50dee3f13116030c748e1-userdata-shm.mount: Deactivated successfully.
Dec 02 09:26:14 np0005541913.localdomain sudo[160789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tchdfncmrciudmcwvbjxqsqfdofxnblc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667573.943715-117-125976917277717/AnsiballZ_systemd_service.py
Dec 02 09:26:14 np0005541913.localdomain sudo[160789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:14 np0005541913.localdomain python3.9[160791]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:26:14 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:26:14 np0005541913.localdomain systemd-rc-local-generator[160817]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:26:14 np0005541913.localdomain systemd-sysv-generator[160823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:26:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:26:15 np0005541913.localdomain sudo[160789]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:15 np0005541913.localdomain python3.9[160918]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:26:15 np0005541913.localdomain network[160935]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:26:15 np0005541913.localdomain network[160936]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:26:15 np0005541913.localdomain network[160937]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:26:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44630 DF PROTO=TCP SPT=45144 DPT=9100 SEQ=2318657959 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791E4E50000000001030307) 
Dec 02 09:26:16 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:26:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61545 DF PROTO=TCP SPT=48654 DPT=9105 SEQ=1276867292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791F1E40000000001030307) 
Dec 02 09:26:20 np0005541913.localdomain sudo[161136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtkjxwenqbjipjifvpyqwmpwbalmkpzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667580.5392349-174-177984609441659/AnsiballZ_systemd_service.py
Dec 02 09:26:20 np0005541913.localdomain sudo[161136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:21 np0005541913.localdomain python3.9[161138]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:26:22 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:26:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19699 DF PROTO=TCP SPT=39292 DPT=9101 SEQ=1521133884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4791FC900000000001030307) 
Dec 02 09:26:22 np0005541913.localdomain systemd-sysv-generator[161170]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:26:22 np0005541913.localdomain systemd-rc-local-generator[161165]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:26:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:26:22 np0005541913.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Dec 02 09:26:22 np0005541913.localdomain sudo[161136]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:22 np0005541913.localdomain sudo[161267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puoxsgkidrzkvauvuhwgtckkllqjqnqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667582.6860428-174-261265926494804/AnsiballZ_systemd_service.py
Dec 02 09:26:22 np0005541913.localdomain sudo[161267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:23 np0005541913.localdomain python3.9[161269]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:26:23 np0005541913.localdomain sudo[161267]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:23 np0005541913.localdomain sudo[161360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgonmnjatysoybljmxsopmlvofpihfau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667583.3864944-174-2694116813411/AnsiballZ_systemd_service.py
Dec 02 09:26:23 np0005541913.localdomain sudo[161360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:23 np0005541913.localdomain python3.9[161362]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:26:24 np0005541913.localdomain sudo[161360]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:24 np0005541913.localdomain sudo[161453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aogvdtjjuxbbcgmohnsrktuzkujwsqvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667584.1233563-174-269735995156163/AnsiballZ_systemd_service.py
Dec 02 09:26:24 np0005541913.localdomain sudo[161453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:24 np0005541913.localdomain python3.9[161455]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:26:24 np0005541913.localdomain sudo[161453]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:25 np0005541913.localdomain sudo[161546]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjefodjwhrsyrvbdudwkiejprvjpcaei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667584.8229995-174-128826265272542/AnsiballZ_systemd_service.py
Dec 02 09:26:25 np0005541913.localdomain sudo[161546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19701 DF PROTO=TCP SPT=39292 DPT=9101 SEQ=1521133884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479208A40000000001030307) 
Dec 02 09:26:25 np0005541913.localdomain python3.9[161548]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:26:25 np0005541913.localdomain sudo[161546]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:26 np0005541913.localdomain sudo[161639]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgzllyeoavtfrnlywzbbwnlihpboycis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667585.547805-174-39702088775773/AnsiballZ_systemd_service.py
Dec 02 09:26:26 np0005541913.localdomain sudo[161639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:26:26 np0005541913.localdomain podman[161642]: 2025-12-02 09:26:26.277294158 +0000 UTC m=+0.086102123 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:26:26 np0005541913.localdomain podman[161642]: 2025-12-02 09:26:26.316930823 +0000 UTC m=+0.125738818 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 09:26:26 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:26:26 np0005541913.localdomain python3.9[161641]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:26:26 np0005541913.localdomain sudo[161639]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:26 np0005541913.localdomain sudo[161758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsnairdwqpxzbxednmwgjjmuoifxlbbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667586.6434147-174-3199837123041/AnsiballZ_systemd_service.py
Dec 02 09:26:26 np0005541913.localdomain sudo[161758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:27 np0005541913.localdomain python3.9[161760]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:26:27 np0005541913.localdomain sudo[161758]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:29 np0005541913.localdomain sudo[161851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-worjvkhbapbsbpdjhtmmifuuegtjvves ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667588.8672655-330-57501126362980/AnsiballZ_file.py
Dec 02 09:26:29 np0005541913.localdomain sudo[161851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19702 DF PROTO=TCP SPT=39292 DPT=9101 SEQ=1521133884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479218640000000001030307) 
Dec 02 09:26:29 np0005541913.localdomain python3.9[161853]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:29 np0005541913.localdomain sudo[161851]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:30 np0005541913.localdomain sudo[161943]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slvxirsizuqfmltkdlsptgjzqlealhtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667590.113038-330-137144691741794/AnsiballZ_file.py
Dec 02 09:26:30 np0005541913.localdomain sudo[161943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:30 np0005541913.localdomain python3.9[161945]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:30 np0005541913.localdomain sudo[161943]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:30 np0005541913.localdomain sudo[162035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvgwkyxitdmfkmvmyrobbxxrvfbvqskw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667590.7351098-330-133589797090727/AnsiballZ_file.py
Dec 02 09:26:30 np0005541913.localdomain sudo[162035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:31 np0005541913.localdomain python3.9[162037]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:31 np0005541913.localdomain sudo[162035]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:32 np0005541913.localdomain sudo[162127]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpeubmsnrqctesvyqjwjnxmxmaglkpwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667591.3491464-330-53378959515399/AnsiballZ_file.py
Dec 02 09:26:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:26:32 np0005541913.localdomain sudo[162127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:32 np0005541913.localdomain systemd[1]: tmp-crun.AHl9rr.mount: Deactivated successfully.
Dec 02 09:26:32 np0005541913.localdomain podman[162129]: 2025-12-02 09:26:32.297012653 +0000 UTC m=+0.095636124 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:26:32 np0005541913.localdomain podman[162129]: 2025-12-02 09:26:32.304043058 +0000 UTC m=+0.102666569 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 02 09:26:32 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:26:32 np0005541913.localdomain python3.9[162130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:32 np0005541913.localdomain sudo[162127]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:32 np0005541913.localdomain sudo[162237]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pprfpluxniuctbilcjttbzrupmblkqsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667592.529877-330-51290517334946/AnsiballZ_file.py
Dec 02 09:26:32 np0005541913.localdomain sudo[162237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:32 np0005541913.localdomain python3.9[162239]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:33 np0005541913.localdomain sudo[162237]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:33 np0005541913.localdomain sudo[162329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooqlbqcddwgjchmebudgullvqzwjpjxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667593.094167-330-81454014650709/AnsiballZ_file.py
Dec 02 09:26:33 np0005541913.localdomain sudo[162329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:33 np0005541913.localdomain python3.9[162331]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:33 np0005541913.localdomain sudo[162329]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:33 np0005541913.localdomain sudo[162421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olkxetbfwxgbamarnuesovgbxcevlfhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667593.6348083-330-126773945262200/AnsiballZ_file.py
Dec 02 09:26:33 np0005541913.localdomain sudo[162421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25749 DF PROTO=TCP SPT=52052 DPT=9102 SEQ=3936899043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47922A4E0000000001030307) 
Dec 02 09:26:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26660 DF PROTO=TCP SPT=42600 DPT=9105 SEQ=803090811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47922ACF0000000001030307) 
Dec 02 09:26:34 np0005541913.localdomain python3.9[162423]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:34 np0005541913.localdomain sudo[162421]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:34 np0005541913.localdomain sudo[162513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypslbtndjxzmhgttviszgynxubvtyafj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667594.3885584-480-25482951304472/AnsiballZ_file.py
Dec 02 09:26:34 np0005541913.localdomain sudo[162513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:34 np0005541913.localdomain python3.9[162515]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:34 np0005541913.localdomain sudo[162513]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:35 np0005541913.localdomain sudo[162605]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcnopubfmnhybwksnqdzazniesiwbjmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667594.9549613-480-254484334447787/AnsiballZ_file.py
Dec 02 09:26:35 np0005541913.localdomain sudo[162605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:35 np0005541913.localdomain python3.9[162607]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:35 np0005541913.localdomain sudo[162605]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:35 np0005541913.localdomain sudo[162697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvzaemnoobippqqpkfkitfzwvavcicww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667595.5384858-480-187637735400115/AnsiballZ_file.py
Dec 02 09:26:35 np0005541913.localdomain sudo[162697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:36 np0005541913.localdomain python3.9[162699]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:36 np0005541913.localdomain sudo[162697]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:36 np0005541913.localdomain sudo[162789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aoskvtsywwognsriqbsdirjkmezepsrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667596.111925-480-272028312762765/AnsiballZ_file.py
Dec 02 09:26:36 np0005541913.localdomain sudo[162789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:36 np0005541913.localdomain python3.9[162791]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:36 np0005541913.localdomain sudo[162789]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25751 DF PROTO=TCP SPT=52052 DPT=9102 SEQ=3936899043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479236640000000001030307) 
Dec 02 09:26:37 np0005541913.localdomain sudo[162881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfqzilpcimqdjrnjkmznilttynciyrrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667596.8510787-480-280448833487920/AnsiballZ_file.py
Dec 02 09:26:37 np0005541913.localdomain sudo[162881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:37 np0005541913.localdomain python3.9[162883]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:37 np0005541913.localdomain sudo[162881]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:37 np0005541913.localdomain sudo[162973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stkyepgacnzfoctloiarrlbvsojkrfjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667597.4533389-480-88881213719889/AnsiballZ_file.py
Dec 02 09:26:37 np0005541913.localdomain sudo[162973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:37 np0005541913.localdomain python3.9[162975]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:37 np0005541913.localdomain sudo[162973]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:38 np0005541913.localdomain sudo[163065]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmfvrmoudcwftosbaalpheozbtzuoaej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667598.012454-480-192982363766987/AnsiballZ_file.py
Dec 02 09:26:38 np0005541913.localdomain sudo[163065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:38 np0005541913.localdomain python3.9[163067]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:26:38 np0005541913.localdomain sudo[163065]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:39 np0005541913.localdomain sudo[163157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icjxwlujrmdiseyyivlxpoelemmnpgua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667598.7691474-633-129499074612500/AnsiballZ_command.py
Dec 02 09:26:39 np0005541913.localdomain sudo[163157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:39 np0005541913.localdomain python3.9[163159]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:39 np0005541913.localdomain sudo[163157]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:39 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28819 DF PROTO=TCP SPT=58718 DPT=9882 SEQ=1619514852 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479241E40000000001030307) 
Dec 02 09:26:40 np0005541913.localdomain python3.9[163251]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:26:40 np0005541913.localdomain sudo[163341]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asdsxciligsdlczksbjvzdikxztllziq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667600.518756-687-210518413581282/AnsiballZ_systemd_service.py
Dec 02 09:26:40 np0005541913.localdomain sudo[163341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:41 np0005541913.localdomain python3.9[163343]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:26:41 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:26:41 np0005541913.localdomain systemd-rc-local-generator[163366]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:26:41 np0005541913.localdomain systemd-sysv-generator[163371]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:26:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:26:41 np0005541913.localdomain sudo[163341]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:42 np0005541913.localdomain sudo[163468]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aobsevbrhaucccbunbqwsimlekccjrul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667601.6068094-711-137247580288814/AnsiballZ_command.py
Dec 02 09:26:42 np0005541913.localdomain sudo[163468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:42 np0005541913.localdomain python3.9[163470]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:42 np0005541913.localdomain sudo[163468]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61658 DF PROTO=TCP SPT=55884 DPT=9100 SEQ=1331993231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47924DE40000000001030307) 
Dec 02 09:26:43 np0005541913.localdomain sudo[163561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sckzlxhjkdhbtsxatiorznkzpcycvhns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667602.994323-711-135198364445565/AnsiballZ_command.py
Dec 02 09:26:43 np0005541913.localdomain sudo[163561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:43 np0005541913.localdomain python3.9[163563]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:43 np0005541913.localdomain sudo[163561]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:44 np0005541913.localdomain sudo[163654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnfxjdncgpqvjmcbxdneczgkqoprccgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667603.8045228-711-233130622083380/AnsiballZ_command.py
Dec 02 09:26:44 np0005541913.localdomain sudo[163654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:44 np0005541913.localdomain python3.9[163656]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:44 np0005541913.localdomain sudo[163654]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:44 np0005541913.localdomain sudo[163747]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwwvkpblopkonpagtesmectsldjxekpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667604.3927677-711-121311629347300/AnsiballZ_command.py
Dec 02 09:26:44 np0005541913.localdomain sudo[163747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:44 np0005541913.localdomain python3.9[163749]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:44 np0005541913.localdomain sudo[163747]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:45 np0005541913.localdomain sudo[163840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydmijhypwummlyyxckhyzixqtipmbdgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667605.0076752-711-60818342041589/AnsiballZ_command.py
Dec 02 09:26:45 np0005541913.localdomain sudo[163840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:45 np0005541913.localdomain python3.9[163842]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:45 np0005541913.localdomain sudo[163840]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:45 np0005541913.localdomain sudo[163933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krtndfsomtifzktkfacqwpjvwuxxdqlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667605.6461735-711-72085002860130/AnsiballZ_command.py
Dec 02 09:26:45 np0005541913.localdomain sudo[163933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:46 np0005541913.localdomain python3.9[163935]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:46 np0005541913.localdomain sudo[163933]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23378 DF PROTO=TCP SPT=35140 DPT=9100 SEQ=816941798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47925A240000000001030307) 
Dec 02 09:26:46 np0005541913.localdomain sudo[164026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxwtideenfyvttxjsncfepdbvcwldawa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667606.162001-711-77442966801451/AnsiballZ_command.py
Dec 02 09:26:46 np0005541913.localdomain sudo[164026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:46 np0005541913.localdomain python3.9[164028]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:26:46 np0005541913.localdomain sudo[164026]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:48 np0005541913.localdomain sudo[164119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxbgmxfwozucmeqmutnnucmosbmeienk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667607.6759136-874-265304640170037/AnsiballZ_getent.py
Dec 02 09:26:48 np0005541913.localdomain sudo[164119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:48 np0005541913.localdomain python3.9[164121]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 02 09:26:48 np0005541913.localdomain sudo[164119]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:49 np0005541913.localdomain sudo[164212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afrdprktbxxkdfwbhoxthvesitpaidyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667608.6586623-898-14267152716561/AnsiballZ_group.py
Dec 02 09:26:49 np0005541913.localdomain sudo[164212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25753 DF PROTO=TCP SPT=52052 DPT=9102 SEQ=3936899043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479265E40000000001030307) 
Dec 02 09:26:49 np0005541913.localdomain python3.9[164214]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 09:26:49 np0005541913.localdomain groupadd[164216]: group added to /etc/group: name=libvirt, GID=42473
Dec 02 09:26:49 np0005541913.localdomain groupadd[164216]: group added to /etc/gshadow: name=libvirt
Dec 02 09:26:49 np0005541913.localdomain groupadd[164216]: new group: name=libvirt, GID=42473
Dec 02 09:26:49 np0005541913.localdomain sudo[164212]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:50 np0005541913.localdomain sudo[164311]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbiyvjsaugwqjafnzvkwrkvvkixwzaqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667609.5543206-921-63542644655072/AnsiballZ_user.py
Dec 02 09:26:50 np0005541913.localdomain sudo[164311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:50 np0005541913.localdomain python3.9[164313]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005541913.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 09:26:50 np0005541913.localdomain useradd[164315]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Dec 02 09:26:50 np0005541913.localdomain sudo[164311]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:51 np0005541913.localdomain sudo[164411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvdkipfmzrztfroektabgvrukddhitly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667611.2854445-954-7554148040101/AnsiballZ_setup.py
Dec 02 09:26:51 np0005541913.localdomain sudo[164411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:51 np0005541913.localdomain python3.9[164413]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:26:52 np0005541913.localdomain sudo[164411]: pam_unix(sudo:session): session closed for user root
Dec 02 09:26:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13365 DF PROTO=TCP SPT=49588 DPT=9101 SEQ=2064776806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479271C00000000001030307) 
Dec 02 09:26:53 np0005541913.localdomain sudo[164465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qohmcliyexcjxqulmngzexqjsfrstprx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667611.2854445-954-7554148040101/AnsiballZ_dnf.py
Dec 02 09:26:53 np0005541913.localdomain sudo[164465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:26:53 np0005541913.localdomain python3.9[164467]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:26:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13367 DF PROTO=TCP SPT=49588 DPT=9101 SEQ=2064776806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47927DE40000000001030307) 
Dec 02 09:26:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:26:56 np0005541913.localdomain podman[164470]: 2025-12-02 09:26:56.538010526 +0000 UTC m=+0.160081053 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Dec 02 09:26:56 np0005541913.localdomain podman[164470]: 2025-12-02 09:26:56.584527222 +0000 UTC m=+0.206597739 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:26:56 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:26:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13368 DF PROTO=TCP SPT=49588 DPT=9101 SEQ=2064776806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47928DA50000000001030307) 
Dec 02 09:27:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:27:02 np0005541913.localdomain podman[164564]: 2025-12-02 09:27:02.460069856 +0000 UTC m=+0.096580118 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 09:27:02 np0005541913.localdomain podman[164564]: 2025-12-02 09:27:02.46549582 +0000 UTC m=+0.102006052 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:27:02 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:27:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:27:03.005 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:27:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:27:03.006 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:27:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:27:03.008 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:27:03 np0005541913.localdomain sudo[164582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:27:03 np0005541913.localdomain sudo[164582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:27:03 np0005541913.localdomain sudo[164582]: pam_unix(sudo:session): session closed for user root
Dec 02 09:27:03 np0005541913.localdomain sudo[164600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:27:03 np0005541913.localdomain sudo[164600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:27:03 np0005541913.localdomain sudo[164600]: pam_unix(sudo:session): session closed for user root
Dec 02 09:27:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45502 DF PROTO=TCP SPT=46298 DPT=9102 SEQ=3016421781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47929F7D0000000001030307) 
Dec 02 09:27:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44373 DF PROTO=TCP SPT=49598 DPT=9105 SEQ=1362833837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47929FFF0000000001030307) 
Dec 02 09:27:04 np0005541913.localdomain sudo[164649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:27:04 np0005541913.localdomain sudo[164649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:27:04 np0005541913.localdomain sudo[164649]: pam_unix(sudo:session): session closed for user root
Dec 02 09:27:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45504 DF PROTO=TCP SPT=46298 DPT=9102 SEQ=3016421781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4792ABA40000000001030307) 
Dec 02 09:27:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63715 DF PROTO=TCP SPT=33978 DPT=9100 SEQ=1861443308 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4792B7650000000001030307) 
Dec 02 09:27:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:27:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 4776 writes, 21K keys, 4776 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4776 writes, 569 syncs, 8.39 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab13610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5581cab122d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 02 09:27:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36472 DF PROTO=TCP SPT=34670 DPT=9882 SEQ=1097936100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4792C3A50000000001030307) 
Dec 02 09:27:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:27:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.2 total, 600.0 interval
                                                          Cumulative writes: 5722 writes, 25K keys, 5722 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5722 writes, 780 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.021       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.029       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd3610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x565243dd22d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 02 09:27:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63717 DF PROTO=TCP SPT=33978 DPT=9100 SEQ=1861443308 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4792CF240000000001030307) 
Dec 02 09:27:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44377 DF PROTO=TCP SPT=49598 DPT=9105 SEQ=1362833837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4792DBE40000000001030307) 
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  Converting 2759 SID table entries...
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:27:21 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:27:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61846 DF PROTO=TCP SPT=38060 DPT=9101 SEQ=933683415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4792E6F00000000001030307) 
Dec 02 09:27:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61848 DF PROTO=TCP SPT=38060 DPT=9101 SEQ=933683415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4792F2E40000000001030307) 
Dec 02 09:27:27 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=19 res=1
Dec 02 09:27:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:27:27 np0005541913.localdomain podman[165712]: 2025-12-02 09:27:27.500626245 +0000 UTC m=+0.114048534 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:27:27 np0005541913.localdomain podman[165712]: 2025-12-02 09:27:27.570042341 +0000 UTC m=+0.183464650 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:27:27 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:27:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61849 DF PROTO=TCP SPT=38060 DPT=9101 SEQ=933683415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479302A40000000001030307) 
Dec 02 09:27:32 np0005541913.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 02 09:27:32 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:27:32 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:27:32 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:27:32 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:27:32 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:27:32 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:27:32 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:27:33 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Dec 02 09:27:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:27:33 np0005541913.localdomain systemd[1]: tmp-crun.FClIUm.mount: Deactivated successfully.
Dec 02 09:27:33 np0005541913.localdomain podman[165746]: 2025-12-02 09:27:33.515221152 +0000 UTC m=+0.129346106 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:27:33 np0005541913.localdomain podman[165746]: 2025-12-02 09:27:33.550192002 +0000 UTC m=+0.164316976 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:27:33 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:27:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58257 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3079829525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479314AD0000000001030307) 
Dec 02 09:27:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12312 DF PROTO=TCP SPT=36842 DPT=9105 SEQ=1737036643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793152E0000000001030307) 
Dec 02 09:27:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58259 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3079829525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479320A50000000001030307) 
Dec 02 09:27:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25145 DF PROTO=TCP SPT=34572 DPT=9100 SEQ=3650209280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47932CA40000000001030307) 
Dec 02 09:27:42 np0005541913.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 02 09:27:42 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:27:42 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:27:42 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:27:42 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:27:42 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:27:42 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:27:42 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:27:42 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23381 DF PROTO=TCP SPT=35140 DPT=9100 SEQ=816941798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479337E40000000001030307) 
Dec 02 09:27:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25147 DF PROTO=TCP SPT=34572 DPT=9100 SEQ=3650209280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479344640000000001030307) 
Dec 02 09:27:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58261 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3079829525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47934FE50000000001030307) 
Dec 02 09:27:51 np0005541913.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 02 09:27:51 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:27:51 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:27:51 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:27:51 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:27:51 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:27:51 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:27:51 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:27:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3406 DF PROTO=TCP SPT=37430 DPT=9101 SEQ=2658431250 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47935C200000000001030307) 
Dec 02 09:27:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3408 DF PROTO=TCP SPT=37430 DPT=9101 SEQ=2658431250 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479368240000000001030307) 
Dec 02 09:27:58 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=22 res=1
Dec 02 09:27:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:27:58 np0005541913.localdomain podman[165785]: 2025-12-02 09:27:58.490597933 +0000 UTC m=+0.108261180 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 02 09:27:58 np0005541913.localdomain podman[165785]: 2025-12-02 09:27:58.576554082 +0000 UTC m=+0.194217329 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:27:58 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:27:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3409 DF PROTO=TCP SPT=37430 DPT=9101 SEQ=2658431250 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479377E40000000001030307) 
Dec 02 09:28:02 np0005541913.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 02 09:28:02 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:28:02 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:28:02 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:28:02 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:28:02 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:28:02 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:28:02 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:28:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:28:03.006 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:28:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:28:03.008 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:28:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:28:03.011 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:28:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53984 DF PROTO=TCP SPT=47918 DPT=9102 SEQ=3456281276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479389DE0000000001030307) 
Dec 02 09:28:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65302 DF PROTO=TCP SPT=43668 DPT=9105 SEQ=1511092591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47938A5E0000000001030307) 
Dec 02 09:28:04 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=23 res=1
Dec 02 09:28:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:28:04 np0005541913.localdomain podman[165819]: 2025-12-02 09:28:04.462529264 +0000 UTC m=+0.089171657 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:28:04 np0005541913.localdomain podman[165819]: 2025-12-02 09:28:04.472024818 +0000 UTC m=+0.098667241 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 09:28:04 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:28:04 np0005541913.localdomain sudo[165837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:28:04 np0005541913.localdomain sudo[165837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:28:04 np0005541913.localdomain sudo[165837]: pam_unix(sudo:session): session closed for user root
Dec 02 09:28:04 np0005541913.localdomain sudo[165855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:28:04 np0005541913.localdomain sudo[165855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:28:05 np0005541913.localdomain sudo[165855]: pam_unix(sudo:session): session closed for user root
Dec 02 09:28:06 np0005541913.localdomain sudo[165905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:28:06 np0005541913.localdomain sudo[165905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:28:06 np0005541913.localdomain sudo[165905]: pam_unix(sudo:session): session closed for user root
Dec 02 09:28:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53986 DF PROTO=TCP SPT=47918 DPT=9102 SEQ=3456281276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479395E50000000001030307) 
Dec 02 09:28:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2404 DF PROTO=TCP SPT=51408 DPT=9100 SEQ=862885942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793A1E40000000001030307) 
Dec 02 09:28:11 np0005541913.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 02 09:28:11 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:28:11 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:28:11 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:28:11 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:28:11 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:28:11 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:28:11 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:28:12 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:28:12 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=24 res=1
Dec 02 09:28:12 np0005541913.localdomain systemd-rc-local-generator[165952]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:28:12 np0005541913.localdomain systemd-sysv-generator[165957]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:28:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:28:12 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:28:12 np0005541913.localdomain systemd-rc-local-generator[165991]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:28:12 np0005541913.localdomain systemd-sysv-generator[165996]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:28:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:28:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2755 DF PROTO=TCP SPT=52136 DPT=9882 SEQ=940349716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793ADE40000000001030307) 
Dec 02 09:28:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2406 DF PROTO=TCP SPT=51408 DPT=9100 SEQ=862885942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793B9A50000000001030307) 
Dec 02 09:28:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65306 DF PROTO=TCP SPT=43668 DPT=9105 SEQ=1511092591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793C5E40000000001030307) 
Dec 02 09:28:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63876 DF PROTO=TCP SPT=51204 DPT=9101 SEQ=3510173948 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793D1500000000001030307) 
Dec 02 09:28:22 np0005541913.localdomain kernel: SELinux:  Converting 2763 SID table entries...
Dec 02 09:28:22 np0005541913.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 09:28:22 np0005541913.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 02 09:28:22 np0005541913.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 09:28:22 np0005541913.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 02 09:28:22 np0005541913.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 09:28:22 np0005541913.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 09:28:22 np0005541913.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 09:28:23 np0005541913.localdomain groupadd[166021]: group added to /etc/group: name=clevis, GID=985
Dec 02 09:28:23 np0005541913.localdomain groupadd[166021]: group added to /etc/gshadow: name=clevis
Dec 02 09:28:23 np0005541913.localdomain groupadd[166021]: new group: name=clevis, GID=985
Dec 02 09:28:23 np0005541913.localdomain useradd[166028]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 02 09:28:23 np0005541913.localdomain usermod[166038]: add 'clevis' to group 'tss'
Dec 02 09:28:23 np0005541913.localdomain usermod[166038]: add 'clevis' to shadow group 'tss'
Dec 02 09:28:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63878 DF PROTO=TCP SPT=51204 DPT=9101 SEQ=3510173948 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793DD640000000001030307) 
Dec 02 09:28:26 np0005541913.localdomain groupadd[166060]: group added to /etc/group: name=dnsmasq, GID=984
Dec 02 09:28:26 np0005541913.localdomain groupadd[166060]: group added to /etc/gshadow: name=dnsmasq
Dec 02 09:28:26 np0005541913.localdomain groupadd[166060]: new group: name=dnsmasq, GID=984
Dec 02 09:28:27 np0005541913.localdomain useradd[166067]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 02 09:28:27 np0005541913.localdomain dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Dec 02 09:28:27 np0005541913.localdomain dbus-broker-launch[748]: avc:  op=load_policy lsm=selinux seqno=25 res=1
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Reloading rules
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Collecting garbage unconditionally...
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Finished loading, compiling and executing 5 rules
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Reloading rules
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Collecting garbage unconditionally...
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 09:28:28 np0005541913.localdomain polkitd[1037]: Finished loading, compiling and executing 5 rules
Dec 02 09:28:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63879 DF PROTO=TCP SPT=51204 DPT=9101 SEQ=3510173948 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793ED250000000001030307) 
Dec 02 09:28:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:28:29 np0005541913.localdomain podman[166203]: 2025-12-02 09:28:29.543335052 +0000 UTC m=+0.128965689 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:28:29 np0005541913.localdomain podman[166203]: 2025-12-02 09:28:29.619574394 +0000 UTC m=+0.205204961 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 09:28:29 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:28:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4721 DF PROTO=TCP SPT=56644 DPT=9102 SEQ=1960687409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793FF0D0000000001030307) 
Dec 02 09:28:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47866 DF PROTO=TCP SPT=34876 DPT=9105 SEQ=2819316595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4793FF8F0000000001030307) 
Dec 02 09:28:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:28:35 np0005541913.localdomain systemd[1]: tmp-crun.1MDhJ6.mount: Deactivated successfully.
Dec 02 09:28:35 np0005541913.localdomain podman[166273]: 2025-12-02 09:28:35.549470795 +0000 UTC m=+0.152978398 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 02 09:28:35 np0005541913.localdomain podman[166273]: 2025-12-02 09:28:35.578540951 +0000 UTC m=+0.182048474 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 09:28:35 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:28:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4723 DF PROTO=TCP SPT=56644 DPT=9102 SEQ=1960687409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47940B240000000001030307) 
Dec 02 09:28:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63458 DF PROTO=TCP SPT=52066 DPT=9100 SEQ=2095280239 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479416E40000000001030307) 
Dec 02 09:28:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40791 DF PROTO=TCP SPT=60150 DPT=9882 SEQ=887621075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479423240000000001030307) 
Dec 02 09:28:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63460 DF PROTO=TCP SPT=52066 DPT=9100 SEQ=2095280239 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47942EA40000000001030307) 
Dec 02 09:28:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4725 DF PROTO=TCP SPT=56644 DPT=9102 SEQ=1960687409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47943BE50000000001030307) 
Dec 02 09:28:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48416 DF PROTO=TCP SPT=38684 DPT=9101 SEQ=3557542384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479446800000000001030307) 
Dec 02 09:28:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48418 DF PROTO=TCP SPT=38684 DPT=9101 SEQ=3557542384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479452A40000000001030307) 
Dec 02 09:28:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48419 DF PROTO=TCP SPT=38684 DPT=9101 SEQ=3557542384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479462640000000001030307) 
Dec 02 09:29:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:29:01 np0005541913.localdomain podman[179628]: 2025-12-02 09:29:01.065485415 +0000 UTC m=+0.702095309 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:29:01 np0005541913.localdomain podman[179628]: 2025-12-02 09:29:01.100950325 +0000 UTC m=+0.737560209 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 09:29:01 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:29:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:29:03.008 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:29:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:29:03.008 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:29:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:29:03.010 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:29:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23394 DF PROTO=TCP SPT=40874 DPT=9102 SEQ=1648017291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794743D0000000001030307) 
Dec 02 09:29:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30404 DF PROTO=TCP SPT=47810 DPT=9105 SEQ=2704965570 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479474BF0000000001030307) 
Dec 02 09:29:06 np0005541913.localdomain sudo[183111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:29:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:29:06 np0005541913.localdomain sudo[183111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:29:06 np0005541913.localdomain sudo[183111]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:06 np0005541913.localdomain sudo[183135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:29:06 np0005541913.localdomain sudo[183135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:29:06 np0005541913.localdomain podman[183128]: 2025-12-02 09:29:06.341828571 +0000 UTC m=+0.071537886 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:29:06 np0005541913.localdomain podman[183128]: 2025-12-02 09:29:06.376099588 +0000 UTC m=+0.105808863 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 02 09:29:06 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:29:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23396 DF PROTO=TCP SPT=40874 DPT=9102 SEQ=1648017291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479480640000000001030307) 
Dec 02 09:29:07 np0005541913.localdomain sudo[183135]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:07 np0005541913.localdomain groupadd[183202]: group added to /etc/group: name=ceph, GID=167
Dec 02 09:29:07 np0005541913.localdomain groupadd[183202]: group added to /etc/gshadow: name=ceph
Dec 02 09:29:07 np0005541913.localdomain groupadd[183202]: new group: name=ceph, GID=167
Dec 02 09:29:09 np0005541913.localdomain useradd[183208]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 02 09:29:09 np0005541913.localdomain sudo[183420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:29:09 np0005541913.localdomain sudo[183420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:29:09 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2758 DF PROTO=TCP SPT=52136 DPT=9882 SEQ=940349716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47948BE40000000001030307) 
Dec 02 09:29:09 np0005541913.localdomain sudo[183420]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:12 np0005541913.localdomain sshd[119826]: Received signal 15; terminating.
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: sshd.service: Consumed 1.080s CPU time, read 32.0K from disk, written 0B to disk.
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 02 09:29:12 np0005541913.localdomain sshd[184098]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:29:12 np0005541913.localdomain sshd[184098]: Server listening on 0.0.0.0 port 22.
Dec 02 09:29:12 np0005541913.localdomain sshd[184098]: Server listening on :: port 22.
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2409 DF PROTO=TCP SPT=51408 DPT=9100 SEQ=862885942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479497E50000000001030307) 
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:14 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 09:29:14 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 09:29:14 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:15 np0005541913.localdomain systemd-rc-local-generator[184323]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:15 np0005541913.localdomain systemd-sysv-generator[184328]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 09:29:15 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 09:29:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60019 DF PROTO=TCP SPT=35648 DPT=9100 SEQ=2819697767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794A3E40000000001030307) 
Dec 02 09:29:18 np0005541913.localdomain sudo[164465]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23398 DF PROTO=TCP SPT=40874 DPT=9102 SEQ=1648017291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794AFE50000000001030307) 
Dec 02 09:29:19 np0005541913.localdomain sudo[189385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzxnwnqqedriemloachlcbkxpqfwuwmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667758.6799352-990-202246427302092/AnsiballZ_systemd.py
Dec 02 09:29:19 np0005541913.localdomain sudo[189385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:19 np0005541913.localdomain python3.9[189413]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:19 np0005541913.localdomain systemd-sysv-generator[189772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:19 np0005541913.localdomain systemd-rc-local-generator[189765]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:20 np0005541913.localdomain sudo[189385]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:20 np0005541913.localdomain sudo[190175]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzqqauvzsyodizwvsshxmeoxlqrgbptx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667760.1603158-990-57860436811722/AnsiballZ_systemd.py
Dec 02 09:29:20 np0005541913.localdomain sudo[190175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:20 np0005541913.localdomain python3.9[190195]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:29:20 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:20 np0005541913.localdomain systemd-rc-local-generator[190308]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:20 np0005541913.localdomain systemd-sysv-generator[190312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:21 np0005541913.localdomain sudo[190175]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:21 np0005541913.localdomain sudo[190653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrvdltdvnidtxshjndfjghmploksrcic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667761.3188262-990-128630090859862/AnsiballZ_systemd.py
Dec 02 09:29:21 np0005541913.localdomain sudo[190653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:21 np0005541913.localdomain python3.9[190670]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:29:21 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:22 np0005541913.localdomain systemd-rc-local-generator[190870]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:22 np0005541913.localdomain systemd-sysv-generator[190874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13286 DF PROTO=TCP SPT=33814 DPT=9101 SEQ=674863162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794BBB00000000001030307) 
Dec 02 09:29:23 np0005541913.localdomain sudo[190653]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:23 np0005541913.localdomain sudo[191576]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-almajoydpbyikkosklmxlyewbtcyohhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667763.5251408-990-126268864254314/AnsiballZ_systemd.py
Dec 02 09:29:23 np0005541913.localdomain sudo[191576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:24 np0005541913.localdomain python3.9[191590]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:24 np0005541913.localdomain systemd-sysv-generator[191765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:24 np0005541913.localdomain systemd-rc-local-generator[191757]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:24 np0005541913.localdomain sudo[191576]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:25 np0005541913.localdomain sudo[192141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dppngztyybyyxokqujcczjirdpqjapvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667764.7203522-1077-79308416835090/AnsiballZ_systemd.py
Dec 02 09:29:25 np0005541913.localdomain sudo[192141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13288 DF PROTO=TCP SPT=33814 DPT=9101 SEQ=674863162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794C7A40000000001030307) 
Dec 02 09:29:25 np0005541913.localdomain python3.9[192158]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:25 np0005541913.localdomain systemd-rc-local-generator[192356]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:25 np0005541913.localdomain systemd-sysv-generator[192362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:25 np0005541913.localdomain sudo[192141]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:26 np0005541913.localdomain sudo[192758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzgneizuvcxzqerxpwljqozpdwylwsgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667765.8420565-1077-205069600133514/AnsiballZ_systemd.py
Dec 02 09:29:26 np0005541913.localdomain sudo[192758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:26 np0005541913.localdomain python3.9[192776]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:26 np0005541913.localdomain systemd-rc-local-generator[193004]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:26 np0005541913.localdomain systemd-sysv-generator[193007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:26 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:26 np0005541913.localdomain sudo[192758]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:27 np0005541913.localdomain sudo[193359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmaxzyvbzztqdwjsocxwolcvzxksimae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667766.9924664-1077-109136753408937/AnsiballZ_systemd.py
Dec 02 09:29:27 np0005541913.localdomain sudo[193359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:27 np0005541913.localdomain python3.9[193379]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:27 np0005541913.localdomain systemd-rc-local-generator[193590]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:27 np0005541913.localdomain systemd-sysv-generator[193597]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:27 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:28 np0005541913.localdomain sudo[193359]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:28 np0005541913.localdomain sudo[193891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuzbzoqzpqtxjcxnpisnskmidvqmzwny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667768.1431417-1077-126123210369088/AnsiballZ_systemd.py
Dec 02 09:29:28 np0005541913.localdomain sudo[193891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:28 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 09:29:28 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 09:29:28 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Consumed 16.483s CPU time.
Dec 02 09:29:28 np0005541913.localdomain systemd[1]: run-r6512aaa9a49947a7bce575053b2d2eb3.service: Deactivated successfully.
Dec 02 09:29:28 np0005541913.localdomain systemd[1]: run-r1080bb83e45e428ba54a0498c9e579da.service: Deactivated successfully.
Dec 02 09:29:28 np0005541913.localdomain python3.9[193916]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:28 np0005541913.localdomain sudo[193891]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:29 np0005541913.localdomain sudo[194060]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vidsfhzhpjgmpnakowwfapduhmvdoope ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667768.945701-1077-80617360842376/AnsiballZ_systemd.py
Dec 02 09:29:29 np0005541913.localdomain sudo[194060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13289 DF PROTO=TCP SPT=33814 DPT=9101 SEQ=674863162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794D7640000000001030307) 
Dec 02 09:29:29 np0005541913.localdomain python3.9[194062]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:30 np0005541913.localdomain systemd-rc-local-generator[194094]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:30 np0005541913.localdomain systemd-sysv-generator[194098]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:30 np0005541913.localdomain sudo[194060]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:29:31 np0005541913.localdomain systemd[1]: tmp-crun.WhjzPF.mount: Deactivated successfully.
Dec 02 09:29:31 np0005541913.localdomain podman[194120]: 2025-12-02 09:29:31.474129599 +0000 UTC m=+0.105703822 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:29:31 np0005541913.localdomain podman[194120]: 2025-12-02 09:29:31.544738467 +0000 UTC m=+0.176312730 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:29:31 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:29:32 np0005541913.localdomain sudo[194237]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhhfnppohxycdfgwmxbjjuqcdjwsuavv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667772.0473857-1185-220120688430991/AnsiballZ_systemd.py
Dec 02 09:29:32 np0005541913.localdomain sudo[194237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:32 np0005541913.localdomain python3.9[194239]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:29:32 np0005541913.localdomain systemd-rc-local-generator[194264]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:29:32 np0005541913.localdomain systemd-sysv-generator[194269]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:32 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:29:33 np0005541913.localdomain sudo[194237]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38737 DF PROTO=TCP SPT=45840 DPT=9102 SEQ=2720412911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794E96D0000000001030307) 
Dec 02 09:29:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64749 DF PROTO=TCP SPT=51178 DPT=9105 SEQ=986500614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794E9EE0000000001030307) 
Dec 02 09:29:34 np0005541913.localdomain sudo[194386]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzlilbgzpdtkkjrqxcerlmblfdognzud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667774.2763205-1209-163034637098230/AnsiballZ_systemd.py
Dec 02 09:29:34 np0005541913.localdomain sudo[194386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:34 np0005541913.localdomain python3.9[194388]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:34 np0005541913.localdomain sudo[194386]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:35 np0005541913.localdomain sudo[194499]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdvjliozwjuhixnyhecuxpatftluybqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667775.0640156-1209-150083254126281/AnsiballZ_systemd.py
Dec 02 09:29:35 np0005541913.localdomain sudo[194499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:35 np0005541913.localdomain python3.9[194501]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:35 np0005541913.localdomain sudo[194499]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:36 np0005541913.localdomain sudo[194612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvlkiofnqaquanppocxlgcaxrneagbai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667775.8730063-1209-258744646643089/AnsiballZ_systemd.py
Dec 02 09:29:36 np0005541913.localdomain sudo[194612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:36 np0005541913.localdomain python3.9[194614]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:29:36 np0005541913.localdomain sudo[194612]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:36 np0005541913.localdomain podman[194616]: 2025-12-02 09:29:36.587436046 +0000 UTC m=+0.096971930 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:29:36 np0005541913.localdomain podman[194616]: 2025-12-02 09:29:36.62256069 +0000 UTC m=+0.132096534 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 09:29:36 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:29:36 np0005541913.localdomain sudo[194741]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phzpuukaqxwoarwmdzlkxlpzvdmyaxcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667776.6820862-1209-275068616640173/AnsiballZ_systemd.py
Dec 02 09:29:36 np0005541913.localdomain sudo[194741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38739 DF PROTO=TCP SPT=45840 DPT=9102 SEQ=2720412911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4794F5650000000001030307) 
Dec 02 09:29:37 np0005541913.localdomain python3.9[194743]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:37 np0005541913.localdomain sudo[194741]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:37 np0005541913.localdomain sudo[194854]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzzexldlcrqchdernsemfwuoqxspgjjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667777.521006-1209-47908283991143/AnsiballZ_systemd.py
Dec 02 09:29:37 np0005541913.localdomain sudo[194854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:38 np0005541913.localdomain python3.9[194856]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:38 np0005541913.localdomain sudo[194854]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:38 np0005541913.localdomain sudo[194967]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptdhzgcgrshojknzpudxvppjpsxftxyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667778.3531938-1209-65933182169167/AnsiballZ_systemd.py
Dec 02 09:29:38 np0005541913.localdomain sudo[194967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:38 np0005541913.localdomain python3.9[194969]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51120 DF PROTO=TCP SPT=59094 DPT=9100 SEQ=827855319 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479501640000000001030307) 
Dec 02 09:29:40 np0005541913.localdomain sudo[194967]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:40 np0005541913.localdomain sudo[195080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgrdlchnpnpolozglldmbtqgnlfczkjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667780.2122948-1209-15412585601384/AnsiballZ_systemd.py
Dec 02 09:29:40 np0005541913.localdomain sudo[195080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:40 np0005541913.localdomain python3.9[195082]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:40 np0005541913.localdomain sudo[195080]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:41 np0005541913.localdomain sudo[195193]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjihgzsiyfbychenzlgvdjvxzodudmug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667781.5940053-1209-242845141928836/AnsiballZ_systemd.py
Dec 02 09:29:41 np0005541913.localdomain sudo[195193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:42 np0005541913.localdomain python3.9[195195]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:42 np0005541913.localdomain sudo[195193]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:42 np0005541913.localdomain sudo[195306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eoizyruiwobqcsiiwgdiltmcpvwkozhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667782.34885-1209-176655089056994/AnsiballZ_systemd.py
Dec 02 09:29:42 np0005541913.localdomain sudo[195306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:43 np0005541913.localdomain python3.9[195308]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:43 np0005541913.localdomain sudo[195306]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23261 DF PROTO=TCP SPT=59448 DPT=9882 SEQ=1754111265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47950D650000000001030307) 
Dec 02 09:29:43 np0005541913.localdomain sudo[195419]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxyuxkzbehtnvhetqnlrkfiktozatwrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667783.5728624-1209-88232814369085/AnsiballZ_systemd.py
Dec 02 09:29:43 np0005541913.localdomain sudo[195419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:44 np0005541913.localdomain python3.9[195421]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:44 np0005541913.localdomain sudo[195419]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:44 np0005541913.localdomain sudo[195532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrzzcwxkdpoqmgkhlchmjlqxyzxctwjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667784.3047538-1209-223801259287838/AnsiballZ_systemd.py
Dec 02 09:29:44 np0005541913.localdomain sudo[195532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:44 np0005541913.localdomain python3.9[195534]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:44 np0005541913.localdomain sudo[195532]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:45 np0005541913.localdomain sudo[195645]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grucpsdgqhhthzbqdlvrkpvjydinroin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667785.0336878-1209-115220737359580/AnsiballZ_systemd.py
Dec 02 09:29:45 np0005541913.localdomain sudo[195645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:45 np0005541913.localdomain python3.9[195647]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:45 np0005541913.localdomain sudo[195645]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:46 np0005541913.localdomain sudo[195758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utovsopsrjsevfrxejzzcabbfuzwcnad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667785.798991-1209-43079322417555/AnsiballZ_systemd.py
Dec 02 09:29:46 np0005541913.localdomain sudo[195758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51122 DF PROTO=TCP SPT=59094 DPT=9100 SEQ=827855319 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479519250000000001030307) 
Dec 02 09:29:46 np0005541913.localdomain python3.9[195760]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:46 np0005541913.localdomain sudo[195758]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:46 np0005541913.localdomain sudo[195871]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igreouhgicjmyukihtycdjiowdkdeeaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667786.5901706-1209-248346057728018/AnsiballZ_systemd.py
Dec 02 09:29:46 np0005541913.localdomain sudo[195871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:47 np0005541913.localdomain python3.9[195873]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 09:29:47 np0005541913.localdomain sudo[195871]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38741 DF PROTO=TCP SPT=45840 DPT=9102 SEQ=2720412911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479525E40000000001030307) 
Dec 02 09:29:51 np0005541913.localdomain sudo[195984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opqcwtmnubwawsxtsgblzcvahnelebko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667791.5113835-1515-63736595877286/AnsiballZ_file.py
Dec 02 09:29:51 np0005541913.localdomain sudo[195984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:51 np0005541913.localdomain python3.9[195986]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:29:52 np0005541913.localdomain sudo[195984]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23545 DF PROTO=TCP SPT=43714 DPT=9101 SEQ=1325632831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479530E00000000001030307) 
Dec 02 09:29:52 np0005541913.localdomain sudo[196094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ponuhwfkzoirtsdineymjhwlzyuihgqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667792.133293-1515-211342527566867/AnsiballZ_file.py
Dec 02 09:29:52 np0005541913.localdomain sudo[196094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:53 np0005541913.localdomain python3.9[196096]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:29:53 np0005541913.localdomain sudo[196094]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:53 np0005541913.localdomain sudo[196204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkmjsfudghxwbbdquqgohsjhlgpcdeeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667793.3235571-1515-66278212699384/AnsiballZ_file.py
Dec 02 09:29:53 np0005541913.localdomain sudo[196204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:53 np0005541913.localdomain python3.9[196206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:29:53 np0005541913.localdomain sudo[196204]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:54 np0005541913.localdomain sudo[196314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veqzxffdnjycdzsymynsxjqfpliiujag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667793.8919122-1515-209163351224756/AnsiballZ_file.py
Dec 02 09:29:54 np0005541913.localdomain sudo[196314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:54 np0005541913.localdomain python3.9[196316]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:29:54 np0005541913.localdomain sudo[196314]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:55 np0005541913.localdomain sudo[196424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkdhqihigxnrbsdapqylxnltkjczyayh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667794.886913-1515-141336368386743/AnsiballZ_file.py
Dec 02 09:29:55 np0005541913.localdomain sudo[196424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23547 DF PROTO=TCP SPT=43714 DPT=9101 SEQ=1325632831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47953CE40000000001030307) 
Dec 02 09:29:55 np0005541913.localdomain python3.9[196426]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:29:55 np0005541913.localdomain sudo[196424]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:55 np0005541913.localdomain sudo[196534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcgupmlqsuirqtlkamewdwzrpabdbhjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667795.505384-1515-107718821315435/AnsiballZ_file.py
Dec 02 09:29:55 np0005541913.localdomain sudo[196534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:55 np0005541913.localdomain python3.9[196536]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:29:55 np0005541913.localdomain sudo[196534]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:56 np0005541913.localdomain sudo[196644]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdmfhzvoketrbtmwujjilrssxfdkgcrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667796.2832527-1644-176889572486854/AnsiballZ_stat.py
Dec 02 09:29:56 np0005541913.localdomain sudo[196644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:56 np0005541913.localdomain python3.9[196646]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:29:56 np0005541913.localdomain sudo[196644]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:57 np0005541913.localdomain sudo[196734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgtzhydnbjrpffcgmjzocjlxwfqwinsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667796.2832527-1644-176889572486854/AnsiballZ_copy.py
Dec 02 09:29:57 np0005541913.localdomain sudo[196734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:57 np0005541913.localdomain python3.9[196736]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764667796.2832527-1644-176889572486854/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:29:57 np0005541913.localdomain sudo[196734]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:58 np0005541913.localdomain sudo[196844]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwifdhgvvpngylylycbziiobsagoylai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667797.7394698-1644-203025047166554/AnsiballZ_stat.py
Dec 02 09:29:58 np0005541913.localdomain sudo[196844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:58 np0005541913.localdomain python3.9[196846]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:29:58 np0005541913.localdomain sudo[196844]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:58 np0005541913.localdomain sudo[196934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvradssapepmbzgwrofddfzzettdmila ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667797.7394698-1644-203025047166554/AnsiballZ_copy.py
Dec 02 09:29:58 np0005541913.localdomain sudo[196934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:58 np0005541913.localdomain python3.9[196936]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764667797.7394698-1644-203025047166554/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:29:58 np0005541913.localdomain sudo[196934]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:59 np0005541913.localdomain sudo[197044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkxjbvgnxmvvvrsgzjjiaicjhuemtwov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667798.9228027-1644-158592085148154/AnsiballZ_stat.py
Dec 02 09:29:59 np0005541913.localdomain sudo[197044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23548 DF PROTO=TCP SPT=43714 DPT=9101 SEQ=1325632831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47954CA40000000001030307) 
Dec 02 09:29:59 np0005541913.localdomain python3.9[197046]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:29:59 np0005541913.localdomain sudo[197044]: pam_unix(sudo:session): session closed for user root
Dec 02 09:29:59 np0005541913.localdomain sudo[197134]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxpiofborchlnwnawnfxenmqrvduaffh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667798.9228027-1644-158592085148154/AnsiballZ_copy.py
Dec 02 09:29:59 np0005541913.localdomain sudo[197134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:29:59 np0005541913.localdomain python3.9[197136]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764667798.9228027-1644-158592085148154/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:00 np0005541913.localdomain sudo[197134]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:00 np0005541913.localdomain sudo[197244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehvezepeypegrlzdncfvvtycfvnvylce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667800.1194727-1644-100293923166625/AnsiballZ_stat.py
Dec 02 09:30:00 np0005541913.localdomain sudo[197244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:00 np0005541913.localdomain python3.9[197246]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:00 np0005541913.localdomain sudo[197244]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:00 np0005541913.localdomain sudo[197334]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjripgoawvhmzrkmfqsqrzgifuerjbzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667800.1194727-1644-100293923166625/AnsiballZ_copy.py
Dec 02 09:30:00 np0005541913.localdomain sudo[197334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:01 np0005541913.localdomain python3.9[197336]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764667800.1194727-1644-100293923166625/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:01 np0005541913.localdomain sudo[197334]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:01 np0005541913.localdomain sudo[197444]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftdjfchccdkoxqkbblygqbbxmnilvcge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667801.316876-1644-70596669830048/AnsiballZ_stat.py
Dec 02 09:30:01 np0005541913.localdomain sudo[197444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:30:01 np0005541913.localdomain python3.9[197446]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:01 np0005541913.localdomain sudo[197444]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:01 np0005541913.localdomain podman[197447]: 2025-12-02 09:30:01.984392025 +0000 UTC m=+0.344048730 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 09:30:02 np0005541913.localdomain podman[197447]: 2025-12-02 09:30:02.028102568 +0000 UTC m=+0.387759303 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 09:30:02 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:30:02 np0005541913.localdomain sudo[197559]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okjzmoecflklqpnighlbytprvlxkbnen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667801.316876-1644-70596669830048/AnsiballZ_copy.py
Dec 02 09:30:02 np0005541913.localdomain sudo[197559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:02 np0005541913.localdomain python3.9[197561]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764667801.316876-1644-70596669830048/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:02 np0005541913.localdomain sudo[197559]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:02 np0005541913.localdomain sudo[197669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovliyyefokkcgarusmokornyjxkaflia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667802.5491602-1644-199823953317529/AnsiballZ_stat.py
Dec 02 09:30:02 np0005541913.localdomain sudo[197669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:30:03.008 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:30:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:30:03.009 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:30:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:30:03.010 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:30:03 np0005541913.localdomain python3.9[197671]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:03 np0005541913.localdomain sudo[197669]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:03 np0005541913.localdomain sudo[197759]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xktmeecngffaggbknytvubrtceoumpjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667802.5491602-1644-199823953317529/AnsiballZ_copy.py
Dec 02 09:30:03 np0005541913.localdomain sudo[197759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:03 np0005541913.localdomain python3.9[197761]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764667802.5491602-1644-199823953317529/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:03 np0005541913.localdomain sudo[197759]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20129 DF PROTO=TCP SPT=58394 DPT=9102 SEQ=1061814342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47955E9D0000000001030307) 
Dec 02 09:30:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12755 DF PROTO=TCP SPT=34426 DPT=9105 SEQ=4229317798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47955F1E0000000001030307) 
Dec 02 09:30:04 np0005541913.localdomain sudo[197869]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oaampbdmvsifztmrlvzellaeethszxxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667803.6481411-1644-79375034844437/AnsiballZ_stat.py
Dec 02 09:30:04 np0005541913.localdomain sudo[197869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:04 np0005541913.localdomain python3.9[197871]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:04 np0005541913.localdomain sudo[197869]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:04 np0005541913.localdomain sudo[197957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbnnslwncvtegaamfyjmdwdhfmpdvsqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667803.6481411-1644-79375034844437/AnsiballZ_copy.py
Dec 02 09:30:04 np0005541913.localdomain sudo[197957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:05 np0005541913.localdomain python3.9[197959]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764667803.6481411-1644-79375034844437/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:05 np0005541913.localdomain sudo[197957]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:05 np0005541913.localdomain sudo[198067]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzmhakyeiobfdwdeqqflixssasfymvvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667805.1673033-1644-70369463435257/AnsiballZ_stat.py
Dec 02 09:30:05 np0005541913.localdomain sudo[198067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:05 np0005541913.localdomain python3.9[198069]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:05 np0005541913.localdomain sudo[198067]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:06 np0005541913.localdomain sudo[198157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmhzkkpfjnhcccbskmhgbxbriyzwmvzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667805.1673033-1644-70369463435257/AnsiballZ_copy.py
Dec 02 09:30:06 np0005541913.localdomain sudo[198157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:06 np0005541913.localdomain python3.9[198159]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764667805.1673033-1644-70369463435257/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:06 np0005541913.localdomain sudo[198157]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20131 DF PROTO=TCP SPT=58394 DPT=9102 SEQ=1061814342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47956AA40000000001030307) 
Dec 02 09:30:07 np0005541913.localdomain sudo[198267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nozowkqinfapznziasvczcfqmlwkbtjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667807.035117-1986-35919626005270/AnsiballZ_file.py
Dec 02 09:30:07 np0005541913.localdomain sudo[198267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:30:07 np0005541913.localdomain podman[198270]: 2025-12-02 09:30:07.408977082 +0000 UTC m=+0.085134031 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:30:07 np0005541913.localdomain podman[198270]: 2025-12-02 09:30:07.443040777 +0000 UTC m=+0.119197746 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 02 09:30:07 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:30:07 np0005541913.localdomain python3.9[198269]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:07 np0005541913.localdomain sudo[198267]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:07 np0005541913.localdomain sudo[198393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djykrzglcabuokbxpuxkcgfftitbtgul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667807.7116523-2010-228312437779419/AnsiballZ_file.py
Dec 02 09:30:07 np0005541913.localdomain sudo[198393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:08 np0005541913.localdomain python3.9[198395]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:08 np0005541913.localdomain sudo[198393]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:08 np0005541913.localdomain sudo[198503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfsixifwlplimhqtldhkcpbzzlpsrtty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667808.251982-2010-132054553230579/AnsiballZ_file.py
Dec 02 09:30:08 np0005541913.localdomain sudo[198503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:08 np0005541913.localdomain python3.9[198505]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:08 np0005541913.localdomain sudo[198503]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:09 np0005541913.localdomain sudo[198613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emyqgwwrkxfukdnsnbmsdldcwkrpixbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667808.824769-2010-117439762768879/AnsiballZ_file.py
Dec 02 09:30:09 np0005541913.localdomain sudo[198613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:09 np0005541913.localdomain python3.9[198615]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:09 np0005541913.localdomain sudo[198613]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:09 np0005541913.localdomain sudo[198723]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fygqdswydcyvgtpqfkfkjtgfbtjmshtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667809.394023-2010-269490188996964/AnsiballZ_file.py
Dec 02 09:30:09 np0005541913.localdomain sudo[198723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:09 np0005541913.localdomain python3.9[198725]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:09 np0005541913.localdomain sudo[198723]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29080 DF PROTO=TCP SPT=47358 DPT=9100 SEQ=3869807480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479576A40000000001030307) 
Dec 02 09:30:10 np0005541913.localdomain sudo[198797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:30:10 np0005541913.localdomain sudo[198797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:30:10 np0005541913.localdomain sudo[198797]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:10 np0005541913.localdomain sudo[198832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 09:30:10 np0005541913.localdomain sudo[198832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:30:10 np0005541913.localdomain sudo[198868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrufjyqklvensgkstuhxsotfyjcwyhcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667809.983514-2010-101129890951959/AnsiballZ_file.py
Dec 02 09:30:10 np0005541913.localdomain sudo[198868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:10 np0005541913.localdomain python3.9[198871]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:10 np0005541913.localdomain sudo[198868]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:10 np0005541913.localdomain sudo[198832]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:10 np0005541913.localdomain sudo[198964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:30:10 np0005541913.localdomain sudo[198964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:30:10 np0005541913.localdomain sudo[198964]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:10 np0005541913.localdomain sudo[199032]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iabaminakkzcppxrdliadybuudvlbvjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667810.540039-2010-162074306194454/AnsiballZ_file.py
Dec 02 09:30:10 np0005541913.localdomain sudo[199032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:10 np0005541913.localdomain sudo[199007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:30:10 np0005541913.localdomain sudo[199007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:30:10 np0005541913.localdomain python3.9[199037]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:10 np0005541913.localdomain sudo[199032]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:11 np0005541913.localdomain sudo[199166]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybfhxsttsujwqjkldnaqjzdlkxxuqoqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667811.100652-2010-160393553727662/AnsiballZ_file.py
Dec 02 09:30:11 np0005541913.localdomain sudo[199166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:11 np0005541913.localdomain sudo[199007]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:11 np0005541913.localdomain python3.9[199176]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:11 np0005541913.localdomain sudo[199166]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:11 np0005541913.localdomain sudo[199288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-heznbyiykdcfcptvmstgqojapchvfddd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667811.6953042-2010-88331955945059/AnsiballZ_file.py
Dec 02 09:30:11 np0005541913.localdomain sudo[199288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:12 np0005541913.localdomain sudo[199291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:30:12 np0005541913.localdomain sudo[199291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:30:12 np0005541913.localdomain sudo[199291]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:12 np0005541913.localdomain python3.9[199290]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:12 np0005541913.localdomain sudo[199288]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:12 np0005541913.localdomain sudo[199416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgpkvqxlysbtufonyyxjskfzlkugptud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667812.3025057-2010-268619657789985/AnsiballZ_file.py
Dec 02 09:30:12 np0005541913.localdomain sudo[199416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:12 np0005541913.localdomain python3.9[199418]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:12 np0005541913.localdomain sudo[199416]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:12 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60022 DF PROTO=TCP SPT=35648 DPT=9100 SEQ=2819697767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479581E40000000001030307) 
Dec 02 09:30:13 np0005541913.localdomain sudo[199526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeivybqvjiafyuybfkqyunpvtueuognb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667812.8907523-2010-116078382884040/AnsiballZ_file.py
Dec 02 09:30:13 np0005541913.localdomain sudo[199526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:13 np0005541913.localdomain python3.9[199528]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:13 np0005541913.localdomain sudo[199526]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:13 np0005541913.localdomain sudo[199636]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scieiirkdyvoirkiosqmosdvkhnevpzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667813.4748313-2010-261394742512146/AnsiballZ_file.py
Dec 02 09:30:13 np0005541913.localdomain sudo[199636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:13 np0005541913.localdomain python3.9[199638]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:13 np0005541913.localdomain sudo[199636]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:14 np0005541913.localdomain sudo[199746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulqjrsthmecjljxaakknuwjjzxxmayyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667814.0468514-2010-70830647737784/AnsiballZ_file.py
Dec 02 09:30:14 np0005541913.localdomain sudo[199746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:15 np0005541913.localdomain python3.9[199748]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:15 np0005541913.localdomain sudo[199746]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:15 np0005541913.localdomain sudo[199856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulesvxvfqgqlytemaposgipdxlcifsnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667815.1301413-2010-165725936709330/AnsiballZ_file.py
Dec 02 09:30:15 np0005541913.localdomain sudo[199856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:15 np0005541913.localdomain python3.9[199858]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:15 np0005541913.localdomain sudo[199856]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:15 np0005541913.localdomain sudo[199966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brluqjuntbevztlfpbaclwjjyumwyfsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667815.7186143-2010-20108447145272/AnsiballZ_file.py
Dec 02 09:30:15 np0005541913.localdomain sudo[199966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:16 np0005541913.localdomain python3.9[199968]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29082 DF PROTO=TCP SPT=47358 DPT=9100 SEQ=3869807480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47958E640000000001030307) 
Dec 02 09:30:16 np0005541913.localdomain sudo[199966]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:18 np0005541913.localdomain sudo[200076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfagwysxqhwwyokvzquenshtsngvrdvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667818.0274138-2307-91523881032296/AnsiballZ_stat.py
Dec 02 09:30:18 np0005541913.localdomain sudo[200076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:18 np0005541913.localdomain python3.9[200078]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:18 np0005541913.localdomain sudo[200076]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:18 np0005541913.localdomain sudo[200164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbhstyrijsxzbsyaulighuzwclxqyobf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667818.0274138-2307-91523881032296/AnsiballZ_copy.py
Dec 02 09:30:18 np0005541913.localdomain sudo[200164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:19 np0005541913.localdomain python3.9[200166]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667818.0274138-2307-91523881032296/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:19 np0005541913.localdomain sudo[200164]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20133 DF PROTO=TCP SPT=58394 DPT=9102 SEQ=1061814342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479599E40000000001030307) 
Dec 02 09:30:19 np0005541913.localdomain sudo[200274]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mimdpintdcqnlnbhmmtjkaxedbnvmglu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667819.173797-2307-237584841824311/AnsiballZ_stat.py
Dec 02 09:30:19 np0005541913.localdomain sudo[200274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:19 np0005541913.localdomain python3.9[200276]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:19 np0005541913.localdomain sudo[200274]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:19 np0005541913.localdomain sudo[200362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkylfljwmhsviqtrmjnawpzpcpdkirrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667819.173797-2307-237584841824311/AnsiballZ_copy.py
Dec 02 09:30:19 np0005541913.localdomain sudo[200362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:20 np0005541913.localdomain python3.9[200364]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667819.173797-2307-237584841824311/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:20 np0005541913.localdomain sudo[200362]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:20 np0005541913.localdomain sudo[200472]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owzdhqsdkdltfuvycuozuszwwawqymyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667820.2755363-2307-196115417920863/AnsiballZ_stat.py
Dec 02 09:30:20 np0005541913.localdomain sudo[200472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:20 np0005541913.localdomain python3.9[200474]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:20 np0005541913.localdomain sudo[200472]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:20 np0005541913.localdomain sudo[200560]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-advfgemrizcyrimujoypbtaldrvqiahn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667820.2755363-2307-196115417920863/AnsiballZ_copy.py
Dec 02 09:30:20 np0005541913.localdomain sudo[200560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:21 np0005541913.localdomain python3.9[200562]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667820.2755363-2307-196115417920863/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:21 np0005541913.localdomain sudo[200560]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:21 np0005541913.localdomain sudo[200670]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltmyibglucwzxwhowdtiylidvmfhfcty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667821.3090246-2307-68072189585767/AnsiballZ_stat.py
Dec 02 09:30:21 np0005541913.localdomain sudo[200670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:21 np0005541913.localdomain python3.9[200672]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:21 np0005541913.localdomain sudo[200670]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:22 np0005541913.localdomain sudo[200758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srsnylrkvnlsedpsplpritrhmbmliqqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667821.3090246-2307-68072189585767/AnsiballZ_copy.py
Dec 02 09:30:22 np0005541913.localdomain sudo[200758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56915 DF PROTO=TCP SPT=47022 DPT=9101 SEQ=2300905295 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4795A60F0000000001030307) 
Dec 02 09:30:22 np0005541913.localdomain python3.9[200760]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667821.3090246-2307-68072189585767/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:22 np0005541913.localdomain sudo[200758]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:22 np0005541913.localdomain sudo[200868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvedmyyezuwnmlmfqyxkjelembmmekug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667822.5160499-2307-94609702568089/AnsiballZ_stat.py
Dec 02 09:30:22 np0005541913.localdomain sudo[200868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:22 np0005541913.localdomain python3.9[200870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:22 np0005541913.localdomain sudo[200868]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:23 np0005541913.localdomain sudo[200956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuetuhggchtqmhbgsoxzkrbdcvklokeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667822.5160499-2307-94609702568089/AnsiballZ_copy.py
Dec 02 09:30:23 np0005541913.localdomain sudo[200956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:23 np0005541913.localdomain python3.9[200958]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667822.5160499-2307-94609702568089/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:23 np0005541913.localdomain sudo[200956]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:23 np0005541913.localdomain sudo[201066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mevyojbnuazaaocfbgoubwaswmoajcna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667823.6544166-2307-58264029808066/AnsiballZ_stat.py
Dec 02 09:30:23 np0005541913.localdomain sudo[201066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:24 np0005541913.localdomain python3.9[201068]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:24 np0005541913.localdomain sudo[201066]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:24 np0005541913.localdomain sudo[201154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rinbckakiuhkuvemftspngkevyoqjnrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667823.6544166-2307-58264029808066/AnsiballZ_copy.py
Dec 02 09:30:24 np0005541913.localdomain sudo[201154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:24 np0005541913.localdomain python3.9[201156]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667823.6544166-2307-58264029808066/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:24 np0005541913.localdomain sudo[201154]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:24 np0005541913.localdomain sudo[201264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raggowzuufvfthkvolikbtlprzkaijtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667824.6975358-2307-127646501766313/AnsiballZ_stat.py
Dec 02 09:30:24 np0005541913.localdomain sudo[201264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:25 np0005541913.localdomain python3.9[201266]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:25 np0005541913.localdomain sudo[201264]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56917 DF PROTO=TCP SPT=47022 DPT=9101 SEQ=2300905295 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4795B2250000000001030307) 
Dec 02 09:30:25 np0005541913.localdomain sudo[201352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdzerkzbkjfagrlvbrkaxgaaagrgyyuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667824.6975358-2307-127646501766313/AnsiballZ_copy.py
Dec 02 09:30:25 np0005541913.localdomain sudo[201352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:25 np0005541913.localdomain python3.9[201354]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667824.6975358-2307-127646501766313/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:25 np0005541913.localdomain sudo[201352]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:26 np0005541913.localdomain sudo[201462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szckbwednqejdiyateumadpzkzbdkdhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667826.0708952-2307-31364897240294/AnsiballZ_stat.py
Dec 02 09:30:26 np0005541913.localdomain sudo[201462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:26 np0005541913.localdomain python3.9[201464]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:26 np0005541913.localdomain sudo[201462]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:26 np0005541913.localdomain sudo[201550]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rycukpqmglveuqgbdviktagrfsalrbfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667826.0708952-2307-31364897240294/AnsiballZ_copy.py
Dec 02 09:30:26 np0005541913.localdomain sudo[201550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:27 np0005541913.localdomain python3.9[201552]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667826.0708952-2307-31364897240294/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:27 np0005541913.localdomain sudo[201550]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:27 np0005541913.localdomain sudo[201660]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tewdndbcocubaufcjxoonecsfilgdzod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667827.1615138-2307-231653903432193/AnsiballZ_stat.py
Dec 02 09:30:27 np0005541913.localdomain sudo[201660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:27 np0005541913.localdomain python3.9[201662]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:27 np0005541913.localdomain sudo[201660]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:27 np0005541913.localdomain sudo[201748]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxrzeoypxstsevjbemmfzfshkicmwtoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667827.1615138-2307-231653903432193/AnsiballZ_copy.py
Dec 02 09:30:27 np0005541913.localdomain sudo[201748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:28 np0005541913.localdomain python3.9[201750]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667827.1615138-2307-231653903432193/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:28 np0005541913.localdomain sudo[201748]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:28 np0005541913.localdomain sudo[201858]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsyyhfqgjidkqzmhfaqwgipvbdkrrqxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667828.2851064-2307-188761305600476/AnsiballZ_stat.py
Dec 02 09:30:28 np0005541913.localdomain sudo[201858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:28 np0005541913.localdomain python3.9[201860]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:28 np0005541913.localdomain sudo[201858]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:29 np0005541913.localdomain sudo[201946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcdemlcslvvhmpywjznjxaktlipopgtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667828.2851064-2307-188761305600476/AnsiballZ_copy.py
Dec 02 09:30:29 np0005541913.localdomain sudo[201946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:29 np0005541913.localdomain python3.9[201948]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667828.2851064-2307-188761305600476/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:29 np0005541913.localdomain sudo[201946]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56918 DF PROTO=TCP SPT=47022 DPT=9101 SEQ=2300905295 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4795C1E40000000001030307) 
Dec 02 09:30:29 np0005541913.localdomain sudo[202056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bldjfcjxesedvfbvqipmmatnnownunum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667829.3924565-2307-159348590513793/AnsiballZ_stat.py
Dec 02 09:30:29 np0005541913.localdomain sudo[202056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:29 np0005541913.localdomain python3.9[202058]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:29 np0005541913.localdomain sudo[202056]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:30 np0005541913.localdomain sudo[202144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iynnffebcnjxbmbkjzmmyiikzgblezsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667829.3924565-2307-159348590513793/AnsiballZ_copy.py
Dec 02 09:30:30 np0005541913.localdomain sudo[202144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:30 np0005541913.localdomain python3.9[202146]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667829.3924565-2307-159348590513793/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:30 np0005541913.localdomain sudo[202144]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:30 np0005541913.localdomain sudo[202254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnsztzvvihryrlhqpramxzcwggsocztg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667830.4346-2307-34396234748468/AnsiballZ_stat.py
Dec 02 09:30:30 np0005541913.localdomain sudo[202254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:30 np0005541913.localdomain python3.9[202256]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:30 np0005541913.localdomain sudo[202254]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:31 np0005541913.localdomain sudo[202342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byzuejmthvbmhvyibtmntfpmpvsgctlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667830.4346-2307-34396234748468/AnsiballZ_copy.py
Dec 02 09:30:31 np0005541913.localdomain sudo[202342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:31 np0005541913.localdomain python3.9[202344]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667830.4346-2307-34396234748468/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:31 np0005541913.localdomain sudo[202342]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:31 np0005541913.localdomain sudo[202452]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jojwilgvlsxgggkwwpqdinjppsxxsgsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667831.519853-2307-48793818711791/AnsiballZ_stat.py
Dec 02 09:30:31 np0005541913.localdomain sudo[202452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:31 np0005541913.localdomain python3.9[202454]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:31 np0005541913.localdomain sudo[202452]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:32 np0005541913.localdomain sudo[202540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-meephyroqndyrdgxevsaiyxucrdclzmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667831.519853-2307-48793818711791/AnsiballZ_copy.py
Dec 02 09:30:32 np0005541913.localdomain sudo[202540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:30:32 np0005541913.localdomain podman[202543]: 2025-12-02 09:30:32.404740292 +0000 UTC m=+0.091162825 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 09:30:32 np0005541913.localdomain podman[202543]: 2025-12-02 09:30:32.445247805 +0000 UTC m=+0.131670328 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:30:32 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:30:32 np0005541913.localdomain python3.9[202542]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667831.519853-2307-48793818711791/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:32 np0005541913.localdomain sudo[202540]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:32 np0005541913.localdomain sudo[202674]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wytnoutxrdmemrktuljpapkugztorves ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667832.62407-2307-135438256528916/AnsiballZ_stat.py
Dec 02 09:30:32 np0005541913.localdomain sudo[202674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:33 np0005541913.localdomain python3.9[202676]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:33 np0005541913.localdomain sudo[202674]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:33 np0005541913.localdomain sudo[202762]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgymrmvpmxnnuqicbxenphumgnzxxkga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667832.62407-2307-135438256528916/AnsiballZ_copy.py
Dec 02 09:30:33 np0005541913.localdomain sudo[202762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:33 np0005541913.localdomain python3.9[202764]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667832.62407-2307-135438256528916/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:33 np0005541913.localdomain sudo[202762]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61452 DF PROTO=TCP SPT=51642 DPT=9102 SEQ=4027416656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4795D3CD0000000001030307) 
Dec 02 09:30:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60140 DF PROTO=TCP SPT=46802 DPT=9105 SEQ=1932042984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4795D44E0000000001030307) 
Dec 02 09:30:34 np0005541913.localdomain python3.9[202872]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:30:35 np0005541913.localdomain sudo[202983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcwpelcvkhnksdscntfvyqkwmtsddjbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667834.8114598-2925-167759040329244/AnsiballZ_seboolean.py
Dec 02 09:30:35 np0005541913.localdomain sudo[202983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:35 np0005541913.localdomain python3.9[202985]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 02 09:30:35 np0005541913.localdomain sudo[202983]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:36 np0005541913.localdomain sudo[203093]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyhyxzbjuennlfafycryvnkuhpqmzmyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667836.1021369-2955-195846554452937/AnsiballZ_systemd.py
Dec 02 09:30:36 np0005541913.localdomain sudo[203093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:36 np0005541913.localdomain python3.9[203095]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:30:36 np0005541913.localdomain systemd-rc-local-generator[203124]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:30:36 np0005541913.localdomain systemd-sysv-generator[203128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61454 DF PROTO=TCP SPT=51642 DPT=9102 SEQ=4027416656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4795DFE40000000001030307) 
Dec 02 09:30:37 np0005541913.localdomain systemd[1]: Starting libvirt logging daemon socket...
Dec 02 09:30:37 np0005541913.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Dec 02 09:30:37 np0005541913.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Dec 02 09:30:37 np0005541913.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 02 09:30:37 np0005541913.localdomain systemd[1]: Starting libvirt logging daemon...
Dec 02 09:30:37 np0005541913.localdomain systemd[1]: Started libvirt logging daemon.
Dec 02 09:30:37 np0005541913.localdomain sudo[203093]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:37 np0005541913.localdomain sudo[203246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epyaxwvribfsnxasmykvschruzuwwmfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667837.3683527-2955-32958877629679/AnsiballZ_systemd.py
Dec 02 09:30:37 np0005541913.localdomain sudo[203246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:30:37 np0005541913.localdomain podman[203248]: 2025-12-02 09:30:37.848428844 +0000 UTC m=+0.080827793 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 09:30:37 np0005541913.localdomain podman[203248]: 2025-12-02 09:30:37.860491612 +0000 UTC m=+0.092890551 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:30:37 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:30:38 np0005541913.localdomain python3.9[203249]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:30:38 np0005541913.localdomain systemd-sysv-generator[203291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:30:38 np0005541913.localdomain systemd-rc-local-generator[203284]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 02 09:30:38 np0005541913.localdomain systemd[1]: Started libvirt nodedev daemon.
Dec 02 09:30:38 np0005541913.localdomain sudo[203246]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:38 np0005541913.localdomain sudo[203439]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojckkmgcdjogakmbvtwbldohhfohvcow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667838.6193726-2955-211510448140651/AnsiballZ_systemd.py
Dec 02 09:30:38 np0005541913.localdomain sudo[203439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:39 np0005541913.localdomain python3.9[203441]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:30:39 np0005541913.localdomain systemd-sysv-generator[203471]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:30:39 np0005541913.localdomain systemd-rc-local-generator[203468]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Started libvirt proxy daemon.
Dec 02 09:30:39 np0005541913.localdomain sudo[203439]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:39 np0005541913.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 02 09:30:39 np0005541913.localdomain setroubleshoot[203478]: Deleting alert c62ace7d-fc71-492d-8738-6cc52b8f8f8f, it is allowed in current policy
Dec 02 09:30:40 np0005541913.localdomain sudo[203612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlliqqfdovcotvevpvrcelsfvmhgaluv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667839.757304-2955-8276693162005/AnsiballZ_systemd.py
Dec 02 09:30:40 np0005541913.localdomain sudo[203612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41808 DF PROTO=TCP SPT=50554 DPT=9100 SEQ=1535800412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4795EBA50000000001030307) 
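The `DROPPING:` kernel entries in this journal are netfilter LOG-target output with a custom log prefix. As a minimal sketch, assuming the space-separated `KEY=VALUE` layout these entries show, the interesting fields can be pulled out like this (the sample line is an abridged copy of the entry above):

```python
# Parse a netfilter LOG-target kernel message of the kind shown above.
# Assumption: fields after the "DROPPING:" prefix are space-separated
# KEY=VALUE tokens; bare flags like SYN or DF carry no "=" and are skipped.

def parse_nf_log(line: str) -> dict:
    """Return the KEY=VALUE fields from a kernel LOG-target message."""
    _, _, rest = line.partition("DROPPING:")
    fields = {}
    for token in rest.split():
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value  # empty string for bare keys like OUT=
    return fields

sample = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 "
          "SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TTL=62 "
          "PROTO=TCP SPT=50554 DPT=9100 SYN")
fields = parse_nf_log(sample)
print(fields["SRC"], "->", fields["DST"], "dport", fields["DPT"])
```

Grouping these by `DPT` quickly shows which scrape ports (9100, 9101, 9105, 9882 in this log) the firewall is rejecting.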
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Dec 02 09:30:40 np0005541913.localdomain python3.9[203618]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:30:40 np0005541913.localdomain systemd-rc-local-generator[203646]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:30:40 np0005541913.localdomain systemd-sysv-generator[203649]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 02 09:30:40 np0005541913.localdomain systemd[1]: Started libvirt QEMU daemon.
Dec 02 09:30:40 np0005541913.localdomain sudo[203612]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:40 np0005541913.localdomain setroubleshoot[203478]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a3e9145b-2d8e-4e66-ba12-5632331a74ce
Dec 02 09:30:40 np0005541913.localdomain setroubleshoot[203478]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
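The remediation steps above filter raw AVC records with `ausearch`. As a sketch of what those records contain and how one might pick out the denied capability programmatically, here is a small Python parser; the sample AVC record is hypothetical, shaped like the `dac_read_search` denial setroubleshoot reports, not taken from this host's audit log:

```python
import re

# Parse an SELinux AVC denial record into its key fields.
# Assumption: the record follows the usual auditd AVC layout
# "avc: denied { perms } for key=value key=value ...".

AVC_RE = re.compile(r"avc:\s+denied\s+\{ (?P<perms>[^}]+)\} for\s+(?P<rest>.*)")

def parse_avc(record: str) -> dict:
    m = AVC_RE.search(record)
    if not m:
        raise ValueError("not an AVC denial record")
    out = {"permissions": m.group("perms").split()}
    for key, value in re.findall(r"(\w+)=(\S+)", m.group("rest")):
        out[key] = value.strip('"')
    return out

# Hypothetical record matching the virtlogd denial described above.
sample_avc = ('type=AVC msg=audit(1764667840.123:456): avc:  denied  '
              '{ dac_read_search } for  pid=1234 comm="virtlogd" '
              'capability=2 scontext=system_u:system_r:virtlogd_t:s0 '
              'tcontext=system_u:system_r:virtlogd_t:s0 tclass=capability')
parsed = parse_avc(sample_avc)
```

A filter like this is only for triage; the `audit2allow` pipeline suggested in the alert remains the supported way to turn denials into a local policy module.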
Dec 02 09:30:41 np0005541913.localdomain sudo[203802]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyjfhwdruqckqaxdvqvvdyrfowikfdqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667840.8849094-2955-104998497626475/AnsiballZ_systemd.py
Dec 02 09:30:41 np0005541913.localdomain sudo[203802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:41 np0005541913.localdomain python3.9[203804]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:30:41 np0005541913.localdomain systemd-rc-local-generator[203833]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:30:41 np0005541913.localdomain systemd-sysv-generator[203837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: Starting libvirt secret daemon socket...
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 02 09:30:41 np0005541913.localdomain systemd[1]: Started libvirt secret daemon.
Dec 02 09:30:41 np0005541913.localdomain sudo[203802]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40770 DF PROTO=TCP SPT=51764 DPT=9882 SEQ=1877533652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4795F7E40000000001030307) 
Dec 02 09:30:45 np0005541913.localdomain sudo[203985]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbjqigtcmnpxnmzuatygpxccerzjxmxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667844.7175336-3066-200667480292751/AnsiballZ_file.py
Dec 02 09:30:45 np0005541913.localdomain sudo[203985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:45 np0005541913.localdomain python3.9[203987]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:45 np0005541913.localdomain sudo[203985]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:45 np0005541913.localdomain sudo[204095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zictouszxgrzvpzjtrxjbvqxiyglerrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667845.4426408-3090-230346560719257/AnsiballZ_find.py
Dec 02 09:30:45 np0005541913.localdomain sudo[204095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:45 np0005541913.localdomain python3.9[204097]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:30:45 np0005541913.localdomain sudo[204095]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41810 DF PROTO=TCP SPT=50554 DPT=9100 SEQ=1535800412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479603650000000001030307) 
Dec 02 09:30:46 np0005541913.localdomain sudo[204205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukrjhcpudvrgksonitgijgqsbznybbhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667846.176506-3114-249902099465647/AnsiballZ_command.py
Dec 02 09:30:46 np0005541913.localdomain sudo[204205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:46 np0005541913.localdomain python3.9[204207]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:30:46 np0005541913.localdomain sudo[204205]: pam_unix(sudo:session): session closed for user root
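The `awk -F '=' '/fsid/ {print $2}'` pipeline above extracts the cluster fsid from ceph.conf, which is INI-style. An equivalent sketch with Python's configparser (the section name `global` and the sample content are assumptions based on the usual ceph.conf layout; the fsid value matches the one this playbook later passes to virsh):

```python
import configparser

# ceph.conf is INI-style; the fsid normally lives in the [global] section.
# Parsing from a string here for illustration; the playbook above reads
# /var/lib/openstack/config/ceph/ceph.conf on the remote host.

SAMPLE_CEPH_CONF = """\
[global]
fsid = c7c8e171-a193-56fb-95fa-8879fcfa7074
mon_host = 192.168.122.100
"""

def read_fsid(text: str) -> str:
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return cfg.get("global", "fsid")

fsid = read_fsid(SAMPLE_CEPH_CONF)
```

Unlike the awk one-liner, this only matches an `fsid` key in `[global]` rather than any line containing the substring `fsid`, which is slightly more robust against comments.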
Dec 02 09:30:47 np0005541913.localdomain python3.9[204319]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:30:48 np0005541913.localdomain python3.9[204427]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:49 np0005541913.localdomain python3.9[204513]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667848.0374932-3171-110454829948638/.source.xml follow=False _original_basename=secret.xml.j2 checksum=45e14b3898e47796a04e3213d8ff716cad2ef6d4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60144 DF PROTO=TCP SPT=46802 DPT=9105 SEQ=1932042984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47960FE40000000001030307) 
Dec 02 09:30:49 np0005541913.localdomain sudo[204621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhrytvzghnrswkamphsygcnwjhcieqlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667849.2526681-3216-193844876718620/AnsiballZ_command.py
Dec 02 09:30:49 np0005541913.localdomain sudo[204621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:49 np0005541913.localdomain python3.9[204623]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:30:49 np0005541913.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:204625:1001195 (system bus name :1.2856 [pkttyagent --process 204625 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 02 09:30:49 np0005541913.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:204625:1001195 (system bus name :1.2856, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 02 09:30:49 np0005541913.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:204624:1001195 (system bus name :1.2857 [pkttyagent --process 204624 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 02 09:30:49 np0005541913.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:204624:1001195 (system bus name :1.2857, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 02 09:30:49 np0005541913.localdomain sudo[204621]: pam_unix(sudo:session): session closed for user root
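The `virsh secret-define --file /tmp/secret.xml` step above consumes a libvirt `<secret>` document (templated from `secret.xml.j2` earlier in this run). A hedged sketch of building such a document for a ceph key with the standard library; the usage name is a hypothetical value for illustration, while the UUID matches the secret undefined and redefined above:

```python
import xml.etree.ElementTree as ET

# Build a libvirt <secret> document of the kind virsh secret-define
# consumes. The ephemeral/private attributes and the type="ceph" usage
# element follow the libvirt secret XML format; the usage name below is
# an assumed example, not taken from this deployment's template.

def build_ceph_secret(uuid: str, usage_name: str) -> str:
    secret = ET.Element("secret", ephemeral="no", private="no")
    ET.SubElement(secret, "uuid").text = uuid
    usage = ET.SubElement(secret, "usage", type="ceph")
    ET.SubElement(usage, "name").text = usage_name
    return ET.tostring(secret, encoding="unicode")

xml_doc = build_ceph_secret(
    "c7c8e171-a193-56fb-95fa-8879fcfa7074", "client.openstack secret")
```

After `secret-define`, the playbook presumably runs `virsh secret-set-value` with the `KEY=` base64 value visible in the sudo environment a few lines below, then removes `/tmp/secret.xml`.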
Dec 02 09:30:50 np0005541913.localdomain python3.9[204743]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:51 np0005541913.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Dec 02 09:30:51 np0005541913.localdomain sudo[204851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qytxlvfvwekagqngymfwgwuagcppguye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667850.7567756-3264-272397522629014/AnsiballZ_command.py
Dec 02 09:30:51 np0005541913.localdomain sudo[204851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:51 np0005541913.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 02 09:30:51 np0005541913.localdomain sudo[204851]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:51 np0005541913.localdomain sudo[204962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivzxmrejffvitzfcjolprgwruwaiodvn ; FSID=c7c8e171-a193-56fb-95fa-8879fcfa7074 KEY=AQCsmS5pAAAAABAA9iv/nZiAlLVhWrPIkulquw== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667851.4753485-3288-234502585073910/AnsiballZ_command.py
Dec 02 09:30:51 np0005541913.localdomain sudo[204962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:52 np0005541913.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:204965:1001433 (system bus name :1.2860 [pkttyagent --process 204965 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 02 09:30:52 np0005541913.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:204965:1001433 (system bus name :1.2860, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 02 09:30:52 np0005541913.localdomain sudo[204962]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35829 DF PROTO=TCP SPT=44798 DPT=9101 SEQ=1382640492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47961B400000000001030307) 
Dec 02 09:30:54 np0005541913.localdomain sudo[205078]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avjddhstrrxdrrzqxediybxpzpmmcywf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667853.3830695-3312-106372735000297/AnsiballZ_copy.py
Dec 02 09:30:54 np0005541913.localdomain sudo[205078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:54 np0005541913.localdomain python3.9[205080]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:54 np0005541913.localdomain sudo[205078]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:54 np0005541913.localdomain sudo[205188]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzdeimiyslfndcparpwdiucwsbqspodd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667854.646691-3336-142424303568259/AnsiballZ_stat.py
Dec 02 09:30:54 np0005541913.localdomain sudo[205188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:55 np0005541913.localdomain python3.9[205190]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:55 np0005541913.localdomain sudo[205188]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35831 DF PROTO=TCP SPT=44798 DPT=9101 SEQ=1382640492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479627640000000001030307) 
Dec 02 09:30:55 np0005541913.localdomain sudo[205276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qprfllxwjudefdxrflunhgpuaxmiydlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667854.646691-3336-142424303568259/AnsiballZ_copy.py
Dec 02 09:30:55 np0005541913.localdomain sudo[205276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:55 np0005541913.localdomain python3.9[205278]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667854.646691-3336-142424303568259/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:55 np0005541913.localdomain sudo[205276]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:57 np0005541913.localdomain sudo[205386]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihmkyxouevmxdsuiqfxfcotyhuelgptn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667856.7916038-3384-16199275116291/AnsiballZ_file.py
Dec 02 09:30:57 np0005541913.localdomain sudo[205386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:57 np0005541913.localdomain python3.9[205388]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:57 np0005541913.localdomain sudo[205386]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:57 np0005541913.localdomain sudo[205496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnlrdqssoojusgsgelpbxlhofsnfoibw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667857.499871-3408-138697265324618/AnsiballZ_stat.py
Dec 02 09:30:57 np0005541913.localdomain sudo[205496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:57 np0005541913.localdomain python3.9[205498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:58 np0005541913.localdomain sudo[205496]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:58 np0005541913.localdomain sudo[205553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpmuxrxraldoqpfmrbswxjakrurclkmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667857.499871-3408-138697265324618/AnsiballZ_file.py
Dec 02 09:30:58 np0005541913.localdomain sudo[205553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:58 np0005541913.localdomain python3.9[205555]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:58 np0005541913.localdomain sudo[205553]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:58 np0005541913.localdomain sudo[205663]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftuyudescpwaxsoejdqtylohilvsuswf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667858.6513886-3444-269868264852187/AnsiballZ_stat.py
Dec 02 09:30:58 np0005541913.localdomain sudo[205663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:59 np0005541913.localdomain python3.9[205665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:30:59 np0005541913.localdomain sudo[205663]: pam_unix(sudo:session): session closed for user root
Dec 02 09:30:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35832 DF PROTO=TCP SPT=44798 DPT=9101 SEQ=1382640492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479637240000000001030307) 
Dec 02 09:30:59 np0005541913.localdomain sudo[205720]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzkuzlrqevburrrcpdaaddlossxxifsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667858.6513886-3444-269868264852187/AnsiballZ_file.py
Dec 02 09:30:59 np0005541913.localdomain sudo[205720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:30:59 np0005541913.localdomain python3.9[205722]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.x0dw1ram recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:30:59 np0005541913.localdomain sudo[205720]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:00 np0005541913.localdomain sudo[205830]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbwamxrhctzbupokffxmgnjcvfvzjfrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667859.7567751-3480-220454409758310/AnsiballZ_stat.py
Dec 02 09:31:00 np0005541913.localdomain sudo[205830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:00 np0005541913.localdomain python3.9[205832]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:00 np0005541913.localdomain sudo[205830]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:00 np0005541913.localdomain sudo[205887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jazlhuiymyeiskjhsxhmkeddmhbkdybr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667859.7567751-3480-220454409758310/AnsiballZ_file.py
Dec 02 09:31:00 np0005541913.localdomain sudo[205887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:00 np0005541913.localdomain python3.9[205889]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:00 np0005541913.localdomain sudo[205887]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:01 np0005541913.localdomain sudo[205997]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snwngfazubnsbkaraoqokvnfepgidxev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667860.9548516-3519-81652949250752/AnsiballZ_command.py
Dec 02 09:31:01 np0005541913.localdomain sudo[205997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:01 np0005541913.localdomain python3.9[205999]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:31:01 np0005541913.localdomain sudo[205997]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:01 np0005541913.localdomain sudo[206108]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znoypukpkrwlblbnngqjlehmajjsklit ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764667861.6081183-3543-245994482757077/AnsiballZ_edpm_nftables_from_files.py
Dec 02 09:31:01 np0005541913.localdomain sudo[206108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:02 np0005541913.localdomain python3[206110]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 09:31:02 np0005541913.localdomain sudo[206108]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:02 np0005541913.localdomain sudo[206218]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vexyaokweybqlhuxumldkbnytogfkelq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667862.4033945-3568-277189534043937/AnsiballZ_stat.py
Dec 02 09:31:02 np0005541913.localdomain sudo[206218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:31:02 np0005541913.localdomain podman[206221]: 2025-12-02 09:31:02.843460576 +0000 UTC m=+0.089132549 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 09:31:02 np0005541913.localdomain podman[206221]: 2025-12-02 09:31:02.890193998 +0000 UTC m=+0.135866001 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:31:02 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:31:02 np0005541913.localdomain python3.9[206220]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:02 np0005541913.localdomain sudo[206218]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:31:03.009 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:31:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:31:03.010 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:31:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:31:03.011 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:31:03 np0005541913.localdomain sudo[206300]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oukzcaouhfwasryvfffainbiwzjgkgpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667862.4033945-3568-277189534043937/AnsiballZ_file.py
Dec 02 09:31:03 np0005541913.localdomain sudo[206300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:03 np0005541913.localdomain python3.9[206302]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:03 np0005541913.localdomain sudo[206300]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:03 np0005541913.localdomain sudo[206410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usmnctthtddohcolgmgdfxankxupmjoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667863.5580196-3604-183428546697866/AnsiballZ_stat.py
Dec 02 09:31:03 np0005541913.localdomain sudo[206410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4948 DF PROTO=TCP SPT=37648 DPT=9102 SEQ=72312645 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479648FD0000000001030307) 
Dec 02 09:31:04 np0005541913.localdomain python3.9[206412]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12827 DF PROTO=TCP SPT=59402 DPT=9105 SEQ=3264827158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796497F0000000001030307) 
Dec 02 09:31:04 np0005541913.localdomain sudo[206410]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:04 np0005541913.localdomain sudo[206467]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnlbhsafsxrevanxroqnidshzrfscuxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667863.5580196-3604-183428546697866/AnsiballZ_file.py
Dec 02 09:31:04 np0005541913.localdomain sudo[206467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:04 np0005541913.localdomain python3.9[206469]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:04 np0005541913.localdomain sudo[206467]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:05 np0005541913.localdomain sudo[206577]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljrdzsnkyznniugrpoablpkfplawwawk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667864.7664447-3639-192739425881725/AnsiballZ_stat.py
Dec 02 09:31:05 np0005541913.localdomain sudo[206577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:05 np0005541913.localdomain python3.9[206579]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:05 np0005541913.localdomain sudo[206577]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:05 np0005541913.localdomain sudo[206634]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjmgjnnfrtukbtedgryhfpaogrbhqbew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667864.7664447-3639-192739425881725/AnsiballZ_file.py
Dec 02 09:31:05 np0005541913.localdomain sudo[206634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:05 np0005541913.localdomain python3.9[206636]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:05 np0005541913.localdomain sudo[206634]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:06 np0005541913.localdomain sudo[206744]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbeejejljmmdnztxokvpcyuxxgsoqrra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667865.87635-3675-276607331022375/AnsiballZ_stat.py
Dec 02 09:31:06 np0005541913.localdomain sudo[206744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:06 np0005541913.localdomain python3.9[206746]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:06 np0005541913.localdomain sudo[206744]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:06 np0005541913.localdomain sudo[206801]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbyohpbqhfdnfsbxiekwsclajqcndssa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667865.87635-3675-276607331022375/AnsiballZ_file.py
Dec 02 09:31:06 np0005541913.localdomain sudo[206801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4950 DF PROTO=TCP SPT=37648 DPT=9102 SEQ=72312645 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479655240000000001030307) 
Dec 02 09:31:07 np0005541913.localdomain python3.9[206803]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:07 np0005541913.localdomain sudo[206801]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:07 np0005541913.localdomain sudo[206911]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtfrywtmivbbxrjdlubkhehepleqdghd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667867.3709314-3711-118293261624326/AnsiballZ_stat.py
Dec 02 09:31:07 np0005541913.localdomain sudo[206911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:07 np0005541913.localdomain python3.9[206913]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:07 np0005541913.localdomain sudo[206911]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:08 np0005541913.localdomain sudo[207001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfiveingrkcsdeedapaqqhipbzywimmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667867.3709314-3711-118293261624326/AnsiballZ_copy.py
Dec 02 09:31:08 np0005541913.localdomain sudo[207001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:31:08 np0005541913.localdomain podman[207004]: 2025-12-02 09:31:08.302188707 +0000 UTC m=+0.089592332 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 02 09:31:08 np0005541913.localdomain podman[207004]: 2025-12-02 09:31:08.333504569 +0000 UTC m=+0.120908184 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 09:31:08 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:31:08 np0005541913.localdomain python3.9[207003]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764667867.3709314-3711-118293261624326/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:08 np0005541913.localdomain sudo[207001]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:08 np0005541913.localdomain sudo[207129]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pppyenqcfwdckjfpeulrmyicgvysoywa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667868.708576-3756-257626132488252/AnsiballZ_file.py
Dec 02 09:31:08 np0005541913.localdomain sudo[207129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:09 np0005541913.localdomain python3.9[207131]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:09 np0005541913.localdomain sudo[207129]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:09 np0005541913.localdomain sudo[207239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pazcrvrmqfjnreqswzpddtxxkwaxlekz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667869.4032345-3780-177120486234030/AnsiballZ_command.py
Dec 02 09:31:09 np0005541913.localdomain sudo[207239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:09 np0005541913.localdomain python3.9[207241]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:31:09 np0005541913.localdomain sudo[207239]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61262 DF PROTO=TCP SPT=49304 DPT=9100 SEQ=1414406547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479660E40000000001030307) 
Dec 02 09:31:10 np0005541913.localdomain sudo[207352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-euovixftcbqiplkpjblpuzkptkpehiyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667870.4544911-3804-81303401947285/AnsiballZ_blockinfile.py
Dec 02 09:31:10 np0005541913.localdomain sudo[207352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:11 np0005541913.localdomain python3.9[207354]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:11 np0005541913.localdomain sudo[207352]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:11 np0005541913.localdomain sudo[207462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrwrznwapnmwvptgsfzejojoznxomykr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667871.4901974-3831-198578787875273/AnsiballZ_command.py
Dec 02 09:31:11 np0005541913.localdomain sudo[207462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:11 np0005541913.localdomain python3.9[207464]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:31:11 np0005541913.localdomain sudo[207462]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:12 np0005541913.localdomain sudo[207466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:31:12 np0005541913.localdomain sudo[207466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:31:12 np0005541913.localdomain sudo[207466]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:12 np0005541913.localdomain sudo[207484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:31:12 np0005541913.localdomain sudo[207484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:31:12 np0005541913.localdomain sudo[207620]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcbuczpupcnjvvewxtcichzuhngtarbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667872.4184453-3855-31333944052360/AnsiballZ_stat.py
Dec 02 09:31:12 np0005541913.localdomain sudo[207620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:12 np0005541913.localdomain python3.9[207623]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:31:12 np0005541913.localdomain sudo[207620]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:12 np0005541913.localdomain sudo[207484]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42393 DF PROTO=TCP SPT=43504 DPT=9882 SEQ=1227093900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47966D240000000001030307) 
Dec 02 09:31:13 np0005541913.localdomain sudo[207752]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvtbverxkwewntvafsupiinxbohbpmjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667873.1518633-3880-83303292517932/AnsiballZ_command.py
Dec 02 09:31:13 np0005541913.localdomain sudo[207752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:13 np0005541913.localdomain python3.9[207754]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:31:13 np0005541913.localdomain sudo[207752]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:13 np0005541913.localdomain sudo[207756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:31:13 np0005541913.localdomain sudo[207756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:31:13 np0005541913.localdomain sudo[207756]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:14 np0005541913.localdomain sudo[207883]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uytfimosekxwnarwuissqwfgbovrkxgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667873.8219523-3903-139732318911938/AnsiballZ_file.py
Dec 02 09:31:14 np0005541913.localdomain sudo[207883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:14 np0005541913.localdomain python3.9[207885]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:14 np0005541913.localdomain sudo[207883]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:14 np0005541913.localdomain sudo[207993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tadddctpxrpepzvtyowamhbjesmikfvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667874.5003877-3927-60140911227733/AnsiballZ_stat.py
Dec 02 09:31:14 np0005541913.localdomain sudo[207993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:15 np0005541913.localdomain python3.9[207995]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:15 np0005541913.localdomain sudo[207993]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:15 np0005541913.localdomain sudo[208081]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cljhslukttrrokoiblspyamyfdzohzfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667874.5003877-3927-60140911227733/AnsiballZ_copy.py
Dec 02 09:31:15 np0005541913.localdomain sudo[208081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:15 np0005541913.localdomain python3.9[208083]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667874.5003877-3927-60140911227733/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:15 np0005541913.localdomain sudo[208081]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61264 DF PROTO=TCP SPT=49304 DPT=9100 SEQ=1414406547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479678A40000000001030307) 
Dec 02 09:31:16 np0005541913.localdomain sudo[208191]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evrwhukgaaawehdvetjuvhsombfmimnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667875.881825-3972-145378283632900/AnsiballZ_stat.py
Dec 02 09:31:16 np0005541913.localdomain sudo[208191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:16 np0005541913.localdomain python3.9[208193]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:16 np0005541913.localdomain sudo[208191]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:16 np0005541913.localdomain sudo[208279]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcrzrcycdfismlbdkjrebiqancigccly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667875.881825-3972-145378283632900/AnsiballZ_copy.py
Dec 02 09:31:16 np0005541913.localdomain sudo[208279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:16 np0005541913.localdomain python3.9[208281]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667875.881825-3972-145378283632900/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:16 np0005541913.localdomain sudo[208279]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:17 np0005541913.localdomain sudo[208389]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztjcylzjrqzjvyxbnaewwjcntlmowyzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667877.1341233-4017-270469235903438/AnsiballZ_stat.py
Dec 02 09:31:17 np0005541913.localdomain sudo[208389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:17 np0005541913.localdomain python3.9[208391]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:31:17 np0005541913.localdomain sudo[208389]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:17 np0005541913.localdomain sudo[208477]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmlcbccxwtbuyhejlbbfftcqyaadvitp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667877.1341233-4017-270469235903438/AnsiballZ_copy.py
Dec 02 09:31:17 np0005541913.localdomain sudo[208477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:18 np0005541913.localdomain python3.9[208479]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667877.1341233-4017-270469235903438/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:18 np0005541913.localdomain sudo[208477]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:18 np0005541913.localdomain sudo[208587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epbtlowsiwpprvcfekhhqlqyuqcqqbng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667878.3241422-4062-118114955739430/AnsiballZ_systemd.py
Dec 02 09:31:18 np0005541913.localdomain sudo[208587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:18 np0005541913.localdomain python3.9[208589]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:31:18 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:31:18 np0005541913.localdomain systemd-sysv-generator[208617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:31:18 np0005541913.localdomain systemd-rc-local-generator[208613]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:19 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12831 DF PROTO=TCP SPT=59402 DPT=9105 SEQ=3264827158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479685E40000000001030307) 
Dec 02 09:31:20 np0005541913.localdomain systemd[1]: Reached target edpm_libvirt.target.
Dec 02 09:31:20 np0005541913.localdomain sudo[208587]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:21 np0005541913.localdomain sudo[208737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjtpmmwqyexhigsyclkxtipfqfyhpvzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667880.4212022-4086-76337245324059/AnsiballZ_systemd.py
Dec 02 09:31:21 np0005541913.localdomain sudo[208737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:21 np0005541913.localdomain python3.9[208739]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 09:31:21 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:31:21 np0005541913.localdomain systemd-rc-local-generator[208767]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:31:21 np0005541913.localdomain systemd-sysv-generator[208770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:31:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:31:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64496 DF PROTO=TCP SPT=50994 DPT=9101 SEQ=1619115592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479690700000000001030307) 
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:31:23 np0005541913.localdomain systemd-rc-local-generator[208800]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:31:23 np0005541913.localdomain systemd-sysv-generator[208804]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:23 np0005541913.localdomain sudo[208737]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:23 np0005541913.localdomain sshd[160376]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Dec 02 09:31:23 np0005541913.localdomain systemd[1]: session-53.scope: Consumed 3min 53.501s CPU time.
Dec 02 09:31:23 np0005541913.localdomain systemd-logind[757]: Session 53 logged out. Waiting for processes to exit.
Dec 02 09:31:23 np0005541913.localdomain systemd-logind[757]: Removed session 53.
Dec 02 09:31:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64498 DF PROTO=TCP SPT=50994 DPT=9101 SEQ=1619115592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47969C640000000001030307) 
Dec 02 09:31:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64499 DF PROTO=TCP SPT=50994 DPT=9101 SEQ=1619115592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796AC240000000001030307) 
Dec 02 09:31:29 np0005541913.localdomain sshd[208830]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:31:29 np0005541913.localdomain sshd[208830]: Accepted publickey for zuul from 192.168.122.30 port 41656 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:31:29 np0005541913.localdomain systemd-logind[757]: New session 54 of user zuul.
Dec 02 09:31:29 np0005541913.localdomain systemd[1]: Started Session 54 of User zuul.
Dec 02 09:31:29 np0005541913.localdomain sshd[208830]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:31:30 np0005541913.localdomain python3.9[208941]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:31:31 np0005541913.localdomain python3.9[209053]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:31:31 np0005541913.localdomain network[209070]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:31:31 np0005541913.localdomain network[209071]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:31:31 np0005541913.localdomain network[209072]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:31:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:31:33 np0005541913.localdomain systemd[1]: tmp-crun.4LaZ4V.mount: Deactivated successfully.
Dec 02 09:31:33 np0005541913.localdomain podman[209103]: 2025-12-02 09:31:33.06019302 +0000 UTC m=+0.094831282 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 09:31:33 np0005541913.localdomain podman[209103]: 2025-12-02 09:31:33.13070344 +0000 UTC m=+0.165341732 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 09:31:33 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:31:33 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:31:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38915 DF PROTO=TCP SPT=43762 DPT=9102 SEQ=1342219773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796BE2E0000000001030307) 
Dec 02 09:31:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11970 DF PROTO=TCP SPT=55084 DPT=9105 SEQ=4004772905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796BEAF0000000001030307) 
Dec 02 09:31:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38917 DF PROTO=TCP SPT=43762 DPT=9102 SEQ=1342219773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796CA250000000001030307) 
Dec 02 09:31:37 np0005541913.localdomain sudo[209330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byztlxakufyqigcpyisylchkbrbvgatk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667897.1358387-102-155730997933746/AnsiballZ_setup.py
Dec 02 09:31:37 np0005541913.localdomain sudo[209330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:37 np0005541913.localdomain python3.9[209332]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:31:38 np0005541913.localdomain sudo[209330]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:38 np0005541913.localdomain sudo[209393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kauayhbymrkehfynbjmqogjrxxqkbniy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667897.1358387-102-155730997933746/AnsiballZ_dnf.py
Dec 02 09:31:38 np0005541913.localdomain sudo[209393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:31:39 np0005541913.localdomain systemd[1]: tmp-crun.V4UZ8y.mount: Deactivated successfully.
Dec 02 09:31:39 np0005541913.localdomain podman[209396]: 2025-12-02 09:31:39.047083192 +0000 UTC m=+0.079758708 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 02 09:31:39 np0005541913.localdomain podman[209396]: 2025-12-02 09:31:39.05222942 +0000 UTC m=+0.084904976 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 09:31:39 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:31:39 np0005541913.localdomain python3.9[209395]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:31:39 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40773 DF PROTO=TCP SPT=51764 DPT=9882 SEQ=1877533652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796D5E40000000001030307) 
Dec 02 09:31:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41813 DF PROTO=TCP SPT=50554 DPT=9100 SEQ=1535800412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796E1E50000000001030307) 
Dec 02 09:31:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56853 DF PROTO=TCP SPT=59142 DPT=9100 SEQ=2753570109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796EDE40000000001030307) 
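The recurring kernel `DROPPING:` entries above are netfilter LOG-target output (the `DROPPING:` prefix comes from the rule's `--log-prefix`; fields such as `SRC`, `DST`, `SPT`, `DPT` are standard iptables/nftables LOG fields). A small hypothetical parser, for illustration only, can turn one of these lines into a dictionary:

```python
import re

def parse_nf_log(line: str) -> dict:
    """Extract KEY=value fields from a netfilter LOG-target line."""
    fields = {}
    for match in re.finditer(r"([A-Z]+)=(\S*)", line):
        key, value = match.groups()
        fields[key] = value  # OUT= with no value becomes an empty string
    return fields

sample = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 "
          "SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TTL=62 "
          "PROTO=TCP SPT=51764 DPT=9882 SYN")
parsed = parse_nf_log(sample)
print(parsed["DST"], parsed["DPT"])  # destination host and port of the dropped SYN
```

Grouping the parsed lines by `DPT` shows these drops all target monitoring ports (9100-9105, 9882) on 192.168.122.107.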
Dec 02 09:31:47 np0005541913.localdomain sudo[209393]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:47 np0005541913.localdomain sudo[209523]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbyqkolkjjiqdsfoeebuaekcgvqcedcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667907.1847723-138-3950847691967/AnsiballZ_stat.py
Dec 02 09:31:47 np0005541913.localdomain sudo[209523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:47 np0005541913.localdomain python3.9[209525]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:31:47 np0005541913.localdomain sudo[209523]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:49 np0005541913.localdomain sudo[209635]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itsjgoerrstdsinzgroxbwoerawwflkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667907.971957-162-84920186038445/AnsiballZ_copy.py
Dec 02 09:31:49 np0005541913.localdomain sudo[209635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11974 DF PROTO=TCP SPT=55084 DPT=9105 SEQ=4004772905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4796F9FD0000000001030307) 
Dec 02 09:31:49 np0005541913.localdomain python3.9[209637]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:31:49 np0005541913.localdomain sudo[209635]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:49 np0005541913.localdomain sudo[209745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdjrarrsgceldwfxxcnneqdbgiodpcmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667909.5354168-186-140999321938867/AnsiballZ_command.py
Dec 02 09:31:49 np0005541913.localdomain sudo[209745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:50 np0005541913.localdomain python3.9[209747]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:31:50 np0005541913.localdomain sudo[209745]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:51 np0005541913.localdomain sudo[209856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyhkcbgjfuyqmoramhhgrjciyfruucuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667910.3213427-210-3052997377942/AnsiballZ_command.py
Dec 02 09:31:51 np0005541913.localdomain sudo[209856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:51 np0005541913.localdomain python3.9[209858]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:31:51 np0005541913.localdomain sudo[209856]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:52 np0005541913.localdomain sudo[209967]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjubgdoigmfdsrwqzuggcsybeunbjkej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667911.7264118-234-272713118636265/AnsiballZ_command.py
Dec 02 09:31:52 np0005541913.localdomain sudo[209967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:52 np0005541913.localdomain python3.9[209969]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:31:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62804 DF PROTO=TCP SPT=37640 DPT=9101 SEQ=4112707506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797059F0000000001030307) 
Dec 02 09:31:52 np0005541913.localdomain sudo[209967]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:52 np0005541913.localdomain sudo[210078]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llbgdlxxczhdkobielvjlawhusfeglxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667912.4870272-261-222244591014131/AnsiballZ_stat.py
Dec 02 09:31:52 np0005541913.localdomain sudo[210078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:52 np0005541913.localdomain python3.9[210080]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:31:52 np0005541913.localdomain sudo[210078]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:53 np0005541913.localdomain sudo[210190]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqxatxzynxrbmlcoemculwtzzxrnbdqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667913.331542-294-235226588332079/AnsiballZ_lineinfile.py
Dec 02 09:31:53 np0005541913.localdomain sudo[210190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:53 np0005541913.localdomain python3.9[210192]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
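The `lineinfile` task above ensures exactly one `node.session.auth.chap_algs` line in `/etc/iscsi/iscsid.conf`: replace the first line matching `regexp` if present, otherwise insert after the first line matching `insertafter`. A minimal sketch of those semantics (not Ansible's actual implementation) looks like:

```python
import re

def lineinfile(lines, regexp, line, insertafter):
    """Replace the first regexp match with `line`, else insert after
    the first insertafter match, else append at EOF (lineinfile's
    default fallback)."""
    pat = re.compile(regexp)
    for i, existing in enumerate(lines):
        if pat.search(existing):
            lines[i] = line
            return lines
    anchor = re.compile(insertafter)
    for i, existing in enumerate(lines):
        if anchor.search(existing):
            lines.insert(i + 1, line)
            return lines
    lines.append(line)
    return lines

conf = ["#node.session.auth.chap.algs = SHA3-256,SHA256",
        "node.session.timeo.replacement_timeout = 120"]
lineinfile(conf,
           regexp=r"^node.session.auth.chap_algs",
           line="node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5",
           insertafter=r"^#node.session.auth.chap.algs")
```

Running the same call again replaces the line it just inserted, which is what makes the task idempotent across repeated Ansible runs.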
Dec 02 09:31:54 np0005541913.localdomain sudo[210190]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:54 np0005541913.localdomain sudo[210300]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhlrugwgtatahbojgtarikojfbtqvxlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667914.2951937-321-137475928982845/AnsiballZ_systemd_service.py
Dec 02 09:31:54 np0005541913.localdomain sudo[210300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:55 np0005541913.localdomain python3.9[210302]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:31:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62806 DF PROTO=TCP SPT=37640 DPT=9101 SEQ=4112707506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479711A40000000001030307) 
Dec 02 09:31:55 np0005541913.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 02 09:31:55 np0005541913.localdomain sudo[210300]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:55 np0005541913.localdomain sudo[210414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hidgfcdhoripyqeniuuafibjqxqutohg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667915.5677106-345-270926360488925/AnsiballZ_systemd_service.py
Dec 02 09:31:55 np0005541913.localdomain sudo[210414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:56 np0005541913.localdomain python3.9[210416]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:31:56 np0005541913.localdomain systemd-sysv-generator[210448]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec 02 09:31:56 np0005541913.localdomain systemd-rc-local-generator[210442]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: Starting Open-iSCSI...
Dec 02 09:31:56 np0005541913.localdomain iscsid[210457]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Dec 02 09:31:56 np0005541913.localdomain iscsid[210457]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Dec 02 09:31:56 np0005541913.localdomain iscsid[210457]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Dec 02 09:31:56 np0005541913.localdomain iscsid[210457]: If using hardware iscsi like qla4xxx this message can be ignored.
Dec 02 09:31:56 np0005541913.localdomain iscsid[210457]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Dec 02 09:31:56 np0005541913.localdomain iscsid[210457]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Dec 02 09:31:56 np0005541913.localdomain iscsid[210457]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
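The iscsid warning above asks for an initiator name in the `iqn.yyyy-mm.<reversed domain name>[:identifier]` format. A simplified validator for that shape (illustrative only; not a full RFC 3720 parser) can be sketched as:

```python
import re

# Loose check for the iqn format described in the iscsid warning:
# "iqn.", a yyyy-mm date, a reversed domain name, optional :identifier.
IQN_RE = re.compile(r"^iqn\.\d{4}-\d{2}\.[a-z0-9.-]+(:[^\s]+)?$")

def is_valid_iqn(name: str) -> bool:
    return IQN_RE.match(name) is not None

print(is_valid_iqn("iqn.2001-04.com.redhat:fc6"))  # the log's own example → True
print(is_valid_iqn("not-an-iqn"))                  # → False
```

In this deployment the file was intentionally absent (the adoption playbook moved `/etc/iscsi` aside earlier in the log), so the warning is expected at this stage.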
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: Started Open-iSCSI.
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: Starting Logout of all iSCSI sessions on shutdown...
Dec 02 09:31:56 np0005541913.localdomain systemd[1]: Finished Logout of all iSCSI sessions on shutdown.
Dec 02 09:31:56 np0005541913.localdomain sudo[210414]: pam_unix(sudo:session): session closed for user root
Dec 02 09:31:58 np0005541913.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 02 09:31:58 np0005541913.localdomain sudo[210567]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-joottjjegipnlssaxnzwaqvdekhzfgwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667918.0768626-378-207556767115918/AnsiballZ_service_facts.py
Dec 02 09:31:58 np0005541913.localdomain sudo[210567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:31:58 np0005541913.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 02 09:31:58 np0005541913.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Dec 02 09:31:58 np0005541913.localdomain python3.9[210569]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:31:58 np0005541913.localdomain network[210599]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:31:58 np0005541913.localdomain network[210600]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:31:58 np0005541913.localdomain network[210601]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:31:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62807 DF PROTO=TCP SPT=37640 DPT=9101 SEQ=4112707506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479721650000000001030307) 
Dec 02 09:31:59 np0005541913.localdomain setroubleshoot[210489]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 53657d6c-c100-406f-a5c8-7ed1309fb42f
Dec 02 09:31:59 np0005541913.localdomain setroubleshoot[210489]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default,
                                                                 then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Dec 02 09:32:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:32:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:32:03.013 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:32:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:32:03.017 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:32:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:32:03.020 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:32:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:32:03 np0005541913.localdomain podman[210694]: 2025-12-02 09:32:03.287963058 +0000 UTC m=+0.101063129 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec 02 09:32:03 np0005541913.localdomain podman[210694]: 2025-12-02 09:32:03.360129712 +0000 UTC m=+0.173229773 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:32:03 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:32:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41133 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=76396377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797335E0000000001030307) 
Dec 02 09:32:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48145 DF PROTO=TCP SPT=44198 DPT=9105 SEQ=2160779315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479733DF0000000001030307) 
Dec 02 09:32:05 np0005541913.localdomain sudo[210567]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:05 np0005541913.localdomain sudo[210856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejbkjgthomylurllxijtwzqnbaxwmmrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667925.3739371-408-70464712414227/AnsiballZ_file.py
Dec 02 09:32:05 np0005541913.localdomain sudo[210856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:06 np0005541913.localdomain python3.9[210858]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 09:32:06 np0005541913.localdomain sudo[210856]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:06 np0005541913.localdomain sudo[210966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odanrjujdawbtifpuzqezkcbybdrbkbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667926.330881-432-124612232662855/AnsiballZ_modprobe.py
Dec 02 09:32:06 np0005541913.localdomain sudo[210966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:06 np0005541913.localdomain python3.9[210968]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 02 09:32:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41135 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=76396377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47973F650000000001030307) 
Dec 02 09:32:07 np0005541913.localdomain sudo[210966]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:07 np0005541913.localdomain sudo[211080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vledzoqtmviaxdqfrnjibkennnnjiuqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667927.1810136-456-6334875595177/AnsiballZ_stat.py
Dec 02 09:32:07 np0005541913.localdomain sudo[211080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:07 np0005541913.localdomain python3.9[211082]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:07 np0005541913.localdomain sudo[211080]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:08 np0005541913.localdomain sudo[211168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azrgytcbmdbvhbverfyenwiyqdsvfery ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667927.1810136-456-6334875595177/AnsiballZ_copy.py
Dec 02 09:32:08 np0005541913.localdomain sudo[211168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:08 np0005541913.localdomain python3.9[211170]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667927.1810136-456-6334875595177/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:08 np0005541913.localdomain sudo[211168]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:08 np0005541913.localdomain sudo[211278]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vynvjugakyrrdpcxdysmktdzobmncmbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667928.5823853-504-200637864453351/AnsiballZ_lineinfile.py
Dec 02 09:32:08 np0005541913.localdomain sudo[211278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:09 np0005541913.localdomain python3.9[211280]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:09 np0005541913.localdomain sudo[211278]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:32:09 np0005541913.localdomain podman[211298]: 2025-12-02 09:32:09.458364318 +0000 UTC m=+0.094624186 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 02 09:32:09 np0005541913.localdomain podman[211298]: 2025-12-02 09:32:09.494445285 +0000 UTC m=+0.130705133 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 09:32:09 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:32:09 np0005541913.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully.
Dec 02 09:32:09 np0005541913.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 02 09:32:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25210 DF PROTO=TCP SPT=42364 DPT=9100 SEQ=2668511031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47974B650000000001030307) 
Dec 02 09:32:10 np0005541913.localdomain sudo[211406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nigiviavfvrcqytapdoorqxzmwuksqeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667929.5347998-528-28495305956770/AnsiballZ_systemd.py
Dec 02 09:32:10 np0005541913.localdomain sudo[211406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:10 np0005541913.localdomain python3.9[211408]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:32:10 np0005541913.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 09:32:10 np0005541913.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 02 09:32:10 np0005541913.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 02 09:32:10 np0005541913.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 02 09:32:10 np0005541913.localdomain systemd-modules-load[211412]: Module 'msr' is built in
Dec 02 09:32:10 np0005541913.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 02 09:32:10 np0005541913.localdomain sudo[211406]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:10 np0005541913.localdomain sudo[211520]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqndgleyvgufdxirjaziftvkvoudkwxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667930.7229137-552-33095042730014/AnsiballZ_file.py
Dec 02 09:32:10 np0005541913.localdomain sudo[211520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:11 np0005541913.localdomain python3.9[211522]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:32:11 np0005541913.localdomain sudo[211520]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:11 np0005541913.localdomain sudo[211630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztwpmhxofstcwjvrixsyrahmlrzlqovs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667931.5315933-579-82536895094906/AnsiballZ_stat.py
Dec 02 09:32:11 np0005541913.localdomain sudo[211630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:12 np0005541913.localdomain python3.9[211632]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:32:12 np0005541913.localdomain sudo[211630]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:12 np0005541913.localdomain sudo[211740]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxsceljinvlepgfoajbkpvmnyaddyanw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667932.2856026-606-131808827414290/AnsiballZ_stat.py
Dec 02 09:32:12 np0005541913.localdomain sudo[211740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:12 np0005541913.localdomain python3.9[211742]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:32:12 np0005541913.localdomain sudo[211740]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40252 DF PROTO=TCP SPT=40544 DPT=9882 SEQ=335701621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479757640000000001030307) 
Dec 02 09:32:13 np0005541913.localdomain sudo[211850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlbvqfglnzmbtrwhiqmxuwmodvwwmvya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667932.9422061-630-53145120483686/AnsiballZ_stat.py
Dec 02 09:32:13 np0005541913.localdomain sudo[211850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:13 np0005541913.localdomain python3.9[211852]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:13 np0005541913.localdomain sudo[211850]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:13 np0005541913.localdomain sudo[211938]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cexwlfflbsidvmgqzurhfdirpdgftrww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667932.9422061-630-53145120483686/AnsiballZ_copy.py
Dec 02 09:32:13 np0005541913.localdomain sudo[211938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:13 np0005541913.localdomain sudo[211941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:32:13 np0005541913.localdomain sudo[211941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:32:13 np0005541913.localdomain sudo[211941]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:13 np0005541913.localdomain sudo[211959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:32:13 np0005541913.localdomain sudo[211959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:32:13 np0005541913.localdomain python3.9[211940]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667932.9422061-630-53145120483686/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:13 np0005541913.localdomain sudo[211938]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:14 np0005541913.localdomain sudo[212118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esjexkqaqvdfsixdewxfiglosnfjhoar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667934.161077-675-79893288147363/AnsiballZ_command.py
Dec 02 09:32:14 np0005541913.localdomain sudo[212118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:14 np0005541913.localdomain python3.9[212126]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:32:14 np0005541913.localdomain sudo[212118]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:14 np0005541913.localdomain podman[212159]: 2025-12-02 09:32:14.701112029 +0000 UTC m=+0.090323201 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container)
Dec 02 09:32:14 np0005541913.localdomain podman[212159]: 2025-12-02 09:32:14.798027166 +0000 UTC m=+0.187238438 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:32:15 np0005541913.localdomain sudo[211959]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:15 np0005541913.localdomain sudo[212341]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epnalzryxoisewgcimhuukhavqlxdylu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667934.851802-699-252345798288948/AnsiballZ_lineinfile.py
Dec 02 09:32:15 np0005541913.localdomain sudo[212341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:15 np0005541913.localdomain sudo[212323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:32:15 np0005541913.localdomain sudo[212323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:32:15 np0005541913.localdomain sudo[212323]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:15 np0005541913.localdomain sudo[212353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:32:15 np0005541913.localdomain sudo[212353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:32:15 np0005541913.localdomain python3.9[212351]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:15 np0005541913.localdomain sudo[212341]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:15 np0005541913.localdomain sudo[212353]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:16 np0005541913.localdomain sudo[212510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sayvnylsczmmdejarerwhlunhixyfefx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667935.5652282-723-265318304179094/AnsiballZ_replace.py
Dec 02 09:32:16 np0005541913.localdomain sudo[212510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25212 DF PROTO=TCP SPT=42364 DPT=9100 SEQ=2668511031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479763240000000001030307) 
Dec 02 09:32:16 np0005541913.localdomain python3.9[212512]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:16 np0005541913.localdomain sudo[212510]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:16 np0005541913.localdomain sudo[212554]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:32:16 np0005541913.localdomain sudo[212554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:32:16 np0005541913.localdomain sudo[212554]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:16 np0005541913.localdomain sudo[212638]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fumwgvcutzkazvclsxirfjxsjbcdfyqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667936.4598508-747-12629549852768/AnsiballZ_replace.py
Dec 02 09:32:16 np0005541913.localdomain sudo[212638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:16 np0005541913.localdomain python3.9[212640]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:16 np0005541913.localdomain sudo[212638]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:18 np0005541913.localdomain sudo[212749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxqbcntkrulfpxsnlafmtbngemhbbuay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667937.21502-774-267207608791770/AnsiballZ_lineinfile.py
Dec 02 09:32:18 np0005541913.localdomain sudo[212749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:18 np0005541913.localdomain python3.9[212751]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:18 np0005541913.localdomain sudo[212749]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:18 np0005541913.localdomain sudo[212859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgsjpoowidtbanoqcalidurguchdirdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667938.4347792-774-114135594931369/AnsiballZ_lineinfile.py
Dec 02 09:32:18 np0005541913.localdomain sudo[212859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:18 np0005541913.localdomain python3.9[212861]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:18 np0005541913.localdomain sudo[212859]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:19 np0005541913.localdomain sudo[212969]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cumxfrdgyoydlncamizppifgkqtyajwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667939.0691378-774-262327781759573/AnsiballZ_lineinfile.py
Dec 02 09:32:19 np0005541913.localdomain sudo[212969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48149 DF PROTO=TCP SPT=44198 DPT=9105 SEQ=2160779315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47976FE40000000001030307) 
Dec 02 09:32:19 np0005541913.localdomain python3.9[212971]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:19 np0005541913.localdomain sudo[212969]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:20 np0005541913.localdomain sudo[213079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkgqdkiuxhxmtihgixtxhrgwutjkalib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667939.7951753-774-211838780318615/AnsiballZ_lineinfile.py
Dec 02 09:32:20 np0005541913.localdomain sudo[213079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:20 np0005541913.localdomain python3.9[213081]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:20 np0005541913.localdomain sudo[213079]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:20 np0005541913.localdomain sudo[213189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cshhwbqvjvkbasawnajmzoytdotbchre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667940.5109367-861-182158198761272/AnsiballZ_stat.py
Dec 02 09:32:20 np0005541913.localdomain sudo[213189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:20 np0005541913.localdomain python3.9[213191]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:32:20 np0005541913.localdomain sudo[213189]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:21 np0005541913.localdomain sudo[213301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arghozbhbyeotdgyzughhjoetlisbrdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667941.2242477-885-195623921609246/AnsiballZ_file.py
Dec 02 09:32:21 np0005541913.localdomain sudo[213301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:21 np0005541913.localdomain python3.9[213303]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:21 np0005541913.localdomain sudo[213301]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57613 DF PROTO=TCP SPT=36730 DPT=9101 SEQ=3065102644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47977AD10000000001030307) 
Dec 02 09:32:22 np0005541913.localdomain sudo[213411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxrnuaoanvwyihdfeminiticveppebrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667942.2126517-912-243835668357288/AnsiballZ_file.py
Dec 02 09:32:22 np0005541913.localdomain sudo[213411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:22 np0005541913.localdomain python3.9[213413]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:32:22 np0005541913.localdomain sudo[213411]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:23 np0005541913.localdomain sudo[213521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yayfdzlxnnioqsopuwadyvamyahhxlzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667942.9343321-936-41911188618865/AnsiballZ_stat.py
Dec 02 09:32:23 np0005541913.localdomain sudo[213521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:23 np0005541913.localdomain python3.9[213523]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:23 np0005541913.localdomain sudo[213521]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:23 np0005541913.localdomain sudo[213578]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-putndzdxfgxivvzvjslzspoyulbmicsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667942.9343321-936-41911188618865/AnsiballZ_file.py
Dec 02 09:32:23 np0005541913.localdomain sudo[213578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:23 np0005541913.localdomain python3.9[213580]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:32:23 np0005541913.localdomain sudo[213578]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:24 np0005541913.localdomain sudo[213688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivpqonqirbpvywyveglhojohamamntyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667943.9968247-936-109863563994591/AnsiballZ_stat.py
Dec 02 09:32:24 np0005541913.localdomain sudo[213688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:24 np0005541913.localdomain python3.9[213690]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:24 np0005541913.localdomain sudo[213688]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:24 np0005541913.localdomain sudo[213745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stenezzygcoolcbgmfykcafgbqsotdzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667943.9968247-936-109863563994591/AnsiballZ_file.py
Dec 02 09:32:24 np0005541913.localdomain sudo[213745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:24 np0005541913.localdomain python3.9[213747]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:32:24 np0005541913.localdomain sudo[213745]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57615 DF PROTO=TCP SPT=36730 DPT=9101 SEQ=3065102644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479786E40000000001030307) 
Dec 02 09:32:25 np0005541913.localdomain sudo[213855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ophgcuicmltbqqcvfgguffglpglbjdwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667945.1070786-1005-19757023048543/AnsiballZ_file.py
Dec 02 09:32:25 np0005541913.localdomain sudo[213855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:25 np0005541913.localdomain python3.9[213857]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:25 np0005541913.localdomain sudo[213855]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:26 np0005541913.localdomain sudo[213965]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acvatlxykyirhsuwjbzlxokrfdbdfeix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667945.7760344-1029-32801387298428/AnsiballZ_stat.py
Dec 02 09:32:26 np0005541913.localdomain sudo[213965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:26 np0005541913.localdomain python3.9[213967]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:26 np0005541913.localdomain sudo[213965]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:26 np0005541913.localdomain sudo[214022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brdgbwlczohhtogoohdpzsqohuqprhvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667945.7760344-1029-32801387298428/AnsiballZ_file.py
Dec 02 09:32:26 np0005541913.localdomain sudo[214022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:26 np0005541913.localdomain python3.9[214024]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:26 np0005541913.localdomain sudo[214022]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:27 np0005541913.localdomain sudo[214132]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfbsnimddrdpxtopybnszxlygrhadxfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667946.9408205-1065-34105422147757/AnsiballZ_stat.py
Dec 02 09:32:27 np0005541913.localdomain sudo[214132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:27 np0005541913.localdomain python3.9[214134]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:27 np0005541913.localdomain sudo[214132]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:28 np0005541913.localdomain sudo[214189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hndvmufqbuurjeweisxaecbwgobvrrms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667946.9408205-1065-34105422147757/AnsiballZ_file.py
Dec 02 09:32:28 np0005541913.localdomain sudo[214189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:28 np0005541913.localdomain python3.9[214191]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:28 np0005541913.localdomain sudo[214189]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:28 np0005541913.localdomain sudo[214299]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iheesnjpbmxslxmkhonpplxgokoxeery ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667948.654058-1101-136609418649144/AnsiballZ_systemd.py
Dec 02 09:32:28 np0005541913.localdomain sudo[214299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:29 np0005541913.localdomain python3.9[214301]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:32:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57616 DF PROTO=TCP SPT=36730 DPT=9101 SEQ=3065102644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479796A40000000001030307) 
Dec 02 09:32:29 np0005541913.localdomain systemd-rc-local-generator[214330]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:32:29 np0005541913.localdomain systemd-sysv-generator[214333]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:29 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:29 np0005541913.localdomain sudo[214299]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:31 np0005541913.localdomain sudo[214448]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voxznymzittlfviuxnqouyaccuuzrjry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667950.8876991-1125-208380629739252/AnsiballZ_stat.py
Dec 02 09:32:31 np0005541913.localdomain sudo[214448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:31 np0005541913.localdomain python3.9[214450]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:31 np0005541913.localdomain sudo[214448]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:31 np0005541913.localdomain sudo[214505]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otovrewptgthueybntvhbxbkgzdbwwvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667950.8876991-1125-208380629739252/AnsiballZ_file.py
Dec 02 09:32:31 np0005541913.localdomain sudo[214505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:31 np0005541913.localdomain python3.9[214507]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:31 np0005541913.localdomain sudo[214505]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:32 np0005541913.localdomain sudo[214615]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxoedqswspsqeamvkvkvywvekgongatu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667952.0879989-1161-197164522526919/AnsiballZ_stat.py
Dec 02 09:32:32 np0005541913.localdomain sudo[214615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:32 np0005541913.localdomain python3.9[214617]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:32 np0005541913.localdomain sudo[214615]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:32 np0005541913.localdomain sudo[214672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ookzgjvyrofnqpnqhqbkrlajnxdyzpbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667952.0879989-1161-197164522526919/AnsiballZ_file.py
Dec 02 09:32:32 np0005541913.localdomain sudo[214672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:32 np0005541913.localdomain python3.9[214674]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:32 np0005541913.localdomain sudo[214672]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:33 np0005541913.localdomain sudo[214782]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-midafyziaydaxsqxbifuxdauupwzpysw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667953.1568491-1197-60548130401066/AnsiballZ_systemd.py
Dec 02 09:32:33 np0005541913.localdomain sudo[214782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:32:33 np0005541913.localdomain podman[214785]: 2025-12-02 09:32:33.588033723 +0000 UTC m=+0.091858602 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 09:32:33 np0005541913.localdomain podman[214785]: 2025-12-02 09:32:33.64316372 +0000 UTC m=+0.146988609 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:32:33 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:32:33 np0005541913.localdomain python3.9[214784]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:32:33 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:32:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31116 DF PROTO=TCP SPT=40090 DPT=9102 SEQ=661332112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797A88D0000000001030307) 
Dec 02 09:32:33 np0005541913.localdomain systemd-rc-local-generator[214836]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:32:33 np0005541913.localdomain systemd-sysv-generator[214839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:32:33 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:33 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:33 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:33 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64292 DF PROTO=TCP SPT=57484 DPT=9105 SEQ=3569643623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797A90F0000000001030307) 
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 09:32:34 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 09:32:34 np0005541913.localdomain sudo[214782]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:34 np0005541913.localdomain sudo[214958]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwbojtjujuedeujmjuncfnmjgusjfkdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667954.5707529-1227-48501980600498/AnsiballZ_file.py
Dec 02 09:32:34 np0005541913.localdomain sudo[214958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:35 np0005541913.localdomain python3.9[214960]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:32:35 np0005541913.localdomain sudo[214958]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:35 np0005541913.localdomain sudo[215068]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gplczzwhkpqguxbrrlmgqlakscopbapk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667955.2667012-1251-2359701460469/AnsiballZ_stat.py
Dec 02 09:32:35 np0005541913.localdomain sudo[215068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:35 np0005541913.localdomain python3.9[215070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:35 np0005541913.localdomain sudo[215068]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:36 np0005541913.localdomain sudo[215156]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myszjmilnuryvejmnpftadnkascezcig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667955.2667012-1251-2359701460469/AnsiballZ_copy.py
Dec 02 09:32:36 np0005541913.localdomain sudo[215156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:36 np0005541913.localdomain python3.9[215158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764667955.2667012-1251-2359701460469/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:32:36 np0005541913.localdomain sudo[215156]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31118 DF PROTO=TCP SPT=40090 DPT=9102 SEQ=661332112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797B4A40000000001030307) 
Dec 02 09:32:37 np0005541913.localdomain sudo[215266]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rahtagnclefmcovnameurwqoaxscnjyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667956.850048-1302-269693849558985/AnsiballZ_file.py
Dec 02 09:32:37 np0005541913.localdomain sudo[215266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:37 np0005541913.localdomain python3.9[215268]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:32:37 np0005541913.localdomain sudo[215266]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:37 np0005541913.localdomain sudo[215376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ownqixwaeyayqxmyxecnaubjdlujnhhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667957.5600255-1326-218548267908060/AnsiballZ_stat.py
Dec 02 09:32:37 np0005541913.localdomain sudo[215376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:37 np0005541913.localdomain python3.9[215378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:32:37 np0005541913.localdomain sudo[215376]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:38 np0005541913.localdomain sudo[215464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdhshqlptjefflczgmmoyikaznvynewm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667957.5600255-1326-218548267908060/AnsiballZ_copy.py
Dec 02 09:32:38 np0005541913.localdomain sudo[215464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:38 np0005541913.localdomain python3.9[215466]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667957.5600255-1326-218548267908060/.source.json _original_basename=.uo38n_h8 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:38 np0005541913.localdomain sudo[215464]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:38 np0005541913.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 02 09:32:39 np0005541913.localdomain sudo[215575]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hriurczkmjvkyvwnstpzwpmqpnllkldy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667958.7787824-1371-78638544705597/AnsiballZ_file.py
Dec 02 09:32:39 np0005541913.localdomain sudo[215575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:39 np0005541913.localdomain python3.9[215577]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:39 np0005541913.localdomain sudo[215575]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:39 np0005541913.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 02 09:32:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:32:39 np0005541913.localdomain podman[215650]: 2025-12-02 09:32:39.799965501 +0000 UTC m=+0.093629702 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 02 09:32:39 np0005541913.localdomain podman[215650]: 2025-12-02 09:32:39.811138253 +0000 UTC m=+0.104802454 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:32:39 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:32:39 np0005541913.localdomain sudo[215704]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emegsqaputosupxywdymhtzpouteknnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667959.5830445-1395-34259109026665/AnsiballZ_stat.py
Dec 02 09:32:39 np0005541913.localdomain sudo[215704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:40 np0005541913.localdomain sudo[215704]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52890 DF PROTO=TCP SPT=56238 DPT=9100 SEQ=3280184966 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797C0A40000000001030307) 
Dec 02 09:32:40 np0005541913.localdomain sudo[215792]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeppdnmzlahduyakdlovpjofrcxvrrcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667959.5830445-1395-34259109026665/AnsiballZ_copy.py
Dec 02 09:32:40 np0005541913.localdomain sudo[215792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:40 np0005541913.localdomain sudo[215792]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:41 np0005541913.localdomain sudo[215902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olnmbfsptmnbznbdcnaamckoylagguvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667961.0848858-1446-229291574688364/AnsiballZ_container_config_data.py
Dec 02 09:32:41 np0005541913.localdomain sudo[215902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:41 np0005541913.localdomain python3.9[215904]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 02 09:32:41 np0005541913.localdomain sudo[215902]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:42 np0005541913.localdomain sudo[216012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnroejpztcuwspmqxvacnezlzuucbtqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667961.9778502-1473-218555780384616/AnsiballZ_container_config_hash.py
Dec 02 09:32:42 np0005541913.localdomain sudo[216012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:42 np0005541913.localdomain python3.9[216014]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:32:42 np0005541913.localdomain sudo[216012]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:42 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56856 DF PROTO=TCP SPT=59142 DPT=9100 SEQ=2753570109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797CBE40000000001030307) 
Dec 02 09:32:43 np0005541913.localdomain sudo[216122]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-endpsmvsrgcuiakshcpmgtyxuxyrlxga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667963.0830672-1500-267376150884690/AnsiballZ_podman_container_info.py
Dec 02 09:32:43 np0005541913.localdomain sudo[216122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:43 np0005541913.localdomain python3.9[216124]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 09:32:44 np0005541913.localdomain sudo[216122]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52892 DF PROTO=TCP SPT=56238 DPT=9100 SEQ=3280184966 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797D8640000000001030307) 
Dec 02 09:32:47 np0005541913.localdomain sudo[216260]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrtqeuzyqzmbhssftjxwnbyczvynzgmi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764667967.2453737-1539-145051909616668/AnsiballZ_edpm_container_manage.py
Dec 02 09:32:47 np0005541913.localdomain sudo[216260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:47 np0005541913.localdomain python3[216262]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:32:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31120 DF PROTO=TCP SPT=40090 DPT=9102 SEQ=661332112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797E3E40000000001030307) 
Dec 02 09:32:50 np0005541913.localdomain podman[216275]: 2025-12-02 09:32:48.046984353 +0000 UTC m=+0.047470394 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 02 09:32:50 np0005541913.localdomain podman[216323]: 
Dec 02 09:32:50 np0005541913.localdomain podman[216323]: 2025-12-02 09:32:50.309809698 +0000 UTC m=+0.094155565 container create f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 02 09:32:50 np0005541913.localdomain podman[216323]: 2025-12-02 09:32:50.265577933 +0000 UTC m=+0.049923860 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 02 09:32:50 np0005541913.localdomain python3[216262]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 02 09:32:50 np0005541913.localdomain sudo[216260]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:51 np0005541913.localdomain sudo[216469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggehmsrgvfvrfppjcftmgthabjtxbtfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667970.698195-1563-271099945708983/AnsiballZ_stat.py
Dec 02 09:32:51 np0005541913.localdomain sudo[216469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:51 np0005541913.localdomain python3.9[216471]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:32:51 np0005541913.localdomain sudo[216469]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37304 DF PROTO=TCP SPT=36884 DPT=9101 SEQ=72661203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797F0010000000001030307) 
Dec 02 09:32:52 np0005541913.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 02 09:32:52 np0005541913.localdomain sudo[216582]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewbsfmezztlysamewjdnacdxaxqfowtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667972.0397217-1590-133335647377478/AnsiballZ_file.py
Dec 02 09:32:52 np0005541913.localdomain sudo[216582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:52 np0005541913.localdomain python3.9[216584]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:52 np0005541913.localdomain sudo[216582]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:53 np0005541913.localdomain sudo[216637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrplkqoraxphjtaaylplpxssnveocenb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667972.0397217-1590-133335647377478/AnsiballZ_stat.py
Dec 02 09:32:53 np0005541913.localdomain sudo[216637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:53 np0005541913.localdomain python3.9[216639]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:32:53 np0005541913.localdomain sudo[216637]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:54 np0005541913.localdomain sudo[216746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqrcismofdgpfthqytviorlilpjhohqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667973.686059-1590-16600817307411/AnsiballZ_copy.py
Dec 02 09:32:54 np0005541913.localdomain sudo[216746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:54 np0005541913.localdomain python3.9[216748]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764667973.686059-1590-16600817307411/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:32:54 np0005541913.localdomain sudo[216746]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:54 np0005541913.localdomain sudo[216801]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjkiqbpvigwvhybowrfghpnyqoqaxiiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667973.686059-1590-16600817307411/AnsiballZ_systemd.py
Dec 02 09:32:54 np0005541913.localdomain sudo[216801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:55 np0005541913.localdomain python3.9[216803]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:32:55 np0005541913.localdomain systemd-rc-local-generator[216825]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:32:55 np0005541913.localdomain systemd-sysv-generator[216832]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:55 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37306 DF PROTO=TCP SPT=36884 DPT=9101 SEQ=72661203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4797FC240000000001030307) 
Dec 02 09:32:55 np0005541913.localdomain sudo[216801]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:55 np0005541913.localdomain sudo[216892]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgrdrnwvnhrvqtkvtvlriipsxfanejht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667973.686059-1590-16600817307411/AnsiballZ_systemd.py
Dec 02 09:32:55 np0005541913.localdomain sudo[216892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:56 np0005541913.localdomain python3.9[216894]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:32:56 np0005541913.localdomain systemd-rc-local-generator[216923]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:32:56 np0005541913.localdomain systemd-sysv-generator[216926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: Starting multipathd container...
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:32:56 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/decc3fae46c177fc07a98939101be81ba2acbbce5cd8ac84de4a05d1c252d1c6/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 09:32:56 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/decc3fae46c177fc07a98939101be81ba2acbbce5cd8ac84de4a05d1c252d1c6/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:32:56 np0005541913.localdomain podman[216935]: 2025-12-02 09:32:56.52280793 +0000 UTC m=+0.135434002 container init f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + sudo -E kolla_set_configs
Dec 02 09:32:56 np0005541913.localdomain sudo[216956]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 09:32:56 np0005541913.localdomain sudo[216956]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 02 09:32:56 np0005541913.localdomain sudo[216956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:32:56 np0005541913.localdomain podman[216935]: 2025-12-02 09:32:56.568236417 +0000 UTC m=+0.180862499 container start f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 09:32:56 np0005541913.localdomain podman[216935]: multipathd
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: Started multipathd container.
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: INFO:__main__:Validating config file
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: INFO:__main__:Writing out command to execute
Dec 02 09:32:56 np0005541913.localdomain sudo[216956]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: ++ cat /run_command
Dec 02 09:32:56 np0005541913.localdomain sudo[216892]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + CMD='/usr/sbin/multipathd -d'
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + ARGS=
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + sudo kolla_copy_cacerts
Dec 02 09:32:56 np0005541913.localdomain sudo[216973]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 02 09:32:56 np0005541913.localdomain sudo[216973]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 02 09:32:56 np0005541913.localdomain sudo[216973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 09:32:56 np0005541913.localdomain sudo[216973]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:56 np0005541913.localdomain podman[216959]: 2025-12-02 09:32:56.637960512 +0000 UTC m=+0.066497488 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + [[ ! -n '' ]]
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + . kolla_extend_start
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: Running command: '/usr/sbin/multipathd -d'
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + umask 0022
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: + exec /usr/sbin/multipathd -d
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: 10138.869656 | --------start up--------
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: 10138.869677 | read /etc/multipath.conf
Dec 02 09:32:56 np0005541913.localdomain podman[216959]: 2025-12-02 09:32:56.651959761 +0000 UTC m=+0.080496687 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:32:56 np0005541913.localdomain multipathd[216950]: 10138.873280 | path checkers start up
Dec 02 09:32:56 np0005541913.localdomain podman[216959]: unhealthy
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:32:56 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Failed with result 'exit-code'.
Dec 02 09:32:57 np0005541913.localdomain python3.9[217096]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:32:58 np0005541913.localdomain sudo[217206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuecgpmdevqbvhdkcvsojrohwfxvoilp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667978.1044872-1698-166757912324010/AnsiballZ_command.py
Dec 02 09:32:58 np0005541913.localdomain sudo[217206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:58 np0005541913.localdomain python3.9[217208]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:32:58 np0005541913.localdomain sudo[217206]: pam_unix(sudo:session): session closed for user root
Dec 02 09:32:59 np0005541913.localdomain sudo[217329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbdbxzyvivohrtktzpnkjscpniofrhqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667978.9643607-1722-60698573717616/AnsiballZ_systemd.py
Dec 02 09:32:59 np0005541913.localdomain sudo[217329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:32:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37307 DF PROTO=TCP SPT=36884 DPT=9101 SEQ=72661203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47980BE50000000001030307) 
Dec 02 09:32:59 np0005541913.localdomain python3.9[217331]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:32:59 np0005541913.localdomain systemd[1]: Stopping multipathd container...
Dec 02 09:32:59 np0005541913.localdomain multipathd[216950]: 10141.879531 | exit (signal)
Dec 02 09:32:59 np0005541913.localdomain multipathd[216950]: 10141.879903 | --------shut down-------
Dec 02 09:32:59 np0005541913.localdomain systemd[1]: libpod-f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.scope: Deactivated successfully.
Dec 02 09:32:59 np0005541913.localdomain podman[217335]: 2025-12-02 09:32:59.691907193 +0000 UTC m=+0.069361586 container died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 02 09:32:59 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.timer: Deactivated successfully.
Dec 02 09:32:59 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:32:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a-userdata-shm.mount: Deactivated successfully.
Dec 02 09:32:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-decc3fae46c177fc07a98939101be81ba2acbbce5cd8ac84de4a05d1c252d1c6-merged.mount: Deactivated successfully.
Dec 02 09:33:00 np0005541913.localdomain podman[217335]: 2025-12-02 09:33:00.442828921 +0000 UTC m=+0.820283254 container cleanup f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 09:33:00 np0005541913.localdomain podman[217335]: multipathd
Dec 02 09:33:00 np0005541913.localdomain podman[217364]: 2025-12-02 09:33:00.544050626 +0000 UTC m=+0.068103532 container cleanup f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 02 09:33:00 np0005541913.localdomain podman[217364]: multipathd
Dec 02 09:33:00 np0005541913.localdomain systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 02 09:33:00 np0005541913.localdomain systemd[1]: Stopped multipathd container.
Dec 02 09:33:00 np0005541913.localdomain systemd[1]: Starting multipathd container...
Dec 02 09:33:00 np0005541913.localdomain systemd[1]: tmp-crun.kLfc3l.mount: Deactivated successfully.
Dec 02 09:33:00 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:33:00 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/decc3fae46c177fc07a98939101be81ba2acbbce5cd8ac84de4a05d1c252d1c6/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 09:33:00 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/decc3fae46c177fc07a98939101be81ba2acbbce5cd8ac84de4a05d1c252d1c6/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 09:33:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:33:00 np0005541913.localdomain podman[217377]: 2025-12-02 09:33:00.881821856 +0000 UTC m=+0.299083915 container init f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS)
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: + sudo -E kolla_set_configs
Dec 02 09:33:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:33:00 np0005541913.localdomain sudo[217396]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 09:33:00 np0005541913.localdomain sudo[217396]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 02 09:33:00 np0005541913.localdomain podman[217377]: 2025-12-02 09:33:00.923312168 +0000 UTC m=+0.340574207 container start f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Dec 02 09:33:00 np0005541913.localdomain podman[217377]: multipathd
Dec 02 09:33:00 np0005541913.localdomain sudo[217396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 09:33:00 np0005541913.localdomain systemd[1]: Started multipathd container.
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: INFO:__main__:Validating config file
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: INFO:__main__:Writing out command to execute
Dec 02 09:33:00 np0005541913.localdomain sudo[217396]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: ++ cat /run_command
Dec 02 09:33:00 np0005541913.localdomain sudo[217329]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: + CMD='/usr/sbin/multipathd -d'
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: + ARGS=
Dec 02 09:33:00 np0005541913.localdomain multipathd[217390]: + sudo kolla_copy_cacerts
Dec 02 09:33:00 np0005541913.localdomain sudo[217411]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 02 09:33:00 np0005541913.localdomain sudo[217411]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 02 09:33:00 np0005541913.localdomain sudo[217411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 09:33:01 np0005541913.localdomain sudo[217411]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: + [[ ! -n '' ]]
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: + . kolla_extend_start
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: Running command: '/usr/sbin/multipathd -d'
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: + umask 0022
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: + exec /usr/sbin/multipathd -d
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: 10143.230588 | --------start up--------
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: 10143.230645 | read /etc/multipath.conf
Dec 02 09:33:01 np0005541913.localdomain multipathd[217390]: 10143.235825 | path checkers start up
Dec 02 09:33:01 np0005541913.localdomain podman[217398]: 2025-12-02 09:33:01.050231788 +0000 UTC m=+0.116036547 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:33:01 np0005541913.localdomain podman[217398]: 2025-12-02 09:33:01.060985779 +0000 UTC m=+0.126790528 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:33:01 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:33:01 np0005541913.localdomain sudo[217536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aldspkvcohjilotvemkrjtmqgqvkoppl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667981.1560893-1746-180516753054824/AnsiballZ_file.py
Dec 02 09:33:01 np0005541913.localdomain sudo[217536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:01 np0005541913.localdomain python3.9[217538]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:01 np0005541913.localdomain sudo[217536]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:02 np0005541913.localdomain sudo[217646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhoxttyspksyksgrmdqzhhgwnufisuoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667982.6788762-1782-216074075049815/AnsiballZ_file.py
Dec 02 09:33:02 np0005541913.localdomain sudo[217646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:33:03.014 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:33:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:33:03.015 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:33:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:33:03.018 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:33:03 np0005541913.localdomain python3.9[217648]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 09:33:03 np0005541913.localdomain sudo[217646]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:03 np0005541913.localdomain sudo[217756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owvnoazpdahxnqlwphvrbvptkmkrtycq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667983.395655-1806-226907495257857/AnsiballZ_modprobe.py
Dec 02 09:33:03 np0005541913.localdomain sudo[217756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:03 np0005541913.localdomain python3.9[217758]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 02 09:33:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:33:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20376 DF PROTO=TCP SPT=54526 DPT=9102 SEQ=964100235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47981DBE0000000001030307) 
Dec 02 09:33:03 np0005541913.localdomain sudo[217756]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:03 np0005541913.localdomain podman[217762]: 2025-12-02 09:33:03.983245949 +0000 UTC m=+0.086823858 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 09:33:04 np0005541913.localdomain podman[217762]: 2025-12-02 09:33:04.028312647 +0000 UTC m=+0.131890606 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:33:04 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:33:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40803 DF PROTO=TCP SPT=33038 DPT=9105 SEQ=1315420248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47981E3F0000000001030307) 
Dec 02 09:33:04 np0005541913.localdomain sudo[217900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlqcjjhauwalytmjnfudtzoxazhometv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667984.2512019-1830-138597268523202/AnsiballZ_stat.py
Dec 02 09:33:04 np0005541913.localdomain sudo[217900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:04 np0005541913.localdomain python3.9[217902]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:33:04 np0005541913.localdomain sudo[217900]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:05 np0005541913.localdomain sudo[217988]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duayrxcezzbgzztjdqriskszzsvdkeql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667984.2512019-1830-138597268523202/AnsiballZ_copy.py
Dec 02 09:33:05 np0005541913.localdomain sudo[217988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:05 np0005541913.localdomain python3.9[217990]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764667984.2512019-1830-138597268523202/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:05 np0005541913.localdomain sudo[217988]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:05 np0005541913.localdomain sudo[218098]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udwswbzgiwsfjdilrfcfxxjpfufrppxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667985.597579-1878-180833792557652/AnsiballZ_lineinfile.py
Dec 02 09:33:05 np0005541913.localdomain sudo[218098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:06 np0005541913.localdomain python3.9[218100]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:06 np0005541913.localdomain sudo[218098]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:06 np0005541913.localdomain sudo[218208]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjfkthsbsehlsxzhndhcsbivibtvyrwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667986.3135483-1902-45634891371305/AnsiballZ_systemd.py
Dec 02 09:33:06 np0005541913.localdomain sudo[218208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:06 np0005541913.localdomain python3.9[218210]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:33:06 np0005541913.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 09:33:06 np0005541913.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 02 09:33:06 np0005541913.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 02 09:33:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20378 DF PROTO=TCP SPT=54526 DPT=9102 SEQ=964100235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479829E40000000001030307) 
Dec 02 09:33:07 np0005541913.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 02 09:33:07 np0005541913.localdomain systemd-modules-load[218214]: Module 'msr' is built in
Dec 02 09:33:07 np0005541913.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 02 09:33:07 np0005541913.localdomain sudo[218208]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:07 np0005541913.localdomain sudo[218322]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gezazcvkclcxxwvmvgdzajnoqqruszfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667987.3521068-1926-60755058164923/AnsiballZ_dnf.py
Dec 02 09:33:07 np0005541913.localdomain sudo[218322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:07 np0005541913.localdomain python3.9[218324]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:33:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13959 DF PROTO=TCP SPT=47048 DPT=9100 SEQ=2305447161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479835A40000000001030307) 
Dec 02 09:33:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:33:10 np0005541913.localdomain systemd[1]: tmp-crun.whhTqv.mount: Deactivated successfully.
Dec 02 09:33:10 np0005541913.localdomain podman[218327]: 2025-12-02 09:33:10.456482005 +0000 UTC m=+0.091202266 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:33:10 np0005541913.localdomain podman[218327]: 2025-12-02 09:33:10.491115311 +0000 UTC m=+0.125835582 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:33:10 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:33:11 np0005541913.localdomain systemd-sysv-generator[218377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:33:11 np0005541913.localdomain systemd-rc-local-generator[218372]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:33:12 np0005541913.localdomain systemd-rc-local-generator[218414]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:33:12 np0005541913.localdomain systemd-sysv-generator[218417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd-logind[757]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 02 09:33:12 np0005541913.localdomain systemd-logind[757]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 02 09:33:12 np0005541913.localdomain lvm[218464]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 02 09:33:12 np0005541913.localdomain lvm[218464]: VG ceph_vg0 finished
Dec 02 09:33:12 np0005541913.localdomain lvm[218463]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 02 09:33:12 np0005541913.localdomain lvm[218463]: VG ceph_vg1 finished
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:33:12 np0005541913.localdomain systemd-rc-local-generator[218516]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:33:12 np0005541913.localdomain systemd-sysv-generator[218519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:33:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:13 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25151 DF PROTO=TCP SPT=39096 DPT=9882 SEQ=455505681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479841E40000000001030307) 
Dec 02 09:33:13 np0005541913.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 09:33:14 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 09:33:14 np0005541913.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 02 09:33:14 np0005541913.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.502s CPU time.
Dec 02 09:33:14 np0005541913.localdomain systemd[1]: run-r58bd866ee1894a2fa6157db2b5d4183e.service: Deactivated successfully.
Dec 02 09:33:14 np0005541913.localdomain sudo[218322]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:15 np0005541913.localdomain python3.9[219759]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:33:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13961 DF PROTO=TCP SPT=47048 DPT=9100 SEQ=2305447161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47984D640000000001030307) 
Dec 02 09:33:16 np0005541913.localdomain sudo[219871]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxjweqpeplgyddoromzziycuvketlxza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667996.2200503-1978-40130756700480/AnsiballZ_file.py
Dec 02 09:33:16 np0005541913.localdomain sudo[219871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:16 np0005541913.localdomain python3.9[219873]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:16 np0005541913.localdomain sudo[219871]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:16 np0005541913.localdomain sudo[219874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:33:16 np0005541913.localdomain sudo[219874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:33:16 np0005541913.localdomain sudo[219874]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:16 np0005541913.localdomain sudo[219900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:33:16 np0005541913.localdomain sudo[219900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:33:17 np0005541913.localdomain sudo[219900]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:17 np0005541913.localdomain sudo[220049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrlpamvywslmlmdkruoteobzgsyrjzgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764667997.2766309-2011-117617365535888/AnsiballZ_systemd_service.py
Dec 02 09:33:17 np0005541913.localdomain sudo[220049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:17 np0005541913.localdomain python3.9[220051]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:33:17 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:33:17 np0005541913.localdomain systemd-rc-local-generator[220076]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:33:17 np0005541913.localdomain systemd-sysv-generator[220079]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:33:17 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:17 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:17 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:33:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:18 np0005541913.localdomain sudo[220049]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:18 np0005541913.localdomain sudo[220088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:33:18 np0005541913.localdomain sudo[220088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:33:18 np0005541913.localdomain sudo[220088]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:18 np0005541913.localdomain python3.9[220213]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:33:18 np0005541913.localdomain network[220230]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:33:18 np0005541913.localdomain network[220231]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:33:18 np0005541913.localdomain network[220232]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:33:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40807 DF PROTO=TCP SPT=33038 DPT=9105 SEQ=1315420248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479859E50000000001030307) 
Dec 02 09:33:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63941 DF PROTO=TCP SPT=43906 DPT=9101 SEQ=1657305403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479865300000000001030307) 
Dec 02 09:33:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:33:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63943 DF PROTO=TCP SPT=43906 DPT=9101 SEQ=1657305403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479871240000000001030307) 
Dec 02 09:33:26 np0005541913.localdomain sudo[220465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enliwczqsqtyqqfiqpknawskoforxice ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668005.7512527-2068-241066104817096/AnsiballZ_systemd_service.py
Dec 02 09:33:26 np0005541913.localdomain sudo[220465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:26 np0005541913.localdomain python3.9[220467]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:33:26 np0005541913.localdomain sudo[220465]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:27 np0005541913.localdomain sudo[220576]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcmmaxjsszjlkzjxvfhxkirapvbijrje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668006.5045214-2068-41717710630919/AnsiballZ_systemd_service.py
Dec 02 09:33:27 np0005541913.localdomain sudo[220576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:27 np0005541913.localdomain python3.9[220578]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:33:27 np0005541913.localdomain sudo[220576]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:27 np0005541913.localdomain sudo[220687]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgmahoxsuiepomzgirtoojtcjolulgwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668007.6739998-2068-257807616538667/AnsiballZ_systemd_service.py
Dec 02 09:33:27 np0005541913.localdomain sudo[220687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:28 np0005541913.localdomain python3.9[220689]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:33:28 np0005541913.localdomain sudo[220687]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:28 np0005541913.localdomain sudo[220798]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjbeicehntpablzczypufljiwsyfgpxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668008.3630838-2068-189360117171623/AnsiballZ_systemd_service.py
Dec 02 09:33:28 np0005541913.localdomain sudo[220798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:28 np0005541913.localdomain python3.9[220800]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:33:29 np0005541913.localdomain sudo[220798]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63944 DF PROTO=TCP SPT=43906 DPT=9101 SEQ=1657305403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479880E40000000001030307) 
Dec 02 09:33:29 np0005541913.localdomain sudo[220909]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouvfgjvimlwkymxofqypkwktkjlxykii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668009.125663-2068-105870913671301/AnsiballZ_systemd_service.py
Dec 02 09:33:29 np0005541913.localdomain sudo[220909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:29 np0005541913.localdomain python3.9[220911]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:33:29 np0005541913.localdomain sudo[220909]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:30 np0005541913.localdomain sudo[221020]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkzmwldwomhfkzrhxwoujnjwyayhmkzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668009.8151114-2068-149854403418929/AnsiballZ_systemd_service.py
Dec 02 09:33:30 np0005541913.localdomain sudo[221020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:30 np0005541913.localdomain python3.9[221022]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:33:30 np0005541913.localdomain sudo[221020]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:30 np0005541913.localdomain sudo[221131]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aojhgwrqenhqsuksqfklhfclocjxagzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668010.4913242-2068-280299293462770/AnsiballZ_systemd_service.py
Dec 02 09:33:30 np0005541913.localdomain sudo[221131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:30 np0005541913.localdomain python3.9[221133]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:33:31 np0005541913.localdomain sudo[221131]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:33:31 np0005541913.localdomain sudo[221250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czsrzikomfvacqledgfyqbcrhywzpmnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668011.161953-2068-255735394656159/AnsiballZ_systemd_service.py
Dec 02 09:33:31 np0005541913.localdomain sudo[221250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:31 np0005541913.localdomain systemd[1]: tmp-crun.zzbcnZ.mount: Deactivated successfully.
Dec 02 09:33:31 np0005541913.localdomain podman[221224]: 2025-12-02 09:33:31.463232972 +0000 UTC m=+0.095120532 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd)
Dec 02 09:33:31 np0005541913.localdomain podman[221224]: 2025-12-02 09:33:31.504214489 +0000 UTC m=+0.136102079 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:33:31 np0005541913.localdomain python3.9[221255]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:33:31 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:33:31 np0005541913.localdomain sudo[221250]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:33 np0005541913.localdomain sudo[221372]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kalrdohvfrcivimozcmethjhrdvomjta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668013.4720435-2245-141253338557009/AnsiballZ_file.py
Dec 02 09:33:33 np0005541913.localdomain sudo[221372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:33 np0005541913.localdomain python3.9[221374]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:33 np0005541913.localdomain sudo[221372]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37327 DF PROTO=TCP SPT=60628 DPT=9102 SEQ=1845266549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479892ED0000000001030307) 
Dec 02 09:33:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13173 DF PROTO=TCP SPT=34802 DPT=9105 SEQ=331962085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4798936E0000000001030307) 
Dec 02 09:33:34 np0005541913.localdomain sudo[221482]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqycvuqytugrdtvdtvoswhntvnyqdvse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668014.0217085-2245-93834259490791/AnsiballZ_file.py
Dec 02 09:33:34 np0005541913.localdomain sudo[221482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:33:34 np0005541913.localdomain systemd[1]: tmp-crun.nLoHAX.mount: Deactivated successfully.
Dec 02 09:33:34 np0005541913.localdomain podman[221485]: 2025-12-02 09:33:34.398193844 +0000 UTC m=+0.092020127 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:33:34 np0005541913.localdomain podman[221485]: 2025-12-02 09:33:34.475088014 +0000 UTC m=+0.168914367 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:33:34 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:33:34 np0005541913.localdomain python3.9[221484]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:34 np0005541913.localdomain sudo[221482]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:35 np0005541913.localdomain sudo[221617]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqkjiqhhlqtcekqaiiufmtnvdvvsiapb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668014.622759-2245-178163205854381/AnsiballZ_file.py
Dec 02 09:33:35 np0005541913.localdomain sudo[221617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:35 np0005541913.localdomain python3.9[221619]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:35 np0005541913.localdomain sudo[221617]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:36 np0005541913.localdomain sudo[221727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpeykmtibikrvgzzrvboqnfixxetdhua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668015.8366508-2245-5500003277992/AnsiballZ_file.py
Dec 02 09:33:36 np0005541913.localdomain sudo[221727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:36 np0005541913.localdomain python3.9[221729]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:36 np0005541913.localdomain sudo[221727]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:36 np0005541913.localdomain sudo[221837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjvderndorxzcngvvamrnlbocptikpgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668016.464178-2245-46392515816694/AnsiballZ_file.py
Dec 02 09:33:36 np0005541913.localdomain sudo[221837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:36 np0005541913.localdomain python3.9[221839]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:36 np0005541913.localdomain sudo[221837]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37329 DF PROTO=TCP SPT=60628 DPT=9102 SEQ=1845266549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47989EE40000000001030307) 
Dec 02 09:33:37 np0005541913.localdomain sudo[221947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmvkruirgkxewbkzcfiuvzkymkqtzcoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668017.4835737-2245-96742090168732/AnsiballZ_file.py
Dec 02 09:33:37 np0005541913.localdomain sudo[221947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:37 np0005541913.localdomain python3.9[221949]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:37 np0005541913.localdomain sudo[221947]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:38 np0005541913.localdomain sudo[222057]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxqwdfktsgizadigmbtfnatsgcfhtrcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668018.056757-2245-90884366844983/AnsiballZ_file.py
Dec 02 09:33:38 np0005541913.localdomain sudo[222057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:38 np0005541913.localdomain python3.9[222059]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:38 np0005541913.localdomain sudo[222057]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:39 np0005541913.localdomain sudo[222167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqgjcnhkjzwjcsxjvweyadwcjsjyuazi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668018.6344697-2245-263531044110475/AnsiballZ_file.py
Dec 02 09:33:39 np0005541913.localdomain sudo[222167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:39 np0005541913.localdomain python3.9[222169]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:39 np0005541913.localdomain sudo[222167]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:39 np0005541913.localdomain sudo[222277]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwcmhkgcjbjtxvdzctvruhvzoapyemgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668019.5290494-2416-175016378084698/AnsiballZ_file.py
Dec 02 09:33:39 np0005541913.localdomain sudo[222277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:39 np0005541913.localdomain python3.9[222279]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:40 np0005541913.localdomain sudo[222277]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37677 DF PROTO=TCP SPT=44624 DPT=9100 SEQ=2140714318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4798AAE40000000001030307) 
Dec 02 09:33:40 np0005541913.localdomain sudo[222387]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diimihnerhrzimtzrpshbuwpnuyxotst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668020.1115959-2416-76107377271295/AnsiballZ_file.py
Dec 02 09:33:40 np0005541913.localdomain sudo[222387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:40 np0005541913.localdomain python3.9[222389]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:40 np0005541913.localdomain sudo[222387]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:40 np0005541913.localdomain sudo[222497]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mybxqrbnpwwtkyscnazukebehbhtsyzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668020.6911318-2416-165340061642215/AnsiballZ_file.py
Dec 02 09:33:40 np0005541913.localdomain sudo[222497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:33:41 np0005541913.localdomain systemd[1]: tmp-crun.oDkP9v.mount: Deactivated successfully.
Dec 02 09:33:41 np0005541913.localdomain podman[222500]: 2025-12-02 09:33:41.097023516 +0000 UTC m=+0.115192440 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 02 09:33:41 np0005541913.localdomain podman[222500]: 2025-12-02 09:33:41.104422364 +0000 UTC m=+0.122591288 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:33:41 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:33:41 np0005541913.localdomain python3.9[222499]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:41 np0005541913.localdomain sudo[222497]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:41 np0005541913.localdomain sudo[222625]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gndbiauzxjatmxsdfjaksjzhrhlhdjtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668021.3360016-2416-178285917462931/AnsiballZ_file.py
Dec 02 09:33:41 np0005541913.localdomain sudo[222625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:41 np0005541913.localdomain python3.9[222627]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:41 np0005541913.localdomain sudo[222625]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:42 np0005541913.localdomain sudo[222735]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzgnxpmhbkmsjqyivifchntatwxafecw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668021.9842453-2416-60918807458686/AnsiballZ_file.py
Dec 02 09:33:42 np0005541913.localdomain sudo[222735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:42 np0005541913.localdomain python3.9[222737]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:42 np0005541913.localdomain sudo[222735]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:42 np0005541913.localdomain sudo[222845]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caiirsqdezfbwnsbhyzydpxhrzdptiie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668022.6225142-2416-47518658261450/AnsiballZ_file.py
Dec 02 09:33:42 np0005541913.localdomain sudo[222845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:43 np0005541913.localdomain python3.9[222847]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:43 np0005541913.localdomain sudo[222845]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19351 DF PROTO=TCP SPT=49686 DPT=9882 SEQ=2656160242 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4798B6E40000000001030307) 
Dec 02 09:33:43 np0005541913.localdomain sudo[222955]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlabhxkbjhiedhbfbhvxyilyxsrmfgdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668023.1993365-2416-2657792929439/AnsiballZ_file.py
Dec 02 09:33:43 np0005541913.localdomain sudo[222955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:43 np0005541913.localdomain python3.9[222957]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:43 np0005541913.localdomain sudo[222955]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:44 np0005541913.localdomain sudo[223065]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujxdfbodubpqbquwskkkebyvfzblmqsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668023.804999-2416-6493482937004/AnsiballZ_file.py
Dec 02 09:33:44 np0005541913.localdomain sudo[223065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:44 np0005541913.localdomain python3.9[223067]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:33:44 np0005541913.localdomain sudo[223065]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:45 np0005541913.localdomain sudo[223175]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtzgztjkmzhqimpwqdxhlyggzmnskjyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668024.559297-2590-236774201376934/AnsiballZ_command.py
Dec 02 09:33:45 np0005541913.localdomain sudo[223175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:45 np0005541913.localdomain python3.9[223177]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:45 np0005541913.localdomain sudo[223175]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37679 DF PROTO=TCP SPT=44624 DPT=9100 SEQ=2140714318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4798C2A40000000001030307) 
Dec 02 09:33:46 np0005541913.localdomain python3.9[223287]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:33:47 np0005541913.localdomain sudo[223395]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wclewkettltaqfgvkpzgqddjvojtmbpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668027.2204885-2644-44177376679651/AnsiballZ_systemd_service.py
Dec 02 09:33:47 np0005541913.localdomain sudo[223395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:47 np0005541913.localdomain python3.9[223397]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:33:47 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:33:47 np0005541913.localdomain systemd-rc-local-generator[223423]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:33:47 np0005541913.localdomain systemd-sysv-generator[223428]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:48 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:33:48 np0005541913.localdomain sudo[223395]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:48 np0005541913.localdomain sudo[223542]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evrjgvqgarfpytqbnvhsmcyjsgoxgvhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668028.4056578-2668-274733579926509/AnsiballZ_command.py
Dec 02 09:33:48 np0005541913.localdomain sudo[223542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:48 np0005541913.localdomain python3.9[223544]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:48 np0005541913.localdomain sudo[223542]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:49 np0005541913.localdomain sudo[223653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kimcmybkykqxifpglyuxpwtkjdnpnrtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668029.0221663-2668-193364123424414/AnsiballZ_command.py
Dec 02 09:33:49 np0005541913.localdomain sudo[223653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:49 np0005541913.localdomain python3.9[223655]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:49 np0005541913.localdomain sudo[223653]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37331 DF PROTO=TCP SPT=60628 DPT=9102 SEQ=1845266549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4798CFE40000000001030307) 
Dec 02 09:33:49 np0005541913.localdomain sudo[223764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftzkgtslbztfchfllgykdthrgmipijnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668029.6408322-2668-92422436782333/AnsiballZ_command.py
Dec 02 09:33:49 np0005541913.localdomain sudo[223764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:50 np0005541913.localdomain python3.9[223766]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:50 np0005541913.localdomain sudo[223764]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:50 np0005541913.localdomain sudo[223875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccycuxebyykdtcpnirqdfkjbwikvricw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668030.2830014-2668-35713464431751/AnsiballZ_command.py
Dec 02 09:33:50 np0005541913.localdomain sudo[223875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:50 np0005541913.localdomain python3.9[223877]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:50 np0005541913.localdomain sudo[223875]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:51 np0005541913.localdomain sudo[223986]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjegasdiloqksltdghtjdtbjdwevukyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668030.8610127-2668-23366255664282/AnsiballZ_command.py
Dec 02 09:33:51 np0005541913.localdomain sudo[223986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:51 np0005541913.localdomain python3.9[223988]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:51 np0005541913.localdomain sudo[223986]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:51 np0005541913.localdomain sudo[224097]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtmpbkguzqnlqfkfabuzebraxetuljwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668031.461267-2668-136624957685599/AnsiballZ_command.py
Dec 02 09:33:51 np0005541913.localdomain sudo[224097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:51 np0005541913.localdomain python3.9[224099]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:51 np0005541913.localdomain sudo[224097]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41785 DF PROTO=TCP SPT=42006 DPT=9101 SEQ=2616569565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4798DA600000000001030307) 
Dec 02 09:33:52 np0005541913.localdomain sudo[224208]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xloylowfqpuushayjqfulknrcpxsbeqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668032.0370922-2668-38273732460615/AnsiballZ_command.py
Dec 02 09:33:52 np0005541913.localdomain sudo[224208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:52 np0005541913.localdomain python3.9[224210]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:53 np0005541913.localdomain sudo[224208]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:53 np0005541913.localdomain sudo[224319]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkyvnfesnsdijvfnuocfvthzxsjtjdod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668033.658057-2668-211973641770431/AnsiballZ_command.py
Dec 02 09:33:53 np0005541913.localdomain sudo[224319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:54 np0005541913.localdomain python3.9[224321]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:33:54 np0005541913.localdomain sudo[224319]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41787 DF PROTO=TCP SPT=42006 DPT=9101 SEQ=2616569565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4798E6640000000001030307) 
Dec 02 09:33:57 np0005541913.localdomain sudo[224430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzrfsbesiszgnxanjxocthdoogqqlhdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668036.9159188-2875-112173429583465/AnsiballZ_file.py
Dec 02 09:33:57 np0005541913.localdomain sudo[224430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:57 np0005541913.localdomain python3.9[224432]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:33:57 np0005541913.localdomain sudo[224430]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:57 np0005541913.localdomain sudo[224540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkttqrvdpcrucreaxwdxkdemwrybwlmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668037.455468-2875-92190648659953/AnsiballZ_file.py
Dec 02 09:33:57 np0005541913.localdomain sudo[224540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:57 np0005541913.localdomain python3.9[224542]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:33:57 np0005541913.localdomain sudo[224540]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:58 np0005541913.localdomain sudo[224650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxanjnxvgnttltfqkwhwfiefdrhzwmss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668038.7196276-2875-19215976478825/AnsiballZ_file.py
Dec 02 09:33:58 np0005541913.localdomain sudo[224650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:59 np0005541913.localdomain python3.9[224652]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:33:59 np0005541913.localdomain sudo[224650]: pam_unix(sudo:session): session closed for user root
Dec 02 09:33:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41788 DF PROTO=TCP SPT=42006 DPT=9101 SEQ=2616569565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4798F6240000000001030307) 
Dec 02 09:33:59 np0005541913.localdomain sudo[224760]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhzefejbbukpligvfbrejdmbvnvbunjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668039.3670855-2941-24591756970844/AnsiballZ_file.py
Dec 02 09:33:59 np0005541913.localdomain sudo[224760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:33:59 np0005541913.localdomain python3.9[224762]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:33:59 np0005541913.localdomain sudo[224760]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:00 np0005541913.localdomain sudo[224870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypztlaeocdzczppqxjgfgprjetbioozf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668039.9802396-2941-150907713183490/AnsiballZ_file.py
Dec 02 09:34:00 np0005541913.localdomain sudo[224870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:00 np0005541913.localdomain python3.9[224872]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:00 np0005541913.localdomain sudo[224870]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:00 np0005541913.localdomain sudo[224980]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioyzlecmslivyjwejkuyuyohfqnqzyrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668040.526425-2941-150123424850499/AnsiballZ_file.py
Dec 02 09:34:00 np0005541913.localdomain sudo[224980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:00 np0005541913.localdomain python3.9[224982]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:01 np0005541913.localdomain sudo[224980]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:01 np0005541913.localdomain sudo[225090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpopljbarvwjutenhbtjqfiiojaovvni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668041.1177907-2941-57002139228808/AnsiballZ_file.py
Dec 02 09:34:01 np0005541913.localdomain sudo[225090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:01 np0005541913.localdomain python3.9[225092]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:01 np0005541913.localdomain sudo[225090]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:02 np0005541913.localdomain sudo[225200]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-auljeatphrvnaspdjnkbtkhvxsrkabdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668041.7386184-2941-133681556822382/AnsiballZ_file.py
Dec 02 09:34:02 np0005541913.localdomain sudo[225200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:34:02 np0005541913.localdomain systemd[1]: tmp-crun.CJ0vaS.mount: Deactivated successfully.
Dec 02 09:34:02 np0005541913.localdomain podman[225203]: 2025-12-02 09:34:02.143303038 +0000 UTC m=+0.103895116 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:34:02 np0005541913.localdomain podman[225203]: 2025-12-02 09:34:02.158902857 +0000 UTC m=+0.119494955 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3)
Dec 02 09:34:02 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:34:02 np0005541913.localdomain python3.9[225202]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:02 np0005541913.localdomain sudo[225200]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:02 np0005541913.localdomain sudo[225329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpoogzvynenkxceefegnhxehgfybsjcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668042.360926-2941-56939455566833/AnsiballZ_file.py
Dec 02 09:34:02 np0005541913.localdomain sudo[225329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:02 np0005541913.localdomain python3.9[225331]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:02 np0005541913.localdomain sudo[225329]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:34:03.015 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:34:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:34:03.016 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:34:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:34:03.017 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:34:03 np0005541913.localdomain sudo[225439]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olqdysaxyrxohpatqqmlagnuqjnhvarw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668042.9132133-2941-252910721189112/AnsiballZ_file.py
Dec 02 09:34:03 np0005541913.localdomain sudo[225439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:03 np0005541913.localdomain python3.9[225441]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:03 np0005541913.localdomain sudo[225439]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54121 DF PROTO=TCP SPT=45392 DPT=9102 SEQ=1991185936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799081D0000000001030307) 
Dec 02 09:34:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=317 DF PROTO=TCP SPT=36452 DPT=9105 SEQ=2136280617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799089E0000000001030307) 
Dec 02 09:34:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:34:05 np0005541913.localdomain podman[225459]: 2025-12-02 09:34:05.445197831 +0000 UTC m=+0.087262248 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Dec 02 09:34:05 np0005541913.localdomain podman[225459]: 2025-12-02 09:34:05.483213684 +0000 UTC m=+0.125278081 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 02 09:34:05 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:34:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54123 DF PROTO=TCP SPT=45392 DPT=9102 SEQ=1991185936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479914240000000001030307) 
Dec 02 09:34:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25154 DF PROTO=TCP SPT=39096 DPT=9882 SEQ=455505681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47991FE50000000001030307) 
Dec 02 09:34:10 np0005541913.localdomain sudo[225574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvulduhewhvajbqohkhxgssadddsjcjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668049.5388951-3266-70622307162852/AnsiballZ_getent.py
Dec 02 09:34:10 np0005541913.localdomain sudo[225574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:10 np0005541913.localdomain python3.9[225576]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 02 09:34:10 np0005541913.localdomain sudo[225574]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:34:11 np0005541913.localdomain podman[225633]: 2025-12-02 09:34:11.544026368 +0000 UTC m=+0.188366008 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 09:34:11 np0005541913.localdomain sudo[225702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaremfqhzwphchgdgpbggyjqallfqipr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668051.1927097-3290-122291386316815/AnsiballZ_group.py
Dec 02 09:34:11 np0005541913.localdomain sudo[225702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:11 np0005541913.localdomain podman[225633]: 2025-12-02 09:34:11.581200709 +0000 UTC m=+0.225540359 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 02 09:34:11 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:34:11 np0005541913.localdomain python3.9[225704]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 09:34:11 np0005541913.localdomain groupadd[225705]: group added to /etc/group: name=nova, GID=42436
Dec 02 09:34:11 np0005541913.localdomain groupadd[225705]: group added to /etc/gshadow: name=nova
Dec 02 09:34:11 np0005541913.localdomain groupadd[225705]: new group: name=nova, GID=42436
Dec 02 09:34:11 np0005541913.localdomain sudo[225702]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:12 np0005541913.localdomain sudo[225818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdoyqjufwefmlkpvujzoqyzrvuzamzjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668052.0279882-3314-230918785260768/AnsiballZ_user.py
Dec 02 09:34:12 np0005541913.localdomain sudo[225818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:12 np0005541913.localdomain python3.9[225820]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005541913.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 09:34:12 np0005541913.localdomain useradd[225822]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Dec 02 09:34:12 np0005541913.localdomain useradd[225822]: add 'nova' to group 'libvirt'
Dec 02 09:34:12 np0005541913.localdomain useradd[225822]: add 'nova' to shadow group 'libvirt'
Dec 02 09:34:12 np0005541913.localdomain sudo[225818]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13964 DF PROTO=TCP SPT=47048 DPT=9100 SEQ=2305447161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47992BE40000000001030307) 
Dec 02 09:34:13 np0005541913.localdomain sshd[225846]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:34:13 np0005541913.localdomain sshd[225846]: Accepted publickey for zuul from 192.168.122.30 port 35522 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:34:13 np0005541913.localdomain systemd-logind[757]: New session 55 of user zuul.
Dec 02 09:34:13 np0005541913.localdomain systemd[1]: Started Session 55 of User zuul.
Dec 02 09:34:13 np0005541913.localdomain sshd[225846]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:34:13 np0005541913.localdomain sshd[225849]: Received disconnect from 192.168.122.30 port 35522:11: disconnected by user
Dec 02 09:34:13 np0005541913.localdomain sshd[225849]: Disconnected from user zuul 192.168.122.30 port 35522
Dec 02 09:34:13 np0005541913.localdomain sshd[225846]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:34:13 np0005541913.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Dec 02 09:34:13 np0005541913.localdomain systemd-logind[757]: Session 55 logged out. Waiting for processes to exit.
Dec 02 09:34:13 np0005541913.localdomain systemd-logind[757]: Removed session 55.
Dec 02 09:34:14 np0005541913.localdomain python3.9[225957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:34:15 np0005541913.localdomain python3.9[226043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668054.0983036-3389-49370459733825/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:15 np0005541913.localdomain python3.9[226151]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:34:16 np0005541913.localdomain python3.9[226206]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23696 DF PROTO=TCP SPT=58080 DPT=9100 SEQ=3870753883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479937E50000000001030307) 
Dec 02 09:34:16 np0005541913.localdomain python3.9[226314]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:34:17 np0005541913.localdomain python3.9[226400]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668056.167528-3389-212269439790116/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:17 np0005541913.localdomain python3.9[226508]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:34:18 np0005541913.localdomain python3.9[226594]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668057.2044785-3389-181542399892225/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=be0176be25a535cff695cce5406adb3d3b53bef4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:18 np0005541913.localdomain sudo[226645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:34:18 np0005541913.localdomain sudo[226645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:34:18 np0005541913.localdomain sudo[226645]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:18 np0005541913.localdomain sudo[226684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:34:18 np0005541913.localdomain sudo[226684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:34:18 np0005541913.localdomain python3.9[226738]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:34:19 np0005541913.localdomain sudo[226684]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:19 np0005541913.localdomain python3.9[226844]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668058.265475-3389-241886877258445/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=321 DF PROTO=TCP SPT=36452 DPT=9105 SEQ=2136280617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479943E40000000001030307) 
Dec 02 09:34:19 np0005541913.localdomain sudo[226959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:34:19 np0005541913.localdomain sudo[226959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:34:19 np0005541913.localdomain sudo[226959]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:19 np0005541913.localdomain python3.9[226972]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:34:20 np0005541913.localdomain python3.9[227068]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668059.3430681-3389-223376930207811/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:20 np0005541913.localdomain sudo[227176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwapebjsodzufxnybafgxwymnjhmhusn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668060.6170952-3638-109865843649739/AnsiballZ_file.py
Dec 02 09:34:20 np0005541913.localdomain sudo[227176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:21 np0005541913.localdomain python3.9[227178]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:34:21 np0005541913.localdomain sudo[227176]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:21 np0005541913.localdomain sudo[227286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyhvijcjudgivpeebsdtrwxvyksbojix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668061.278449-3662-144033443863268/AnsiballZ_copy.py
Dec 02 09:34:21 np0005541913.localdomain sudo[227286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:21 np0005541913.localdomain python3.9[227288]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:34:21 np0005541913.localdomain sudo[227286]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:22 np0005541913.localdomain sudo[227396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pugybidhxqgxjiduhqlguskfaoerwmsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668061.9075844-3686-41639980433459/AnsiballZ_stat.py
Dec 02 09:34:22 np0005541913.localdomain sudo[227396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39386 DF PROTO=TCP SPT=35254 DPT=9101 SEQ=4052270145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47994F900000000001030307) 
Dec 02 09:34:22 np0005541913.localdomain python3.9[227398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:34:22 np0005541913.localdomain sudo[227396]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:22 np0005541913.localdomain sudo[227508]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axuepubdejuwcgvmebgyjjinikhrnfcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668062.6295407-3713-130283203121999/AnsiballZ_file.py
Dec 02 09:34:22 np0005541913.localdomain sudo[227508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:23 np0005541913.localdomain python3.9[227510]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:34:23 np0005541913.localdomain sudo[227508]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:23 np0005541913.localdomain python3.9[227618]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:34:24 np0005541913.localdomain python3.9[227728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:34:25 np0005541913.localdomain python3.9[227814]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668064.003581-3765-162382892498878/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39388 DF PROTO=TCP SPT=35254 DPT=9101 SEQ=4052270145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47995BA40000000001030307) 
Dec 02 09:34:25 np0005541913.localdomain python3.9[227922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:34:26 np0005541913.localdomain python3.9[228008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668065.2029831-3809-229375328704837/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:34:26 np0005541913.localdomain sudo[228116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wysrehocewnjhfvkoenoaqvvsudpvurk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668066.601418-3860-196621726013950/AnsiballZ_container_config_data.py
Dec 02 09:34:26 np0005541913.localdomain sudo[228116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:27 np0005541913.localdomain python3.9[228118]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 02 09:34:27 np0005541913.localdomain sudo[228116]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:27 np0005541913.localdomain sudo[228226]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovuyogfolmimwlmdwdfnvgumeiyfyjpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668067.3913105-3887-84798446148883/AnsiballZ_container_config_hash.py
Dec 02 09:34:27 np0005541913.localdomain sudo[228226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:27 np0005541913.localdomain python3.9[228228]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:34:27 np0005541913.localdomain sudo[228226]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:28 np0005541913.localdomain sudo[228336]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iobyrdkcivsivbdjhutrlcbvcndkohfs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668068.3021162-3917-210375420450846/AnsiballZ_edpm_container_manage.py
Dec 02 09:34:28 np0005541913.localdomain sudo[228336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:28 np0005541913.localdomain python3[228338]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:34:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39389 DF PROTO=TCP SPT=35254 DPT=9101 SEQ=4052270145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47996B650000000001030307) 
Dec 02 09:34:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:34:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7342 DF PROTO=TCP SPT=58180 DPT=9102 SEQ=891547760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47997D4D0000000001030307) 
Dec 02 09:34:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43783 DF PROTO=TCP SPT=37720 DPT=9105 SEQ=501411175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47997DCF0000000001030307) 
Dec 02 09:34:34 np0005541913.localdomain podman[228376]: 2025-12-02 09:34:34.934081685 +0000 UTC m=+2.720192549 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 09:34:34 np0005541913.localdomain podman[228376]: 2025-12-02 09:34:34.974194564 +0000 UTC m=+2.760305408 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 09:34:34 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:34:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:34:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7344 DF PROTO=TCP SPT=58180 DPT=9102 SEQ=891547760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479989640000000001030307) 
Dec 02 09:34:39 np0005541913.localdomain podman[228409]: 2025-12-02 09:34:39.994664433 +0000 UTC m=+3.582532093 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:34:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53344 DF PROTO=TCP SPT=45828 DPT=9100 SEQ=1112148696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479995240000000001030307) 
Dec 02 09:34:40 np0005541913.localdomain podman[228352]: 2025-12-02 09:34:28.95949298 +0000 UTC m=+0.052000100 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 02 09:34:40 np0005541913.localdomain podman[228409]: 2025-12-02 09:34:40.056851826 +0000 UTC m=+3.644719456 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 09:34:40 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:34:40 np0005541913.localdomain podman[228456]: 
Dec 02 09:34:40 np0005541913.localdomain podman[228456]: 2025-12-02 09:34:40.244224596 +0000 UTC m=+0.086879568 container create ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 02 09:34:40 np0005541913.localdomain podman[228456]: 2025-12-02 09:34:40.204845957 +0000 UTC m=+0.047500959 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 02 09:34:40 np0005541913.localdomain python3[228338]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 02 09:34:40 np0005541913.localdomain sudo[228336]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:34:42 np0005541913.localdomain podman[228511]: 2025-12-02 09:34:42.450353258 +0000 UTC m=+0.087846735 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:34:42 np0005541913.localdomain podman[228511]: 2025-12-02 09:34:42.457818428 +0000 UTC m=+0.095311965 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 02 09:34:42 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:34:42 np0005541913.localdomain sudo[228618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkimxnacethvhbgjrebhosxcbeyvcnbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668082.6384246-3941-27172250896281/AnsiballZ_stat.py
Dec 02 09:34:42 np0005541913.localdomain sudo[228618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:43 np0005541913.localdomain python3.9[228620]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:34:43 np0005541913.localdomain sudo[228618]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47240 DF PROTO=TCP SPT=40726 DPT=9882 SEQ=887817473 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799A1640000000001030307) 
Dec 02 09:34:43 np0005541913.localdomain sudo[228730]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbvlatsyuksodcemhrasdimqdhlfebqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668083.7433136-3977-251533990068083/AnsiballZ_container_config_data.py
Dec 02 09:34:43 np0005541913.localdomain sudo[228730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:44 np0005541913.localdomain python3.9[228732]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 02 09:34:44 np0005541913.localdomain sudo[228730]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:44 np0005541913.localdomain sudo[228840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeevebovmcbiytklulwcfpthvcxfkufd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668084.4817011-4004-83712068772375/AnsiballZ_container_config_hash.py
Dec 02 09:34:44 np0005541913.localdomain sudo[228840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:44 np0005541913.localdomain python3.9[228842]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:34:44 np0005541913.localdomain sudo[228840]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:45 np0005541913.localdomain sudo[228950]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgrchyvtrxotejehtoqyprjscxzrizwv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668085.4088023-4034-32343723762706/AnsiballZ_edpm_container_manage.py
Dec 02 09:34:45 np0005541913.localdomain sudo[228950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:45 np0005541913.localdomain python3[228952]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:34:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53346 DF PROTO=TCP SPT=45828 DPT=9100 SEQ=1112148696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799ACE50000000001030307) 
Dec 02 09:34:46 np0005541913.localdomain python3[228952]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 02 09:34:46 np0005541913.localdomain podman[229004]: 2025-12-02 09:34:46.32529686 +0000 UTC m=+0.098270966 container remove 1a8728c4e99f56ba35b80d741f9a9c3570dc11229db60a0ffdbea5a9022358e1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '230f4ebc92ecc6f511b0217abb58f1b6-ff8ff724cb5f0d02131158e2fae849b6'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 02 09:34:46 np0005541913.localdomain python3[228952]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Dec 02 09:34:46 np0005541913.localdomain podman[229018]: 
Dec 02 09:34:46 np0005541913.localdomain podman[229018]: 2025-12-02 09:34:46.43513596 +0000 UTC m=+0.092606282 container create a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:34:46 np0005541913.localdomain podman[229018]: 2025-12-02 09:34:46.389157578 +0000 UTC m=+0.046627970 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 02 09:34:46 np0005541913.localdomain python3[228952]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 02 09:34:46 np0005541913.localdomain sudo[228950]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:47 np0005541913.localdomain sudo[229163]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxgzcfvxtuycyvjnaovjklfbhggoisim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668086.767246-4058-97268839217961/AnsiballZ_stat.py
Dec 02 09:34:47 np0005541913.localdomain sudo[229163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:47 np0005541913.localdomain python3.9[229165]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:34:47 np0005541913.localdomain sudo[229163]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:47 np0005541913.localdomain sudo[229275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfmlchfidaooebwrzgswnisabaujydvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668087.5448472-4085-202821065566346/AnsiballZ_file.py
Dec 02 09:34:47 np0005541913.localdomain sudo[229275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:47 np0005541913.localdomain python3.9[229277]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:34:47 np0005541913.localdomain sudo[229275]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:48 np0005541913.localdomain sudo[229384]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avxzhblwtarfzidvsdqskhjfxmklfpiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668088.0457976-4085-208589644972197/AnsiballZ_copy.py
Dec 02 09:34:48 np0005541913.localdomain sudo[229384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:48 np0005541913.localdomain python3.9[229386]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668088.0457976-4085-208589644972197/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:34:48 np0005541913.localdomain sudo[229384]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:48 np0005541913.localdomain sudo[229439]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aucqekrgllrkmrocvpwytwiqozporkgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668088.0457976-4085-208589644972197/AnsiballZ_systemd.py
Dec 02 09:34:48 np0005541913.localdomain sudo[229439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:49 np0005541913.localdomain python3.9[229441]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:34:49 np0005541913.localdomain systemd-rc-local-generator[229469]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:34:49 np0005541913.localdomain systemd-sysv-generator[229472]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43787 DF PROTO=TCP SPT=37720 DPT=9105 SEQ=501411175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799B9E50000000001030307) 
Dec 02 09:34:49 np0005541913.localdomain sudo[229439]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:49 np0005541913.localdomain sudo[229530]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcnvnjloeiretljokoixudtxsfoxiuhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668088.0457976-4085-208589644972197/AnsiballZ_systemd.py
Dec 02 09:34:49 np0005541913.localdomain sudo[229530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:50 np0005541913.localdomain python3.9[229532]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:34:50 np0005541913.localdomain systemd-rc-local-generator[229555]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:34:50 np0005541913.localdomain systemd-sysv-generator[229560]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: Starting nova_compute container...
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:34:50 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 02 09:34:50 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 09:34:50 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 09:34:50 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 09:34:50 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 09:34:50 np0005541913.localdomain podman[229573]: 2025-12-02 09:34:50.623970407 +0000 UTC m=+0.110036547 container init a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:34:50 np0005541913.localdomain podman[229573]: 2025-12-02 09:34:50.63070845 +0000 UTC m=+0.116774590 container start a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_managed=true)
Dec 02 09:34:50 np0005541913.localdomain podman[229573]: nova_compute
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + sudo -E kolla_set_configs
Dec 02 09:34:50 np0005541913.localdomain systemd[1]: Started nova_compute container.
Dec 02 09:34:50 np0005541913.localdomain sudo[229530]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Validating config file
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying service configuration files
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Deleting /etc/ceph
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Creating directory /etc/ceph
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/ceph
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Writing out command to execute
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: ++ cat /run_command
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + CMD=nova-compute
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + ARGS=
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + sudo kolla_copy_cacerts
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + [[ ! -n '' ]]
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + . kolla_extend_start
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: Running command: 'nova-compute'
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + echo 'Running command: '\''nova-compute'\'''
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + umask 0022
Dec 02 09:34:50 np0005541913.localdomain nova_compute[229589]: + exec nova-compute
Dec 02 09:34:51 np0005541913.localdomain python3.9[229709]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:34:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8099 DF PROTO=TCP SPT=46342 DPT=9101 SEQ=3209027010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799C4C00000000001030307) 
Dec 02 09:34:52 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:52.484 229593 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:34:52 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:52.485 229593 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:34:52 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:52.485 229593 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:34:52 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:52.485 229593 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 02 09:34:52 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:52.619 229593 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:34:52 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:52.641 229593 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:34:52 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:52.641 229593 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.074 229593 INFO nova.virt.driver [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 02 09:34:53 np0005541913.localdomain python3.9[229821]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.195 229593 INFO nova.compute.provider_config [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.207 229593 WARNING nova.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.207 229593 DEBUG oslo_concurrency.lockutils [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.208 229593 DEBUG oslo_concurrency.lockutils [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.208 229593 DEBUG oslo_concurrency.lockutils [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.208 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.208 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.208 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.209 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.209 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.209 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.209 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.209 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.209 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.209 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.210 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.210 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.210 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.210 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.210 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.210 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.210 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.211 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.211 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.211 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] console_host                   = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.211 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.211 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.211 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.211 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.212 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.212 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.212 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.212 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.212 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.212 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.212 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.213 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.213 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.213 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.213 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.213 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.213 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.213 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.214 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.214 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.214 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.214 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.214 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.214 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.215 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.215 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.215 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.215 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.215 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.215 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.216 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.216 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.216 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.216 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.216 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.216 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.216 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.217 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.217 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.217 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.217 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.217 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.217 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.217 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.217 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.218 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.218 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.218 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.218 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.218 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.218 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.218 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.218 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.219 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.219 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.219 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.219 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.219 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.219 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.219 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.220 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.220 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.220 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.220 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.220 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.220 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.220 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.221 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.221 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.221 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.221 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.221 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.221 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.222 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.222 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.222 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.222 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.222 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.222 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.222 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.222 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.223 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.223 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.223 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.223 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.223 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.223 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.224 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.224 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.224 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.224 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.224 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.224 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.224 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.225 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.225 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.225 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.225 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.225 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.225 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.225 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.226 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.226 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.226 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.226 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.226 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.226 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.226 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.226 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.227 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.227 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.227 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.227 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.227 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.227 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.228 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.228 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.228 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.228 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.228 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.228 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.228 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.229 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.229 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.229 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.229 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.229 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.229 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.229 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.230 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.230 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.230 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.230 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.230 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.230 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.230 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.231 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.231 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.231 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.231 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.231 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.231 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.231 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.232 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.232 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.232 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.232 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.232 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.232 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.233 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.233 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.233 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.233 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.233 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.233 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.233 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.234 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.234 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.234 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.234 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.234 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.234 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.234 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.235 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.235 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.235 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.235 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.235 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.235 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.235 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.236 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.236 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.236 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.236 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.236 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.236 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.237 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.237 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.237 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.237 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.237 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.237 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.237 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.238 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.238 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.238 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.238 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.238 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.238 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.238 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.239 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.239 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.239 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.239 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.239 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.239 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.239 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.240 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.240 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.240 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.240 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.240 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.240 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.240 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.241 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.241 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.241 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.241 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.241 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.241 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.241 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.242 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.242 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.242 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.242 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.242 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.242 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.242 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.243 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.243 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.243 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.243 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.243 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.243 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.243 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.244 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.244 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.244 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.244 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.244 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.244 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.245 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.245 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.245 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.245 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.245 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.245 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.245 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.246 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.246 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.246 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.246 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.246 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.246 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.246 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.247 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.247 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.247 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.247 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.247 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.247 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.247 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.248 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.248 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.248 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.248 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.248 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.248 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.248 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.248 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.249 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.249 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.249 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.249 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.249 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.249 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.250 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.250 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.250 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.250 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.250 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.250 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.250 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.250 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.251 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.251 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.251 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.251 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.251 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.251 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.251 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.252 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.252 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.252 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.252 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.252 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.252 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.252 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.253 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.253 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.253 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.253 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.253 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.253 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.253 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.254 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.254 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.254 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.254 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.254 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.254 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.254 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.255 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.255 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.255 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.255 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.255 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.255 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.255 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.255 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.256 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.256 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.256 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.256 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.256 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.256 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.257 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.257 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.257 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.257 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.257 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.257 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.257 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.258 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.258 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.258 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.258 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.258 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.258 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.258 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.258 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.259 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.259 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.259 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.259 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.259 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.259 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.260 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.260 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.260 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.260 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.260 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.260 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.261 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.261 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.261 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.261 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.261 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.261 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.261 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.262 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.262 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.262 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.262 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.262 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.262 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.263 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.263 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.263 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.263 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.263 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.263 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.263 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.264 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.264 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.264 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.264 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.264 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.264 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.264 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.265 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.265 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.265 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.265 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.265 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.265 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.266 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.266 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.266 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.266 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.266 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.266 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.267 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.267 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.267 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.267 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.267 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.267 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.268 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.268 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.268 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.268 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.268 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.268 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.268 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.269 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.269 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.269 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.269 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.269 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.269 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.270 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.270 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.270 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.270 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.270 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.270 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.271 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.271 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.271 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.271 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.271 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.271 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.271 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.272 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.272 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.272 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.272 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.272 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.272 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.273 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.273 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.273 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.273 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.273 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.273 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.273 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.274 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.274 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.274 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.274 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.274 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.274 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.274 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.275 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.275 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.275 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.275 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.275 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.275 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.276 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.276 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.276 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.276 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.276 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.276 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.277 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.277 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.277 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.277 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.277 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.277 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.278 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.278 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.278 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.278 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.278 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.278 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.278 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.279 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.279 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.279 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.279 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.279 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.279 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.280 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.280 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.280 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.280 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.280 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.280 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.281 229593 WARNING oslo_config.cfg [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: and ``live_migration_inbound_addr`` respectively.
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: ).  Its value may be silently ignored in the future.
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.281 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.281 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.281 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.281 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.282 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.282 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.282 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.282 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.282 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.282 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.282 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.283 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.283 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.283 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.283 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.283 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.283 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.284 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.284 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rbd_secret_uuid        = c7c8e171-a193-56fb-95fa-8879fcfa7074 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.284 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.284 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.284 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.284 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.285 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.285 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.285 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.285 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.285 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.285 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.286 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.286 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.286 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.286 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.286 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.286 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.287 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.287 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.287 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.287 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.287 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.287 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.288 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.288 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.288 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.288 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.288 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.288 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.288 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.289 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.289 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.289 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.289 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.289 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.289 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.290 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.290 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.290 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.290 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.290 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.290 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.291 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.291 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.291 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.291 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.291 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.291 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.291 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.292 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.292 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.292 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.292 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.292 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.292 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.293 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.293 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.293 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.293 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.293 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.293 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.293 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.294 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.294 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.294 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.294 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.294 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.294 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.295 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.295 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.295 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.295 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.295 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.296 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.296 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.296 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.296 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.296 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.296 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.296 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.297 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.297 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.297 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.297 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.297 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.297 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.297 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.298 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.298 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.298 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.298 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.298 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.298 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.299 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.299 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.299 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.299 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.299 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.299 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.299 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.300 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.300 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.300 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.300 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.300 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.300 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.301 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.301 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.301 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.301 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.301 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.301 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.301 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.302 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.302 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.302 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.302 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.302 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.302 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.303 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.303 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.303 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.303 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.303 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.303 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.304 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.304 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.304 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.304 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.304 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.304 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.305 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.305 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.305 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.305 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.305 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.305 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.305 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.306 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.306 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.306 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.306 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.306 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.306 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.306 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.307 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.307 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.307 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.307 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.307 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.307 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.307 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.308 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.308 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.308 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.308 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.308 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.308 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.309 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.309 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.309 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.309 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.309 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.309 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.310 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.310 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.310 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.310 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.310 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.310 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.310 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.311 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.311 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.311 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.311 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.311 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.311 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.311 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.312 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.312 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.312 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.312 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.312 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.312 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.313 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.313 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.313 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.313 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.313 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.313 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.313 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.314 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.314 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.314 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.314 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.314 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.314 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.314 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.315 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.315 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.315 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.315 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.315 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.315 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.315 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.315 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.316 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.316 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.316 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.316 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.316 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.316 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.316 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.317 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.317 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.317 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.317 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.317 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.317 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.317 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.317 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.318 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.318 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.318 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.318 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.318 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.318 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.318 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.319 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.319 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.319 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.319 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.319 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.319 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.320 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.320 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.320 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.320 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.320 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.320 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.320 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.321 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.321 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.321 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.321 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.321 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.321 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.321 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.322 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.322 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.322 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.322 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.322 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.322 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.322 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.323 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.323 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.323 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.323 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.323 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.323 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.323 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.324 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.324 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.324 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.324 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.324 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.324 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.324 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.325 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.325 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.325 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.325 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.325 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.325 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.326 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.326 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.326 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.326 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.326 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.326 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.326 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.326 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.327 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.327 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.327 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.327 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.327 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.327 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.327 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.328 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.328 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.328 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.328 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.328 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.328 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.328 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.329 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.329 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.329 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.329 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.329 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.329 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.329 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.330 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.330 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.330 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.330 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.330 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.330 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.330 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.330 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.331 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.331 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.331 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.331 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.331 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.331 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.331 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.332 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.332 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.332 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.332 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.332 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.332 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.333 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.333 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.333 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.333 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.333 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.333 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.333 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.334 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.334 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.334 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.334 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.334 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.334 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.334 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.334 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.335 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.335 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.335 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.335 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.335 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.335 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.335 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.336 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.336 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.336 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.336 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.336 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.336 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.336 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.337 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.337 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.337 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.337 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.337 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.337 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.337 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.337 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.338 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.338 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.338 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.338 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.338 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.338 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.338 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.339 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.339 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.339 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.339 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.339 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.339 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.339 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.339 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.340 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.340 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.340 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.340 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.340 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.340 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.341 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.341 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.341 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.341 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.341 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.341 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.341 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.341 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.342 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.342 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.342 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.342 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.342 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.342 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.342 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.343 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.343 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.343 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.343 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.343 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.343 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.343 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.344 229593 DEBUG oslo_service.service [None req-922028c2-4292-4952-b0e3-150ee468d167 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.345 229593 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.371 229593 INFO nova.virt.node [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Determined node identity c79215b2-6762-4f7f-a322-f44db2b0b9bd from /var/lib/nova/compute_id
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.371 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.372 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.372 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.373 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.384 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f86a46a3a00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.387 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f86a46a3a00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.388 229593 INFO nova.virt.libvirt.driver [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Connection event '1' reason 'None'
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.403 229593 DEBUG nova.virt.libvirt.volume.mount [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.405 229593 INFO nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Libvirt host capabilities <capabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <host>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <uuid>f041467c-26d0-44b9-832e-8db5f9b7a49d</uuid>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <arch>x86_64</arch>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model>EPYC-Rome-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <vendor>AMD</vendor>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <microcode version='16777317'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <signature family='23' model='49' stepping='0'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='x2apic'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='tsc-deadline'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='osxsave'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='hypervisor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='tsc_adjust'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='spec-ctrl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='stibp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='arch-capabilities'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='cmp_legacy'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='topoext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='virt-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='lbrv'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='tsc-scale'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='vmcb-clean'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='pause-filter'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='pfthreshold'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='svme-addr-chk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='rdctl-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='skip-l1dfl-vmentry'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='mds-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature name='pschange-mc-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <pages unit='KiB' size='4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <pages unit='KiB' size='2048'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <pages unit='KiB' size='1048576'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <power_management>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <suspend_mem/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <suspend_disk/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <suspend_hybrid/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </power_management>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <iommu support='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <migration_features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <live/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <uri_transports>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <uri_transport>tcp</uri_transport>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <uri_transport>rdma</uri_transport>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </uri_transports>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </migration_features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <topology>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <cells num='1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <cell id='0'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:           <memory unit='KiB'>16116612</memory>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:           <pages unit='KiB' size='4'>4029153</pages>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:           <pages unit='KiB' size='2048'>0</pages>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:           <distances>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <sibling id='0' value='10'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:           </distances>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:           <cpus num='8'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:           </cpus>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         </cell>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </cells>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </topology>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <cache>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </cache>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <secmodel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model>selinux</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <doi>0</doi>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </secmodel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <secmodel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model>dac</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <doi>0</doi>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </secmodel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </host>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <guest>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <os_type>hvm</os_type>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <arch name='i686'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <wordsize>32</wordsize>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <domain type='qemu'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <domain type='kvm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </arch>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <pae/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <nonpae/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <acpi default='on' toggle='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <apic default='on' toggle='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <cpuselection/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <deviceboot/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <disksnapshot default='on' toggle='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <externalSnapshot/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </guest>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <guest>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <os_type>hvm</os_type>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <arch name='x86_64'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <wordsize>64</wordsize>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <domain type='qemu'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <domain type='kvm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </arch>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <acpi default='on' toggle='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <apic default='on' toggle='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <cpuselection/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <deviceboot/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <disksnapshot default='on' toggle='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <externalSnapshot/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </guest>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: </capabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.416 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.433 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: <domainCapabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <domain>kvm</domain>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <arch>i686</arch>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <vcpu max='1024'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <iothreads supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <os supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <enum name='firmware'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <loader supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>rom</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pflash</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='readonly'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>yes</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>no</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='secure'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>no</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </loader>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </os>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>on</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>off</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='maximum' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='maximumMigratable'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>on</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>off</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='host-model' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <vendor>AMD</vendor>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='x2apic'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='stibp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='succor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='lbrv'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='custom' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Dhyana-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Genoa'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='auto-ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='auto-ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-128'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-256'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-512'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='KnightsMill'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4fmaps'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4vnniw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512er'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512pf'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='KnightsMill-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4fmaps'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4vnniw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512er'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512pf'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tbm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tbm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SierraForest'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ne-convert'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cmpccxadd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SierraForest-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ne-convert'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cmpccxadd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='athlon'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='athlon-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='core2duo'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='core2duo-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='coreduo'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='coreduo-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='n270'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='n270-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='phenom'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='phenom-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <memoryBacking supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <enum name='sourceType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>file</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>anonymous</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>memfd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </memoryBacking>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <devices>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <disk supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='diskDevice'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>disk</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>cdrom</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>floppy</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>lun</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='bus'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>fdc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>scsi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>sata</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-non-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </disk>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <graphics supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vnc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>egl-headless</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dbus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </graphics>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <video supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='modelType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vga</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>cirrus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>none</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>bochs</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ramfb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </video>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <hostdev supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='mode'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>subsystem</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='startupPolicy'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>default</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>mandatory</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>requisite</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>optional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='subsysType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pci</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>scsi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='capsType'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='pciBackend'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </hostdev>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <rng supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-non-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>random</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>egd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>builtin</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </rng>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <filesystem supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='driverType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>path</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>handle</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtiofs</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </filesystem>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <tpm supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tpm-tis</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tpm-crb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>emulator</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>external</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendVersion'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>2.0</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </tpm>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <redirdev supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='bus'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </redirdev>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <channel supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pty</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>unix</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </channel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <crypto supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>qemu</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>builtin</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </crypto>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <interface supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>default</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>passt</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </interface>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <panic supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>isa</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>hyperv</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </panic>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <console supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>null</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pty</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dev</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>file</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pipe</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>stdio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>udp</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tcp</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>unix</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>qemu-vdagent</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dbus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </console>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </devices>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <gic supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <vmcoreinfo supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <genid supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <backingStoreInput supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <backup supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <async-teardown supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <ps2 supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <sev supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <sgx supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <hyperv supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='features'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>relaxed</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vapic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>spinlocks</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vpindex</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>runtime</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>synic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>stimer</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>reset</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vendor_id</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>frequencies</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>reenlightenment</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tlbflush</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ipi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>avic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>emsr_bitmap</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>xmm_input</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <defaults>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <spinlocks>4095</spinlocks>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <stimer_direct>on</stimer_direct>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </defaults>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </hyperv>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <launchSecurity supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='sectype'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tdx</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </launchSecurity>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: </domainCapabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.443 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: <domainCapabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <domain>kvm</domain>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <arch>i686</arch>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <vcpu max='240'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <iothreads supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <os supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <enum name='firmware'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <loader supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>rom</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pflash</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='readonly'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>yes</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>no</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='secure'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>no</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </loader>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </os>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>on</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>off</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='maximum' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='maximumMigratable'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>on</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>off</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='host-model' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <vendor>AMD</vendor>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='x2apic'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='stibp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='succor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='lbrv'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='custom' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Dhyana-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Genoa'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='auto-ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='auto-ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-128'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-256'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-512'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='KnightsMill'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4fmaps'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4vnniw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512er'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512pf'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='KnightsMill-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4fmaps'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4vnniw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512er'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512pf'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tbm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tbm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SierraForest'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ne-convert'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cmpccxadd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SierraForest-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ne-convert'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cmpccxadd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='athlon'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='athlon-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='core2duo'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='core2duo-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='coreduo'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='coreduo-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='n270'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='n270-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='phenom'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='phenom-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <memoryBacking supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <enum name='sourceType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>file</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>anonymous</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>memfd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </memoryBacking>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <devices>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <disk supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='diskDevice'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>disk</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>cdrom</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>floppy</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>lun</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='bus'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ide</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>fdc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>scsi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>sata</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-non-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </disk>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <graphics supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vnc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>egl-headless</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dbus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </graphics>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <video supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='modelType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vga</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>cirrus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>none</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>bochs</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ramfb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </video>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <hostdev supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='mode'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>subsystem</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='startupPolicy'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>default</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>mandatory</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>requisite</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>optional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='subsysType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pci</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>scsi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='capsType'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='pciBackend'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </hostdev>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <rng supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-non-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>random</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>egd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>builtin</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </rng>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <filesystem supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='driverType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>path</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>handle</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtiofs</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </filesystem>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <tpm supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tpm-tis</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tpm-crb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>emulator</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>external</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendVersion'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>2.0</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </tpm>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <redirdev supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='bus'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </redirdev>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <channel supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pty</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>unix</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </channel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <crypto supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>qemu</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>builtin</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </crypto>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <interface supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>default</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>passt</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </interface>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <panic supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>isa</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>hyperv</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </panic>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <console supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>null</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pty</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dev</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>file</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pipe</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>stdio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>udp</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tcp</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>unix</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>qemu-vdagent</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dbus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </console>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </devices>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <gic supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <vmcoreinfo supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <genid supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <backingStoreInput supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <backup supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <async-teardown supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <ps2 supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <sev supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <sgx supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <hyperv supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='features'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>relaxed</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vapic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>spinlocks</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vpindex</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>runtime</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>synic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>stimer</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>reset</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vendor_id</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>frequencies</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>reenlightenment</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tlbflush</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ipi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>avic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>emsr_bitmap</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>xmm_input</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <defaults>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <spinlocks>4095</spinlocks>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <stimer_direct>on</stimer_direct>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </defaults>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </hyperv>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <launchSecurity supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='sectype'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tdx</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </launchSecurity>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: </domainCapabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.470 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.475 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: <domainCapabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <domain>kvm</domain>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <arch>x86_64</arch>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <vcpu max='1024'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <iothreads supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <os supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <enum name='firmware'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>efi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <loader supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>rom</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pflash</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='readonly'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>yes</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>no</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='secure'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>yes</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>no</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </loader>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </os>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>on</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>off</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='maximum' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='maximumMigratable'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>on</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>off</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='host-model' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <vendor>AMD</vendor>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='x2apic'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='stibp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='succor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='lbrv'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='custom' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Dhyana-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Genoa'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='auto-ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='auto-ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-128'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-256'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-512'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='KnightsMill'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4fmaps'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4vnniw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512er'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512pf'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='KnightsMill-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4fmaps'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4vnniw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512er'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512pf'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tbm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tbm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SierraForest'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ne-convert'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cmpccxadd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SierraForest-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ne-convert'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cmpccxadd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='athlon'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='athlon-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='core2duo'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='core2duo-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='coreduo'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='coreduo-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='n270'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='n270-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='phenom'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='phenom-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <memoryBacking supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <enum name='sourceType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>file</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>anonymous</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>memfd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </memoryBacking>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <devices>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <disk supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='diskDevice'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>disk</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>cdrom</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>floppy</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>lun</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='bus'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>fdc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>scsi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>sata</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-non-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </disk>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <graphics supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vnc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>egl-headless</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dbus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </graphics>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <video supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='modelType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vga</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>cirrus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>none</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>bochs</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ramfb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </video>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <hostdev supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='mode'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>subsystem</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='startupPolicy'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>default</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>mandatory</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>requisite</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>optional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='subsysType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pci</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>scsi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='capsType'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='pciBackend'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </hostdev>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <rng supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-non-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>random</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>egd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>builtin</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </rng>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <filesystem supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='driverType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>path</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>handle</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtiofs</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </filesystem>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <tpm supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tpm-tis</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tpm-crb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>emulator</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>external</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendVersion'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>2.0</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </tpm>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <redirdev supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='bus'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </redirdev>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <channel supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pty</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>unix</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </channel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <crypto supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>qemu</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>builtin</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </crypto>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <interface supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>default</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>passt</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </interface>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <panic supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>isa</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>hyperv</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </panic>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <console supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>null</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pty</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dev</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>file</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pipe</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>stdio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>udp</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tcp</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>unix</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>qemu-vdagent</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dbus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </console>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </devices>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <gic supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <vmcoreinfo supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <genid supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <backingStoreInput supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <backup supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <async-teardown supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <ps2 supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <sev supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <sgx supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <hyperv supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='features'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>relaxed</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vapic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>spinlocks</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vpindex</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>runtime</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>synic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>stimer</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>reset</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vendor_id</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>frequencies</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>reenlightenment</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tlbflush</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ipi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>avic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>emsr_bitmap</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>xmm_input</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <defaults>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <spinlocks>4095</spinlocks>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <stimer_direct>on</stimer_direct>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </defaults>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </hyperv>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <launchSecurity supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='sectype'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tdx</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </launchSecurity>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: </domainCapabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.532 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: <domainCapabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <domain>kvm</domain>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <arch>x86_64</arch>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <vcpu max='240'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <iothreads supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <os supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <enum name='firmware'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <loader supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>rom</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pflash</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='readonly'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>yes</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>no</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='secure'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>no</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </loader>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </os>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>on</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>off</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='maximum' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='maximumMigratable'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>on</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>off</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='host-model' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <vendor>AMD</vendor>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='x2apic'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='stibp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='succor'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='lbrv'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <mode name='custom' supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Broadwell-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Cooperlake-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Denverton-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Dhyana-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Genoa'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='auto-ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='auto-ibrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amd-psfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='no-nested-data-bp'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='null-sel-clr-base'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='stibp-always-on'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='EPYC-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-128'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-256'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx10-512'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='prefetchiti'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Haswell-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='IvyBridge-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='KnightsMill'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4fmaps'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4vnniw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512er'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512pf'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='KnightsMill-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4fmaps'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-4vnniw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512er'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512pf'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tbm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fma4'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tbm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xop'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='amx-tile'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-bf16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-fp16'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bitalg'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vbmi2'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrc'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fzrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='la57'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='taa-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='tsx-ldtrk'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xfd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SierraForest'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ne-convert'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cmpccxadd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='SierraForest-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ifma'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-ne-convert'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx-vnni-int8'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='bus-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cmpccxadd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fbsdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='fsrs'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ibrs-all'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mcdt-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pbrsb-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='psdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='serialize'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vaes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='vpclmulqdq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='hle'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='rtm'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512bw'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512cd'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512dq'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512f'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='avx512vl'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='invpcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pcid'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='pku'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='mpx'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v2'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v3'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='core-capability'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='split-lock-detect'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='Snowridge-v4'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='cldemote'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='erms'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='gfni'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdir64b'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='movdiri'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='xsaves'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='athlon'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='athlon-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='core2duo'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='core2duo-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='coreduo'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='coreduo-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='n270'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='n270-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='ss'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='phenom'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <blockers model='phenom-v1'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnow'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <feature name='3dnowext'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </blockers>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </mode>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </cpu>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <memoryBacking supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <enum name='sourceType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>file</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>anonymous</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <value>memfd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </memoryBacking>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <devices>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <disk supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='diskDevice'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>disk</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>cdrom</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>floppy</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>lun</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='bus'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ide</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>fdc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>scsi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>sata</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-non-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </disk>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <graphics supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vnc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>egl-headless</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dbus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </graphics>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <video supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='modelType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vga</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>cirrus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>none</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>bochs</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ramfb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </video>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <hostdev supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='mode'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>subsystem</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='startupPolicy'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>default</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>mandatory</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>requisite</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>optional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='subsysType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pci</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>scsi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='capsType'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='pciBackend'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </hostdev>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <rng supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtio-non-transitional</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>random</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>egd</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>builtin</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </rng>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <filesystem supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='driverType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>path</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>handle</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>virtiofs</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </filesystem>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <tpm supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tpm-tis</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tpm-crb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>emulator</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>external</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendVersion'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>2.0</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </tpm>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <redirdev supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='bus'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>usb</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </redirdev>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <channel supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pty</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>unix</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </channel>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <crypto supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>qemu</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendModel'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>builtin</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </crypto>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <interface supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='backendType'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>default</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>passt</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </interface>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <panic supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='model'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>isa</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>hyperv</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </panic>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <console supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='type'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>null</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vc</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pty</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dev</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>file</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>pipe</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>stdio</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>udp</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tcp</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>unix</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>qemu-vdagent</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>dbus</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </console>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </devices>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   <features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <gic supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <vmcoreinfo supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <genid supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <backingStoreInput supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <backup supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <async-teardown supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <ps2 supported='yes'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <sev supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <sgx supported='no'/>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <hyperv supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='features'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>relaxed</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vapic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>spinlocks</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vpindex</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>runtime</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>synic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>stimer</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>reset</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>vendor_id</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>frequencies</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>reenlightenment</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tlbflush</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>ipi</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>avic</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>emsr_bitmap</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>xmm_input</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <defaults>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <spinlocks>4095</spinlocks>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <stimer_direct>on</stimer_direct>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </defaults>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </hyperv>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     <launchSecurity supported='yes'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       <enum name='sectype'>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:         <value>tdx</value>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:       </enum>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:     </launchSecurity>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:   </features>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: </domainCapabilities>
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.596 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.596 229593 INFO nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Secure Boot support detected
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.599 229593 INFO nova.virt.libvirt.driver [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.599 229593 INFO nova.virt.libvirt.driver [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.612 229593 DEBUG nova.virt.libvirt.driver [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.647 229593 INFO nova.virt.node [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Determined node identity c79215b2-6762-4f7f-a322-f44db2b0b9bd from /var/lib/nova/compute_id
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.666 229593 DEBUG nova.compute.manager [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Verified node c79215b2-6762-4f7f-a322-f44db2b0b9bd matches my host np0005541913.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.711 229593 DEBUG nova.compute.manager [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.715 229593 DEBUG nova.virt.libvirt.vif [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-02T08:31:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005541913.localdomain',hostname='test',id=2,image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T08:31:55Z,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005541913.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e2d97696ab6749899bb8ba5ce29a3de2',ramdisk_id='',reservation_id='r-6ofcfgb6',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T08:31:55Z,user_data=None,user_id='cb8b7d2a63b642aa999db12e17eeb9e4',uuid=b254bb7f-2891-4b37-9c44-9700e301ce16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.715 229593 DEBUG nova.network.os_vif_util [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Converting VIF {"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.715 229593 DEBUG nova.network.os_vif_util [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.716 229593 DEBUG os_vif [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.740 229593 DEBUG ovsdbapp.backend.ovs_idl [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.740 229593 DEBUG ovsdbapp.backend.ovs_idl [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.740 229593 DEBUG ovsdbapp.backend.ovs_idl [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.740 229593 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.741 229593 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.741 229593 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.741 229593 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.742 229593 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.744 229593 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.755 229593 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.755 229593 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.756 229593 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:34:53 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:53.756 229593 INFO oslo.privsep.daemon [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpokq8zc9r/privsep.sock']
Dec 02 09:34:53 np0005541913.localdomain python3.9[229954]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.309 229593 INFO oslo.privsep.daemon [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Spawned new privsep daemon via rootwrap
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.214 229976 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.220 229976 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.225 229976 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.225 229976 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229976
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.605 229593 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.605 229593 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a318f6a-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.606 229593 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a318f6a-b3, col_values=(('external_ids', {'iface-id': '4a318f6a-b3c1-4690-8246-f7d046ccd64a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:b2:03', 'vm-uuid': 'b254bb7f-2891-4b37-9c44-9700e301ce16'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.607 229593 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.608 229593 INFO os_vif [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3')
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.608 229593 DEBUG nova.compute.manager [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.612 229593 DEBUG nova.compute.manager [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Dec 02 09:34:54 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:54.613 229593 INFO nova.compute.manager [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 02 09:34:55 np0005541913.localdomain sudo[230070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwawowyrlbsxbqcleoewrrjjcamddrmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668094.5172305-4265-141791782429385/AnsiballZ_podman_container.py
Dec 02 09:34:55 np0005541913.localdomain sudo[230070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.069 229593 INFO nova.service [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Updating service version for nova-compute on np0005541913.localdomain from 57 to 66
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.102 229593 DEBUG oslo_concurrency.lockutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.103 229593 DEBUG oslo_concurrency.lockutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.103 229593 DEBUG oslo_concurrency.lockutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.104 229593 DEBUG nova.compute.resource_tracker [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.105 229593 DEBUG oslo_concurrency.processutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:34:55 np0005541913.localdomain python3.9[230072]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 02 09:34:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8101 DF PROTO=TCP SPT=46342 DPT=9101 SEQ=3209027010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799D0E50000000001030307) 
Dec 02 09:34:55 np0005541913.localdomain systemd-journald[47611]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 122.2 (407 of 333 items), suggesting rotation.
Dec 02 09:34:55 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 09:34:55 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:34:55 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:34:55 np0005541913.localdomain sudo[230070]: pam_unix(sudo:session): session closed for user root
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.572 229593 DEBUG oslo_concurrency.processutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.664 229593 DEBUG nova.virt.libvirt.driver [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:34:55 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:55.664 229593 DEBUG nova.virt.libvirt.driver [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:34:55 np0005541913.localdomain systemd[1]: Started libvirt nodedev daemon.
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.022 229593 WARNING nova.virt.libvirt.driver [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.023 229593 DEBUG nova.compute.resource_tracker [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12949MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.023 229593 DEBUG oslo_concurrency.lockutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.024 229593 DEBUG oslo_concurrency.lockutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:34:56 np0005541913.localdomain sudo[230306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdftcrjwgxnrdxdjiskgwggrswnqvbdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668095.7246878-4289-87293875337135/AnsiballZ_systemd.py
Dec 02 09:34:56 np0005541913.localdomain sudo[230306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.211 229593 DEBUG nova.compute.resource_tracker [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.212 229593 DEBUG nova.compute.resource_tracker [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.212 229593 DEBUG nova.compute.resource_tracker [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.229 229593 DEBUG nova.scheduler.client.report [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.248 229593 DEBUG nova.scheduler.client.report [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.248 229593 DEBUG nova.compute.provider_tree [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.264 229593 DEBUG nova.scheduler.client.report [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.290 229593 DEBUG nova.scheduler.client.report [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX2,HW_CPU_X86_ABM,HW_CPU_X86_SHA,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SVM,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.342 229593 DEBUG oslo_concurrency.processutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:34:56 np0005541913.localdomain python3.9[230308]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.774 229593 DEBUG oslo_concurrency.processutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.779 229593 DEBUG nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.779 229593 INFO nova.virt.libvirt.host [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] kernel doesn't support AMD SEV
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.780 229593 DEBUG nova.compute.provider_tree [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.780 229593 DEBUG nova.virt.libvirt.driver [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.823 229593 DEBUG nova.scheduler.client.report [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Updated inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.824 229593 DEBUG nova.compute.provider_tree [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Updating resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.824 229593 DEBUG nova.compute.provider_tree [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.894 229593 DEBUG nova.compute.provider_tree [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Updating resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.914 229593 DEBUG nova.compute.resource_tracker [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.915 229593 DEBUG oslo_concurrency.lockutils [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.891s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.915 229593 DEBUG nova.service [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.974 229593 DEBUG nova.service [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 02 09:34:56 np0005541913.localdomain nova_compute[229589]: 2025-12-02 09:34:56.975 229593 DEBUG nova.servicegroup.drivers.db [None req-6701026c-4a93-489b-bff1-de903c44d2f8 - - - - - -] DB_Driver: join new ServiceGroup member np0005541913.localdomain to the compute group, service = <Service: host=np0005541913.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 02 09:34:57 np0005541913.localdomain systemd[1]: Stopping nova_compute container...
Dec 02 09:34:57 np0005541913.localdomain virtqemud[203664]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 02 09:34:57 np0005541913.localdomain systemd[1]: libpod-a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f.scope: Deactivated successfully.
Dec 02 09:34:57 np0005541913.localdomain virtqemud[203664]: hostname: np0005541913.localdomain
Dec 02 09:34:57 np0005541913.localdomain virtqemud[203664]: End of file while reading data: Input/output error
Dec 02 09:34:57 np0005541913.localdomain systemd[1]: libpod-a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f.scope: Consumed 4.487s CPU time.
Dec 02 09:34:57 np0005541913.localdomain podman[230334]: 2025-12-02 09:34:57.497646303 +0000 UTC m=+0.080513993 container died a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:34:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f-userdata-shm.mount: Deactivated successfully.
Dec 02 09:34:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0-merged.mount: Deactivated successfully.
Dec 02 09:34:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8102 DF PROTO=TCP SPT=46342 DPT=9101 SEQ=3209027010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799E0A50000000001030307) 
Dec 02 09:35:02 np0005541913.localdomain podman[230334]: 2025-12-02 09:35:02.492121909 +0000 UTC m=+5.074989539 container cleanup a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:35:02 np0005541913.localdomain podman[230334]: nova_compute
Dec 02 09:35:02 np0005541913.localdomain podman[230622]: error opening file `/run/crun/a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f/status`: No such file or directory
Dec 02 09:35:02 np0005541913.localdomain podman[230611]: 2025-12-02 09:35:02.576138335 +0000 UTC m=+0.052381467 container cleanup a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 09:35:02 np0005541913.localdomain podman[230611]: nova_compute
Dec 02 09:35:02 np0005541913.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 02 09:35:02 np0005541913.localdomain systemd[1]: Stopped nova_compute container.
Dec 02 09:35:02 np0005541913.localdomain systemd[1]: Starting nova_compute container...
Dec 02 09:35:02 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:35:02 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 02 09:35:02 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 09:35:02 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 09:35:02 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 09:35:02 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 09:35:02 np0005541913.localdomain podman[230624]: 2025-12-02 09:35:02.711555401 +0000 UTC m=+0.103900359 container init a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:35:02 np0005541913.localdomain podman[230624]: 2025-12-02 09:35:02.721880132 +0000 UTC m=+0.114225070 container start a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:35:02 np0005541913.localdomain podman[230624]: nova_compute
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + sudo -E kolla_set_configs
Dec 02 09:35:02 np0005541913.localdomain systemd[1]: Started nova_compute container.
Dec 02 09:35:02 np0005541913.localdomain sudo[230306]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Validating config file
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying service configuration files
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /etc/ceph
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Creating directory /etc/ceph
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/ceph
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Writing out command to execute
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: ++ cat /run_command
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + CMD=nova-compute
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + ARGS=
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + sudo kolla_copy_cacerts
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + [[ ! -n '' ]]
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + . kolla_extend_start
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + echo 'Running command: '\''nova-compute'\'''
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: Running command: 'nova-compute'
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + umask 0022
Dec 02 09:35:02 np0005541913.localdomain nova_compute[230637]: + exec nova-compute
Dec 02 09:35:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:35:03.016 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:35:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:35:03.017 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:35:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:35:03.018 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:35:03 np0005541913.localdomain sudo[230757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsiewwfoqyhmbhmwbxalqtxbkctfvxma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668103.0005476-4317-266386413588385/AnsiballZ_podman_container.py
Dec 02 09:35:03 np0005541913.localdomain sudo[230757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:03 np0005541913.localdomain python3.9[230759]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None 
preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 02 09:35:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6762 DF PROTO=TCP SPT=57932 DPT=9102 SEQ=3257595781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799F27E0000000001030307) 
Dec 02 09:35:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55579 DF PROTO=TCP SPT=35406 DPT=9105 SEQ=372350077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799F2FE0000000001030307) 
Dec 02 09:35:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:04.564 230641 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:35:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:04.565 230641 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:35:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:04.565 230641 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:35:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:04.565 230641 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 02 09:35:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:04.691 230641 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:35:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:04.713 230641 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:35:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:04.713 230641 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.082 230641 INFO nova.virt.driver [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 02 09:35:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:35:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a.scope.
Dec 02 09:35:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:35:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 02 09:35:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 02 09:35:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.213 230641 INFO nova.compute.provider_config [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.221 230641 WARNING nova.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.221 230641 DEBUG oslo_concurrency.lockutils [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.221 230641 DEBUG oslo_concurrency.lockutils [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.222 230641 DEBUG oslo_concurrency.lockutils [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.222 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.222 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.222 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.223 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.223 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.223 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.223 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.223 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.223 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.224 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.224 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.224 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.224 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain podman[230798]: 2025-12-02 09:35:05.223765381 +0000 UTC m=+0.079177125 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.224 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.224 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.225 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.225 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.225 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.225 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.225 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] console_host                   = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.225 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.225 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.225 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.226 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.226 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.226 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.226 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.226 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.226 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.227 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.227 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.227 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.227 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.227 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.227 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.227 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.228 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.228 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.228 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.228 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.228 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.228 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.228 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.229 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.229 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.229 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.229 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.229 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.229 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.230 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.230 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.230 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.230 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.230 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.230 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.231 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.231 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.231 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.231 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain podman[230798]: 2025-12-02 09:35:05.231535093 +0000 UTC m=+0.086946807 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.231 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.231 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.231 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.231 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.232 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.232 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.232 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.232 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.232 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.232 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.232 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.233 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.233 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.233 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.233 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.233 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.233 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.233 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.234 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.234 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.234 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.234 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.234 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.234 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.234 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.234 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.235 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.235 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.235 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.235 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.235 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.235 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.235 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.236 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.236 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.236 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.236 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.236 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.236 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.236 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.236 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.237 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.237 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.237 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.237 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.237 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.237 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.237 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.238 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.238 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.238 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.238 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.238 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.238 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.238 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.239 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.239 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.239 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.239 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.239 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.240 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.240 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.240 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.240 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.240 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.240 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.240 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.240 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.241 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.241 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.241 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.241 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.241 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.241 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.241 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.242 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.242 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.242 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.242 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.242 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.242 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.242 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.243 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.243 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.243 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.243 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.243 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.243 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.243 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.244 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.244 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.244 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.244 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.244 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.245 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.245 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.245 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.245 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.246 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.246 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.246 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain podman[230785]: 2025-12-02 09:35:05.245588706 +0000 UTC m=+0.176622019 container init ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=nova_compute_init)
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.246 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.246 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.246 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.246 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.247 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.247 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.247 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.247 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.247 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.247 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.248 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.248 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.248 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.248 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.248 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.248 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.248 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.249 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.249 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.249 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.249 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.249 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.249 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.249 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.250 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.250 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.250 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.250 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.250 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.250 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.250 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.251 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.251 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.251 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.251 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.251 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.251 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.251 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.252 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.252 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.252 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.252 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.252 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.252 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.252 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.253 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.253 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.253 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.253 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.253 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.253 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.253 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.254 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.254 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.254 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.254 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.254 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain podman[230785]: 2025-12-02 09:35:05.254544819 +0000 UTC m=+0.185578132 container start ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.254 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.255 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.255 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.255 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.255 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.255 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.255 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.256 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.256 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.256 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.256 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.256 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.256 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.256 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.257 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.257 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.257 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.257 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.257 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.257 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain python3.9[230759]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.257 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.258 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.259 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.259 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.259 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.260 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.260 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.260 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.261 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.261 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.261 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.262 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.262 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.262 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.263 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.263 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.263 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.264 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.264 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.264 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.264 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.265 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.265 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.265 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.266 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.266 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.266 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.267 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.267 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.267 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.267 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.268 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.268 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.268 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.269 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.269 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.269 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.270 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.270 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.270 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.270 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.271 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.271 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.271 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.272 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.272 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.272 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.273 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.273 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.273 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.273 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.274 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.274 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.274 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.275 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.275 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.275 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.275 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.276 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.276 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.276 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.277 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.277 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.277 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.278 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.278 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.278 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.279 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.279 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.279 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.279 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.280 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.280 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.280 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.281 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.281 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.281 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.281 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.282 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.282 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.282 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.283 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.283 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.283 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.284 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.284 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.284 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.284 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.285 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.285 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.285 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.286 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.286 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.286 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.286 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.286 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.286 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.287 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.287 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.287 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.287 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.287 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.288 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.288 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.288 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.288 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.289 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.289 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.289 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.289 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.289 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.290 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.290 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.290 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.290 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.290 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.291 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.291 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.291 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.291 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.292 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.292 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.292 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.292 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.292 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.293 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.293 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.293 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.293 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.294 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.294 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.294 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.294 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.294 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.294 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.295 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.295 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.295 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.295 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.295 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.296 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.296 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.296 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.296 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.296 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.297 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.297 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.297 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.297 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.297 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.298 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.298 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.298 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.298 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.299 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.299 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.299 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.299 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.299 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.300 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.300 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.300 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.300 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.300 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.301 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.301 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.301 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.301 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.302 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.302 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.302 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.302 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.302 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.303 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.303 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.303 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.303 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.304 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.304 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.304 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.304 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.304 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.305 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.305 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.305 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.305 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.305 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.306 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.306 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.306 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.306 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.306 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.307 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.307 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.307 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.307 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.307 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.308 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.308 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.308 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.308 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.308 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.309 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.309 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.309 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.309 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.309 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.310 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.310 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.310 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.310 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.310 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.311 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.311 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.311 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.311 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.311 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.312 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.312 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.312 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.312 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.313 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.313 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.313 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.313 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.313 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.313 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.314 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.314 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.314 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.314 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.315 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Applying nova statedir ownership
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.315 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.315 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.316 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16/
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16 already 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.316 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16 to system_u:object_r:container_file_t:s0
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16/console.log
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.316 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/4ee0f3f792b433d78f415a6f600ca9c7d9f0adb3
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.316 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.317 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-4ee0f3f792b433d78f415a6f600ca9c7d9f0adb3
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.317 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.317 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.317 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.317 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.318 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.318 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/20273498b7380904530133bcb3f720bd45f4f00b810dc4597d81d23acd8f9673
Dec 02 09:35:05 np0005541913.localdomain nova_compute_init[230823]: INFO:nova_statedir:Nova statedir ownership complete
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.318 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.318 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.318 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.319 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.319 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.319 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.319 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.319 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.320 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.320 230641 WARNING oslo_config.cfg [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: and ``live_migration_inbound_addr`` respectively.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: ).  Its value may be silently ignored in the future.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.320 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.320 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.321 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.321 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.321 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.321 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.322 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.322 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.322 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.322 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.322 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.323 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.323 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.323 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.323 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.324 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.324 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.324 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.324 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rbd_secret_uuid        = c7c8e171-a193-56fb-95fa-8879fcfa7074 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.324 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.325 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.325 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.325 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.325 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.325 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.326 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.326 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.326 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.326 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.326 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain systemd[1]: libpod-ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a.scope: Deactivated successfully.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.326 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.328 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.328 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.328 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.329 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.329 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.329 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.329 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.329 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.329 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.329 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.329 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.330 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.330 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.330 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.330 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.331 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.331 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.331 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.331 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.331 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.331 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.332 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.332 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.332 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.332 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.332 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.332 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.332 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.332 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.333 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.333 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.333 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.333 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.333 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.334 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.334 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.334 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.334 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.334 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.334 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.334 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.334 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.335 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.335 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.335 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.335 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.335 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.335 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.335 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.336 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.336 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.336 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.336 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.336 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.336 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.337 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.337 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.337 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.337 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.337 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.337 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.337 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.337 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.338 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.338 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.338 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.338 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.338 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.338 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.338 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.338 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.339 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.339 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.339 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.339 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.339 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.339 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.339 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.340 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.340 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.340 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.340 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.340 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.340 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.340 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.340 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.341 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.341 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.341 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.341 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.341 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.341 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.341 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.342 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.342 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.342 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.342 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.342 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.342 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.342 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.343 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.343 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.343 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.343 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.343 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.343 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.343 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.343 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.344 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.344 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.344 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.344 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.344 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.344 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.345 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.345 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.345 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.345 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.345 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.345 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.345 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.346 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.346 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.346 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.346 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.346 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.346 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.346 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.347 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.347 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.347 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.347 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.347 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.347 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.347 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.348 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.348 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.348 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.348 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.348 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.348 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.348 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.348 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.349 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.349 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.349 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.349 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.349 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.349 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.349 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.350 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.350 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.350 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.350 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.350 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.350 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.350 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.351 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.351 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.351 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.351 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.351 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.351 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.351 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.351 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.352 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.352 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.352 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.352 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.352 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.352 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.352 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.353 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.353 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.353 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.353 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.353 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.353 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.353 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.354 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.354 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.354 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.354 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.354 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.354 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.354 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.354 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.355 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.355 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.355 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.355 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.355 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.355 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.355 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.356 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.356 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.356 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.356 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.356 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.356 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.356 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.356 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.357 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.357 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.357 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.357 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.357 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.357 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.357 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.357 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.358 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.358 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.358 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.358 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.358 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.358 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.358 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.359 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.359 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.359 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.359 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.359 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.359 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.360 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.360 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.360 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.360 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.360 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.360 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.360 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.360 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.361 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.361 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.361 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.361 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.361 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.361 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.361 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.362 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.362 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.362 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.362 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.362 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.362 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.362 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.363 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.363 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.363 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.363 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.363 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.363 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.363 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.364 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.364 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.364 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.364 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.364 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.364 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.364 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.365 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.365 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.365 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.365 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.365 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.365 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.365 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.366 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.366 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.366 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.366 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.366 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.366 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.366 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.366 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.367 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.367 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.367 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.367 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.367 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.367 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.367 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.368 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.368 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.368 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.368 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.368 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.368 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.368 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.369 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.369 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.369 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.369 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.369 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.369 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.369 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.370 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.370 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.370 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.370 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.370 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.370 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.370 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.370 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.371 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.371 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.371 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.371 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.371 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.371 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.371 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.372 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.372 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.372 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.372 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.372 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.372 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.372 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.373 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.373 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.373 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.373 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.373 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.373 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.373 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.374 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.374 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.374 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.374 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.374 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.374 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.374 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.375 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.375 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.375 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.375 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.375 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.375 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.375 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.376 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.376 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.376 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.376 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.376 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.376 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.376 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.377 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.377 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.377 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.377 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.377 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.377 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.377 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.378 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.378 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.378 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.378 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.378 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.378 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.378 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.379 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.379 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.379 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.379 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.379 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.379 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.379 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.380 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.380 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.380 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.380 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.380 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.380 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.381 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.381 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.381 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.381 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.381 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.381 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.381 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.381 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.382 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.382 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.382 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.382 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.382 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.382 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.382 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.383 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.383 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.383 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.383 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.383 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.383 230641 DEBUG oslo_service.service [None req-b5362df6-6f13-436d-974e-686253e69105 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.386 230641 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 02 09:35:05 np0005541913.localdomain podman[230838]: 2025-12-02 09:35:05.415904422 +0000 UTC m=+0.068398634 container died ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:35:05 np0005541913.localdomain sudo[230757]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.424 230641 INFO nova.virt.node [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Determined node identity c79215b2-6762-4f7f-a322-f44db2b0b9bd from /var/lib/nova/compute_id
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.425 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.426 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.426 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.426 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.435 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7c93d30580> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.436 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7c93d30580> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.437 230641 INFO nova.virt.libvirt.driver [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Connection event '1' reason 'None'
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.444 230641 INFO nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Libvirt host capabilities <capabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <host>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <uuid>f041467c-26d0-44b9-832e-8db5f9b7a49d</uuid>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <arch>x86_64</arch>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model>EPYC-Rome-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <vendor>AMD</vendor>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <microcode version='16777317'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <signature family='23' model='49' stepping='0'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='x2apic'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='tsc-deadline'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='osxsave'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='hypervisor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='tsc_adjust'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='spec-ctrl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='stibp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='arch-capabilities'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='cmp_legacy'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='topoext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='virt-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='lbrv'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='tsc-scale'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='vmcb-clean'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='pause-filter'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='pfthreshold'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='svme-addr-chk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='rdctl-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='skip-l1dfl-vmentry'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='mds-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature name='pschange-mc-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <pages unit='KiB' size='4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <pages unit='KiB' size='2048'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <pages unit='KiB' size='1048576'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <power_management>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <suspend_mem/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <suspend_disk/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <suspend_hybrid/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </power_management>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <iommu support='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <migration_features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <live/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <uri_transports>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <uri_transport>tcp</uri_transport>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <uri_transport>rdma</uri_transport>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </uri_transports>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </migration_features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <topology>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <cells num='1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <cell id='0'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:           <memory unit='KiB'>16116612</memory>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:           <pages unit='KiB' size='4'>4029153</pages>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:           <pages unit='KiB' size='2048'>0</pages>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:           <distances>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <sibling id='0' value='10'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:           </distances>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:           <cpus num='8'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:           </cpus>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         </cell>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </cells>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </topology>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <cache>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </cache>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <secmodel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model>selinux</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <doi>0</doi>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </secmodel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <secmodel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model>dac</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <doi>0</doi>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </secmodel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </host>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <guest>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <os_type>hvm</os_type>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <arch name='i686'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <wordsize>32</wordsize>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <domain type='qemu'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <domain type='kvm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </arch>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <pae/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <nonpae/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <acpi default='on' toggle='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <apic default='on' toggle='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <cpuselection/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <deviceboot/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <disksnapshot default='on' toggle='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <externalSnapshot/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </guest>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <guest>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <os_type>hvm</os_type>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <arch name='x86_64'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <wordsize>64</wordsize>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <domain type='qemu'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <domain type='kvm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </arch>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <acpi default='on' toggle='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <apic default='on' toggle='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <cpuselection/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <deviceboot/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <disksnapshot default='on' toggle='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <externalSnapshot/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </guest>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: </capabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 
Dec 02 09:35:05 np0005541913.localdomain podman[230838]: 2025-12-02 09:35:05.450827232 +0000 UTC m=+0.103321354 container cleanup ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, container_name=nova_compute_init, managed_by=edpm_ansible)
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.453 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 02 09:35:05 np0005541913.localdomain systemd[1]: libpod-conmon-ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a.scope: Deactivated successfully.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.458 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: <domainCapabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <domain>kvm</domain>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <arch>i686</arch>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <vcpu max='1024'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <iothreads supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <os supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <enum name='firmware'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <loader supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>rom</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pflash</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='readonly'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>yes</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>no</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='secure'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>no</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </loader>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </os>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>on</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>off</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='maximum' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='maximumMigratable'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>on</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>off</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='host-model' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <vendor>AMD</vendor>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='x2apic'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='stibp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='succor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='lbrv'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='custom' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Dhyana-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Genoa'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='auto-ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='auto-ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-128'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-256'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-512'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='KnightsMill'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4fmaps'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4vnniw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512er'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512pf'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='KnightsMill-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4fmaps'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4vnniw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512er'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512pf'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tbm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tbm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SierraForest'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ne-convert'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cmpccxadd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SierraForest-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ne-convert'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cmpccxadd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='athlon'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='athlon-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='core2duo'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='core2duo-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='coreduo'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='coreduo-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='n270'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='n270-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='phenom'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='phenom-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <memoryBacking supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <enum name='sourceType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>file</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>anonymous</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>memfd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </memoryBacking>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <devices>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <disk supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='diskDevice'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>disk</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>cdrom</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>floppy</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>lun</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='bus'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>fdc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>scsi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>sata</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-non-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </disk>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <graphics supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vnc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>egl-headless</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dbus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </graphics>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <video supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='modelType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vga</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>cirrus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>none</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>bochs</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ramfb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </video>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <hostdev supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='mode'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>subsystem</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='startupPolicy'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>default</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>mandatory</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>requisite</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>optional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='subsysType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pci</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>scsi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='capsType'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='pciBackend'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </hostdev>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <rng supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-non-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>random</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>egd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>builtin</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </rng>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <filesystem supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='driverType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>path</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>handle</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtiofs</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </filesystem>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <tpm supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tpm-tis</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tpm-crb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>emulator</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>external</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendVersion'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>2.0</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </tpm>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <redirdev supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='bus'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </redirdev>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <channel supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pty</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>unix</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </channel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <crypto supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>qemu</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>builtin</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </crypto>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <interface supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>default</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>passt</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </interface>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <panic supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>isa</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>hyperv</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </panic>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <console supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>null</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pty</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dev</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>file</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pipe</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>stdio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>udp</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tcp</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>unix</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>qemu-vdagent</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dbus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </console>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </devices>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <gic supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <vmcoreinfo supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <genid supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <backingStoreInput supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <backup supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <async-teardown supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <ps2 supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <sev supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <sgx supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <hyperv supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='features'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>relaxed</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vapic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>spinlocks</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vpindex</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>runtime</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>synic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>stimer</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>reset</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vendor_id</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>frequencies</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>reenlightenment</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tlbflush</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ipi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>avic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>emsr_bitmap</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>xmm_input</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <defaults>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <spinlocks>4095</spinlocks>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <stimer_direct>on</stimer_direct>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </defaults>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </hyperv>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <launchSecurity supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='sectype'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tdx</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </launchSecurity>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: </domainCapabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.463 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: <domainCapabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <domain>kvm</domain>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <arch>i686</arch>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <vcpu max='240'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <iothreads supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <os supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <enum name='firmware'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <loader supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>rom</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pflash</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='readonly'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>yes</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>no</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='secure'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>no</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </loader>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </os>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>on</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>off</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='maximum' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='maximumMigratable'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>on</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>off</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='host-model' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <vendor>AMD</vendor>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='x2apic'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='stibp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='succor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='lbrv'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='custom' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Dhyana-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Genoa'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='auto-ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='auto-ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-128'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-256'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-512'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='KnightsMill'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4fmaps'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4vnniw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512er'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512pf'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='KnightsMill-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4fmaps'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4vnniw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512er'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512pf'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tbm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tbm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SierraForest'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ne-convert'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cmpccxadd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SierraForest-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ne-convert'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cmpccxadd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='athlon'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='athlon-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='core2duo'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='core2duo-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='coreduo'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='coreduo-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='n270'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='n270-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='phenom'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='phenom-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <memoryBacking supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <enum name='sourceType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>file</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>anonymous</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>memfd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </memoryBacking>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <devices>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <disk supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='diskDevice'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>disk</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>cdrom</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>floppy</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>lun</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='bus'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ide</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>fdc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>scsi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>sata</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-non-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </disk>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <graphics supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vnc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>egl-headless</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dbus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </graphics>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <video supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='modelType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vga</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>cirrus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>none</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>bochs</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ramfb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </video>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <hostdev supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='mode'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>subsystem</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='startupPolicy'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>default</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>mandatory</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>requisite</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>optional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='subsysType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pci</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>scsi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='capsType'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='pciBackend'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </hostdev>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <rng supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-non-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>random</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>egd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>builtin</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </rng>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <filesystem supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='driverType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>path</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>handle</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtiofs</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </filesystem>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <tpm supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tpm-tis</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tpm-crb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>emulator</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>external</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendVersion'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>2.0</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </tpm>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <redirdev supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='bus'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </redirdev>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <channel supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pty</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>unix</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </channel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <crypto supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>qemu</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>builtin</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </crypto>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <interface supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>default</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>passt</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </interface>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <panic supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>isa</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>hyperv</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </panic>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <console supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>null</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pty</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dev</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>file</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pipe</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>stdio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>udp</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tcp</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>unix</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>qemu-vdagent</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dbus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </console>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </devices>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <gic supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <vmcoreinfo supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <genid supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <backingStoreInput supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <backup supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <async-teardown supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <ps2 supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <sev supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <sgx supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <hyperv supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='features'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>relaxed</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vapic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>spinlocks</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vpindex</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>runtime</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>synic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>stimer</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>reset</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vendor_id</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>frequencies</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>reenlightenment</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tlbflush</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ipi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>avic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>emsr_bitmap</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>xmm_input</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <defaults>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <spinlocks>4095</spinlocks>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <stimer_direct>on</stimer_direct>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </defaults>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </hyperv>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <launchSecurity supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='sectype'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tdx</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </launchSecurity>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: </domainCapabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.502 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.504 230641 DEBUG nova.virt.libvirt.volume.mount [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.507 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: <domainCapabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <domain>kvm</domain>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <arch>x86_64</arch>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <vcpu max='1024'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <iothreads supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <os supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <enum name='firmware'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>efi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <loader supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>rom</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pflash</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='readonly'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>yes</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>no</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='secure'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>yes</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>no</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </loader>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </os>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>on</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>off</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='maximum' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='maximumMigratable'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>on</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>off</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='host-model' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <vendor>AMD</vendor>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='x2apic'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='stibp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='succor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='lbrv'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='custom' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Dhyana-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Genoa'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='auto-ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='auto-ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-128'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-256'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-512'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='KnightsMill'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4fmaps'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4vnniw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512er'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512pf'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='KnightsMill-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4fmaps'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4vnniw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512er'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512pf'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tbm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tbm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SierraForest'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ne-convert'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cmpccxadd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SierraForest-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ne-convert'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cmpccxadd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='athlon'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='athlon-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='core2duo'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='core2duo-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='coreduo'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='coreduo-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='n270'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='n270-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='phenom'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='phenom-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <memoryBacking supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <enum name='sourceType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>file</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>anonymous</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>memfd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </memoryBacking>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <devices>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <disk supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='diskDevice'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>disk</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>cdrom</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>floppy</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>lun</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='bus'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>fdc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>scsi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>sata</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-non-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </disk>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <graphics supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vnc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>egl-headless</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dbus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </graphics>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <video supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='modelType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vga</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>cirrus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>none</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>bochs</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ramfb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </video>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <hostdev supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='mode'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>subsystem</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='startupPolicy'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>default</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>mandatory</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>requisite</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>optional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='subsysType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pci</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>scsi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='capsType'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='pciBackend'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </hostdev>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <rng supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-non-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>random</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>egd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>builtin</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </rng>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <filesystem supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='driverType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>path</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>handle</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtiofs</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </filesystem>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <tpm supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tpm-tis</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tpm-crb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>emulator</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>external</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendVersion'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>2.0</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </tpm>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <redirdev supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='bus'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </redirdev>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <channel supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pty</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>unix</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </channel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <crypto supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>qemu</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>builtin</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </crypto>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <interface supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>default</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>passt</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </interface>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <panic supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>isa</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>hyperv</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </panic>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <console supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>null</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pty</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dev</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>file</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pipe</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>stdio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>udp</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tcp</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>unix</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>qemu-vdagent</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dbus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </console>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </devices>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <gic supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <vmcoreinfo supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <genid supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <backingStoreInput supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <backup supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <async-teardown supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <ps2 supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <sev supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <sgx supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <hyperv supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='features'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>relaxed</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vapic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>spinlocks</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vpindex</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>runtime</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>synic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>stimer</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>reset</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vendor_id</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>frequencies</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>reenlightenment</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tlbflush</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ipi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>avic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>emsr_bitmap</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>xmm_input</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <defaults>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <spinlocks>4095</spinlocks>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <stimer_direct>on</stimer_direct>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </defaults>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </hyperv>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <launchSecurity supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='sectype'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tdx</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </launchSecurity>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: </domainCapabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.569 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: <domainCapabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <domain>kvm</domain>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <arch>x86_64</arch>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <vcpu max='240'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <iothreads supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <os supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <enum name='firmware'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <loader supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>rom</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pflash</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='readonly'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>yes</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>no</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='secure'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>no</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </loader>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </os>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>on</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>off</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='maximum' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='maximumMigratable'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>on</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>off</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='host-model' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <vendor>AMD</vendor>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='x2apic'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='stibp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='succor'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='lbrv'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <mode name='custom' supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Broadwell-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Cooperlake-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Denverton-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Dhyana-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Genoa'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='auto-ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='auto-ibrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amd-psfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='no-nested-data-bp'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='null-sel-clr-base'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='stibp-always-on'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='EPYC-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-128'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-256'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx10-512'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='prefetchiti'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Haswell-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='IvyBridge-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='KnightsMill'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4fmaps'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4vnniw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512er'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512pf'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='KnightsMill-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4fmaps'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-4vnniw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512er'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512pf'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tbm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fma4'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tbm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xop'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='amx-tile'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-bf16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-fp16'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bitalg'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vbmi2'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrc'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fzrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='la57'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='taa-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='tsx-ldtrk'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xfd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SierraForest'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ne-convert'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cmpccxadd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='SierraForest-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ifma'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-ne-convert'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx-vnni-int8'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='bus-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cmpccxadd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fbsdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='fsrs'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ibrs-all'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mcdt-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pbrsb-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='psdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='serialize'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vaes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='vpclmulqdq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='hle'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='rtm'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512bw'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512cd'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512dq'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512f'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='avx512vl'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='invpcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pcid'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='pku'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='mpx'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v2'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v3'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='core-capability'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='split-lock-detect'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='Snowridge-v4'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='cldemote'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='erms'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='gfni'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdir64b'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='movdiri'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='xsaves'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='athlon'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='athlon-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='core2duo'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='core2duo-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='coreduo'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='coreduo-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='n270'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='n270-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='ss'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='phenom'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <blockers model='phenom-v1'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnow'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <feature name='3dnowext'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </blockers>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </mode>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </cpu>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <memoryBacking supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <enum name='sourceType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>file</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>anonymous</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <value>memfd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </memoryBacking>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <devices>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <disk supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='diskDevice'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>disk</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>cdrom</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>floppy</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>lun</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='bus'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ide</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>fdc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>scsi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>sata</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-non-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </disk>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <graphics supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vnc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>egl-headless</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dbus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </graphics>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <video supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='modelType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vga</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>cirrus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>none</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>bochs</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ramfb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </video>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <hostdev supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='mode'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>subsystem</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='startupPolicy'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>default</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>mandatory</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>requisite</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>optional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='subsysType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pci</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>scsi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='capsType'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='pciBackend'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </hostdev>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <rng supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtio-non-transitional</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>random</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>egd</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>builtin</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </rng>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <filesystem supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='driverType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>path</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>handle</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>virtiofs</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </filesystem>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <tpm supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tpm-tis</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tpm-crb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>emulator</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>external</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendVersion'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>2.0</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </tpm>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <redirdev supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='bus'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>usb</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </redirdev>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <channel supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pty</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>unix</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </channel>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <crypto supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>qemu</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendModel'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>builtin</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </crypto>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <interface supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='backendType'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>default</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>passt</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </interface>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <panic supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='model'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>isa</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>hyperv</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </panic>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <console supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='type'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>null</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vc</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pty</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dev</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>file</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>pipe</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>stdio</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>udp</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tcp</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>unix</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>qemu-vdagent</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>dbus</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </console>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </devices>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   <features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <gic supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <vmcoreinfo supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <genid supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <backingStoreInput supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <backup supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <async-teardown supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <ps2 supported='yes'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <sev supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <sgx supported='no'/>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <hyperv supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='features'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>relaxed</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vapic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>spinlocks</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vpindex</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>runtime</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>synic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>stimer</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>reset</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>vendor_id</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>frequencies</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>reenlightenment</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tlbflush</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>ipi</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>avic</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>emsr_bitmap</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>xmm_input</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <defaults>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <spinlocks>4095</spinlocks>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <stimer_direct>on</stimer_direct>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </defaults>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </hyperv>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     <launchSecurity supported='yes'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       <enum name='sectype'>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:         <value>tdx</value>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:       </enum>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:     </launchSecurity>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:   </features>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: </domainCapabilities>
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.635 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.635 230641 INFO nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Secure Boot support detected
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.638 230641 INFO nova.virt.libvirt.driver [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.639 230641 INFO nova.virt.libvirt.driver [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.657 230641 DEBUG nova.virt.libvirt.driver [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.736 230641 INFO nova.virt.node [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Determined node identity c79215b2-6762-4f7f-a322-f44db2b0b9bd from /var/lib/nova/compute_id
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.814 230641 DEBUG nova.compute.manager [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Verified node c79215b2-6762-4f7f-a322-f44db2b0b9bd matches my host np0005541913.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.878 230641 DEBUG nova.compute.manager [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.881 230641 DEBUG nova.virt.libvirt.vif [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-02T08:31:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005541913.localdomain',hostname='test',id=2,image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T08:31:55Z,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005541913.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e2d97696ab6749899bb8ba5ce29a3de2',ramdisk_id='',reservation_id='r-6ofcfgb6',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T08:31:55Z,user_data=None,user_id='cb8b7d2a63b642aa999db12e17eeb9e4',uuid=b254bb7f-2891-4b37-9c44-9700e301ce16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.882 230641 DEBUG nova.network.os_vif_util [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Converting VIF {"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.882 230641 DEBUG nova.network.os_vif_util [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.883 230641 DEBUG os_vif [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.917 230641 DEBUG ovsdbapp.backend.ovs_idl [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.918 230641 DEBUG ovsdbapp.backend.ovs_idl [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.918 230641 DEBUG ovsdbapp.backend.ovs_idl [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.918 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.919 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.919 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.920 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.939 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.940 230641 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.940 230641 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:35:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:05.941 230641 INFO oslo.privsep.daemon [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpzgrglbqo/privsep.sock']
Dec 02 09:35:06 np0005541913.localdomain sshd[208830]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:35:06 np0005541913.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Dec 02 09:35:06 np0005541913.localdomain systemd[1]: session-54.scope: Consumed 2min 18.979s CPU time.
Dec 02 09:35:06 np0005541913.localdomain systemd-logind[757]: Session 54 logged out. Waiting for processes to exit.
Dec 02 09:35:06 np0005541913.localdomain systemd-logind[757]: Removed session 54.
Dec 02 09:35:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7-merged.mount: Deactivated successfully.
Dec 02 09:35:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a-userdata-shm.mount: Deactivated successfully.
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.543 230641 INFO oslo.privsep.daemon [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Spawned new privsep daemon via rootwrap
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.423 230903 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.429 230903 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.433 230903 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.433 230903 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230903
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.836 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.837 230641 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a318f6a-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.837 230641 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a318f6a-b3, col_values=(('external_ids', {'iface-id': '4a318f6a-b3c1-4690-8246-f7d046ccd64a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:b2:03', 'vm-uuid': 'b254bb7f-2891-4b37-9c44-9700e301ce16'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.838 230641 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.838 230641 INFO os_vif [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3')
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.838 230641 DEBUG nova.compute.manager [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.842 230641 DEBUG nova.compute.manager [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.842 230641 INFO nova.compute.manager [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.905 230641 DEBUG oslo_concurrency.lockutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.905 230641 DEBUG oslo_concurrency.lockutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.905 230641 DEBUG oslo_concurrency.lockutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.906 230641 DEBUG nova.compute.resource_tracker [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:35:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:06.907 230641 DEBUG oslo_concurrency.processutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:35:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6764 DF PROTO=TCP SPT=57932 DPT=9102 SEQ=3257595781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A4799FEA50000000001030307) 
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.352 230641 DEBUG oslo_concurrency.processutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.522 230641 DEBUG nova.virt.libvirt.driver [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.522 230641 DEBUG nova.virt.libvirt.driver [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.721 230641 WARNING nova.virt.libvirt.driver [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.724 230641 DEBUG nova.compute.resource_tracker [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12937MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.724 230641 DEBUG oslo_concurrency.lockutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.725 230641 DEBUG oslo_concurrency.lockutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.850 230641 DEBUG nova.compute.resource_tracker [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.851 230641 DEBUG nova.compute.resource_tracker [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.851 230641 DEBUG nova.compute.resource_tracker [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.893 230641 DEBUG nova.scheduler.client.report [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.949 230641 DEBUG nova.scheduler.client.report [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.950 230641 DEBUG nova.compute.provider_tree [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:35:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:07.992 230641 DEBUG nova.scheduler.client.report [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.028 230641 DEBUG nova.scheduler.client.report [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_VOLUME_EXTEND,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.067 230641 DEBUG oslo_concurrency.processutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.423 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.502 230641 DEBUG oslo_concurrency.processutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.508 230641 DEBUG nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.509 230641 INFO nova.virt.libvirt.host [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] kernel doesn't support AMD SEV
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.511 230641 DEBUG nova.compute.provider_tree [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.512 230641 DEBUG nova.virt.libvirt.driver [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.537 230641 DEBUG nova.scheduler.client.report [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.561 230641 DEBUG nova.compute.resource_tracker [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.561 230641 DEBUG oslo_concurrency.lockutils [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.837s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.562 230641 DEBUG nova.service [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.589 230641 DEBUG nova.service [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 02 09:35:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:08.590 230641 DEBUG nova.servicegroup.drivers.db [None req-4b1c03a2-4e75-4ab0-a7b4-31e0ad296a73 - - - - - -] DB_Driver: join new ServiceGroup member np0005541913.localdomain to the compute group, service = <Service: host=np0005541913.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 02 09:35:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28139 DF PROTO=TCP SPT=52202 DPT=9100 SEQ=179153176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A0A640000000001030307) 
Dec 02 09:35:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:35:10 np0005541913.localdomain podman[230951]: 2025-12-02 09:35:10.427495184 +0000 UTC m=+0.065146525 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:35:10 np0005541913.localdomain podman[230951]: 2025-12-02 09:35:10.459283559 +0000 UTC m=+0.096934900 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 09:35:10 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:35:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:10.926 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:11.592 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:35:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:11.617 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Triggering sync for uuid b254bb7f-2891-4b37-9c44-9700e301ce16 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 02 09:35:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:11.617 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:35:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:11.618 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:35:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:11.618 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:35:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:11.672 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:35:11 np0005541913.localdomain sshd[230976]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:35:12 np0005541913.localdomain sshd[230976]: Accepted publickey for zuul from 192.168.122.30 port 38172 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:35:12 np0005541913.localdomain systemd-logind[757]: New session 56 of user zuul.
Dec 02 09:35:12 np0005541913.localdomain systemd[1]: Started Session 56 of User zuul.
Dec 02 09:35:12 np0005541913.localdomain sshd[230976]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:35:12 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23699 DF PROTO=TCP SPT=58080 DPT=9100 SEQ=3870753883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A15E40000000001030307) 
Dec 02 09:35:13 np0005541913.localdomain python3.9[231087]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:35:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:35:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:13.466 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:13 np0005541913.localdomain podman[231092]: 2025-12-02 09:35:13.479048566 +0000 UTC m=+0.116114281 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:35:13 np0005541913.localdomain podman[231092]: 2025-12-02 09:35:13.513831552 +0000 UTC m=+0.150897237 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 09:35:13 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:35:14 np0005541913.localdomain sudo[231218]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jptmnhigagnrcrzuhqadqyhreazgzxix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668113.8177724-69-167081228041586/AnsiballZ_systemd_service.py
Dec 02 09:35:14 np0005541913.localdomain sudo[231218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:14 np0005541913.localdomain python3.9[231220]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:35:14 np0005541913.localdomain systemd-rc-local-generator[231248]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:35:14 np0005541913.localdomain systemd-sysv-generator[231251]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:14 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:15 np0005541913.localdomain sudo[231218]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:15 np0005541913.localdomain python3.9[231364]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:35:15 np0005541913.localdomain network[231381]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:35:15 np0005541913.localdomain network[231382]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:35:15 np0005541913.localdomain network[231383]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:35:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:15.930 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28141 DF PROTO=TCP SPT=52202 DPT=9100 SEQ=179153176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A22240000000001030307) 
Dec 02 09:35:18 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:35:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:18.510 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6766 DF PROTO=TCP SPT=57932 DPT=9102 SEQ=3257595781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A2DE40000000001030307) 
Dec 02 09:35:19 np0005541913.localdomain sudo[231436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:35:19 np0005541913.localdomain sudo[231436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:35:19 np0005541913.localdomain sudo[231436]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:19 np0005541913.localdomain sudo[231454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:35:19 np0005541913.localdomain sudo[231454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:35:20 np0005541913.localdomain sudo[231454]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:20 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:20.935 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:21 np0005541913.localdomain sudo[231556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:35:21 np0005541913.localdomain sudo[231556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:35:21 np0005541913.localdomain sudo[231556]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9496 DF PROTO=TCP SPT=41536 DPT=9101 SEQ=1365803905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A39F10000000001030307) 
Dec 02 09:35:23 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:23.542 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:24 np0005541913.localdomain sudo[231702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mesxzdnocylczyheuuqwrfphmmtbwkgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668124.2241392-126-82065249437646/AnsiballZ_systemd_service.py
Dec 02 09:35:24 np0005541913.localdomain sudo[231702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:24 np0005541913.localdomain python3.9[231704]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:35:24 np0005541913.localdomain sudo[231702]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9498 DF PROTO=TCP SPT=41536 DPT=9101 SEQ=1365803905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A45E40000000001030307) 
Dec 02 09:35:25 np0005541913.localdomain sudo[231813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkgnlpdfwdjoylmgeksexyxrjtoemegc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668125.1846066-156-82500919321501/AnsiballZ_file.py
Dec 02 09:35:25 np0005541913.localdomain sudo[231813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:25 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:25.939 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:26 np0005541913.localdomain python3.9[231815]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:26 np0005541913.localdomain sudo[231813]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:26 np0005541913.localdomain systemd-journald[47611]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 76.6 (255 of 333 items), suggesting rotation.
Dec 02 09:35:26 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 09:35:26 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:35:26 np0005541913.localdomain sudo[231924]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcvppqflxlpymhbjrkgzgxviyzmmvhci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668126.3252091-180-14022821049642/AnsiballZ_file.py
Dec 02 09:35:26 np0005541913.localdomain sudo[231924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:26 np0005541913.localdomain python3.9[231926]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:26 np0005541913.localdomain sudo[231924]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:27 np0005541913.localdomain sudo[232034]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thiuahlwsujmulnxvuaxunaqjaewjhfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668127.2237427-208-186521886689481/AnsiballZ_command.py
Dec 02 09:35:27 np0005541913.localdomain sudo[232034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:27 np0005541913.localdomain python3.9[232036]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:35:27 np0005541913.localdomain sudo[232034]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:28 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:28.577 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:28 np0005541913.localdomain python3.9[232146]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:35:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9499 DF PROTO=TCP SPT=41536 DPT=9101 SEQ=1365803905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A55A40000000001030307) 
Dec 02 09:35:29 np0005541913.localdomain sudo[232254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuayxsqkbyfopnocitdwvgjccpwbwdqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668129.469444-261-236527411372269/AnsiballZ_systemd_service.py
Dec 02 09:35:29 np0005541913.localdomain sudo[232254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:30 np0005541913.localdomain python3.9[232256]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:35:30 np0005541913.localdomain systemd-sysv-generator[232286]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:35:30 np0005541913.localdomain systemd-rc-local-generator[232283]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:30 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:30 np0005541913.localdomain sudo[232254]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:30 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:30.943 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:31 np0005541913.localdomain sudo[232399]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnriaugcwxbobakypsxdmywvcgrekmxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668130.9393213-285-92117129508102/AnsiballZ_command.py
Dec 02 09:35:31 np0005541913.localdomain sudo[232399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:31 np0005541913.localdomain python3.9[232401]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:35:31 np0005541913.localdomain sudo[232399]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:32 np0005541913.localdomain sudo[232510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlqgltsqyvxzfbddteujvifzhxkoaxzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668132.6537938-312-98704523072954/AnsiballZ_file.py
Dec 02 09:35:32 np0005541913.localdomain sudo[232510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:33 np0005541913.localdomain python3.9[232512]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:35:33 np0005541913.localdomain sudo[232510]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:33 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:33.592 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45032 DF PROTO=TCP SPT=47504 DPT=9102 SEQ=606966644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A67AD0000000001030307) 
Dec 02 09:35:33 np0005541913.localdomain python3.9[232620]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:35:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62666 DF PROTO=TCP SPT=46636 DPT=9105 SEQ=3780615674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A682E0000000001030307) 
Dec 02 09:35:34 np0005541913.localdomain python3.9[232730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:35 np0005541913.localdomain python3.9[232816]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668134.1926758-360-172485861246736/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=bfc5245921d5dcd25b60f488666da7c4ada35563 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:35:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:35:35 np0005541913.localdomain podman[232840]: 2025-12-02 09:35:35.45556416 +0000 UTC m=+0.083660078 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 09:35:35 np0005541913.localdomain podman[232840]: 2025-12-02 09:35:35.494468329 +0000 UTC m=+0.122564227 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 02 09:35:35 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:35:35 np0005541913.localdomain sudo[232944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bthmheasudljiqydthvpnvirjorgckmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668135.3767898-405-16473921696429/AnsiballZ_group.py
Dec 02 09:35:35 np0005541913.localdomain sudo[232944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:35 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:35.948 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:36 np0005541913.localdomain python3.9[232946]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 02 09:35:36 np0005541913.localdomain sudo[232944]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:36 np0005541913.localdomain sudo[233054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrnuecygfkmnmofmfyadswvwxugexspx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668136.4354982-438-40598609269635/AnsiballZ_getent.py
Dec 02 09:35:36 np0005541913.localdomain sudo[233054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:36 np0005541913.localdomain python3.9[233056]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 02 09:35:36 np0005541913.localdomain sudo[233054]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45034 DF PROTO=TCP SPT=47504 DPT=9102 SEQ=606966644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A73A40000000001030307) 
Dec 02 09:35:37 np0005541913.localdomain sudo[233165]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfvlnxpoymvvvrygozrbrpjsnxyfzlos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668137.2252266-462-25126719654840/AnsiballZ_group.py
Dec 02 09:35:37 np0005541913.localdomain sudo[233165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:37 np0005541913.localdomain python3.9[233167]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 09:35:37 np0005541913.localdomain groupadd[233168]: group added to /etc/group: name=ceilometer, GID=42405
Dec 02 09:35:37 np0005541913.localdomain groupadd[233168]: group added to /etc/gshadow: name=ceilometer
Dec 02 09:35:37 np0005541913.localdomain groupadd[233168]: new group: name=ceilometer, GID=42405
Dec 02 09:35:37 np0005541913.localdomain sudo[233165]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:38 np0005541913.localdomain sudo[233281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbxnuxlnuxvnlafytmlncafxifawvotg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668138.014656-486-22939140749080/AnsiballZ_user.py
Dec 02 09:35:38 np0005541913.localdomain sudo[233281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:38 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:38.629 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:38 np0005541913.localdomain python3.9[233283]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005541913.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 09:35:38 np0005541913.localdomain useradd[233285]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Dec 02 09:35:38 np0005541913.localdomain useradd[233285]: add 'ceilometer' to group 'libvirt'
Dec 02 09:35:38 np0005541913.localdomain useradd[233285]: add 'ceilometer' to shadow group 'libvirt'
Dec 02 09:35:38 np0005541913.localdomain sudo[233281]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34916 DF PROTO=TCP SPT=42940 DPT=9100 SEQ=3703420671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A7FA40000000001030307) 
Dec 02 09:35:40 np0005541913.localdomain python3.9[233399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:40 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:40.952 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:35:41 np0005541913.localdomain podman[233457]: 2025-12-02 09:35:41.447578128 +0000 UTC m=+0.084776228 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 09:35:41 np0005541913.localdomain podman[233457]: 2025-12-02 09:35:41.484043801 +0000 UTC m=+0.121241831 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 02 09:35:41 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:35:41 np0005541913.localdomain python3.9[233498]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764668140.0651753-564-75473137919702/.source.conf _original_basename=ceilometer.conf follow=False checksum=9b40aa523dc31738ea523cc852832670ccea382a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:42 np0005541913.localdomain python3.9[233618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:42 np0005541913.localdomain python3.9[233704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764668141.7342327-564-233550408512229/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63247 DF PROTO=TCP SPT=49968 DPT=9882 SEQ=4282664875 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A8BA40000000001030307) 
Dec 02 09:35:43 np0005541913.localdomain python3.9[233812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:43 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:43.629 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:43 np0005541913.localdomain python3.9[233898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764668142.7743542-564-168919232804067/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:35:44 np0005541913.localdomain podman[234007]: 2025-12-02 09:35:44.446154187 +0000 UTC m=+0.086195538 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 02 09:35:44 np0005541913.localdomain podman[234007]: 2025-12-02 09:35:44.481347855 +0000 UTC m=+0.121389246 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:35:44 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:35:44 np0005541913.localdomain python3.9[234006]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:35:45 np0005541913.localdomain python3.9[234133]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:35:45 np0005541913.localdomain python3.9[234241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:45 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:45.955 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34918 DF PROTO=TCP SPT=42940 DPT=9100 SEQ=3703420671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479A97640000000001030307) 
Dec 02 09:35:46 np0005541913.localdomain python3.9[234327]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668145.4032545-741-188944311389369/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:46 np0005541913.localdomain python3.9[234435]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:47 np0005541913.localdomain python3.9[234490]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:47 np0005541913.localdomain python3.9[234598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:48 np0005541913.localdomain python3.9[234684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668147.3689075-741-250313499665376/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:48 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:48.631 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:48 np0005541913.localdomain python3.9[234792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:49 np0005541913.localdomain python3.9[234878]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668148.4249544-741-8876780757638/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62670 DF PROTO=TCP SPT=46636 DPT=9105 SEQ=3780615674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479AA3E40000000001030307) 
Dec 02 09:35:49 np0005541913.localdomain python3.9[234986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:50 np0005541913.localdomain python3.9[235072]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668149.4338768-741-150443456949805/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:50 np0005541913.localdomain python3.9[235180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:50 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:50.958 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:51 np0005541913.localdomain python3.9[235266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668150.3922884-741-177399318022042/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:51 np0005541913.localdomain python3.9[235374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51926 DF PROTO=TCP SPT=52794 DPT=9101 SEQ=4271548448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479AAF200000000001030307) 
Dec 02 09:35:52 np0005541913.localdomain python3.9[235460]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668151.4334655-741-135975058895471/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:52 np0005541913.localdomain python3.9[235568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:53 np0005541913.localdomain python3.9[235654]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668152.570758-741-174614124792098/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:53 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:53.634 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:54 np0005541913.localdomain python3.9[235762]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:54 np0005541913.localdomain python3.9[235848]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668153.597405-741-253459030562561/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51928 DF PROTO=TCP SPT=52794 DPT=9101 SEQ=4271548448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479ABB240000000001030307) 
Dec 02 09:35:55 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:55.962 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:55 np0005541913.localdomain python3.9[235956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:56 np0005541913.localdomain python3.9[236042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668155.4869292-741-67139106266429/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:57 np0005541913.localdomain python3.9[236150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:35:58 np0005541913.localdomain python3.9[236236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668156.6512158-741-88948770972438/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:35:58 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:35:58.636 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:35:58 np0005541913.localdomain sudo[236344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlpitxnisnnzdfmpzopzandazahfahnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668158.589981-1206-64665930033839/AnsiballZ_file.py
Dec 02 09:35:58 np0005541913.localdomain sudo[236344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:59 np0005541913.localdomain python3.9[236346]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:35:59 np0005541913.localdomain sudo[236344]: pam_unix(sudo:session): session closed for user root
Dec 02 09:35:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51929 DF PROTO=TCP SPT=52794 DPT=9101 SEQ=4271548448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479ACAE40000000001030307) 
Dec 02 09:35:59 np0005541913.localdomain sudo[236454]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcrgwcgreiasrwczethzaoxnerorthuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668159.298164-1230-34276316059389/AnsiballZ_systemd_service.py
Dec 02 09:35:59 np0005541913.localdomain sudo[236454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:35:59 np0005541913.localdomain python3.9[236456]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:35:59 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:35:59 np0005541913.localdomain systemd-rc-local-generator[236480]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:35:59 np0005541913.localdomain systemd-sysv-generator[236484]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:35:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:35:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:36:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:00 np0005541913.localdomain systemd[1]: Listening on Podman API Socket.
Dec 02 09:36:00 np0005541913.localdomain sudo[236454]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:00 np0005541913.localdomain sudo[236603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfpfouyxfhehuaketkcwjogpgnachxms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668160.61259-1257-146607915792693/AnsiballZ_stat.py
Dec 02 09:36:00 np0005541913.localdomain sudo[236603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:00 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:00.964 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:01 np0005541913.localdomain python3.9[236605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:36:01 np0005541913.localdomain sudo[236603]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:01 np0005541913.localdomain sudo[236691]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qujnnjsgzjoujljmufcgdaykljinauwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668160.61259-1257-146607915792693/AnsiballZ_copy.py
Dec 02 09:36:01 np0005541913.localdomain sudo[236691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:01 np0005541913.localdomain python3.9[236693]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668160.61259-1257-146607915792693/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:36:01 np0005541913.localdomain sudo[236691]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:01 np0005541913.localdomain sudo[236746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cainiydycsnfahdmadoghixkdllwwtxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668160.61259-1257-146607915792693/AnsiballZ_stat.py
Dec 02 09:36:01 np0005541913.localdomain sudo[236746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:01 np0005541913.localdomain python3.9[236748]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:36:01 np0005541913.localdomain sudo[236746]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:02 np0005541913.localdomain sudo[236834]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lklvdnvmeikhcnulqklafgziqrelxyak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668160.61259-1257-146607915792693/AnsiballZ_copy.py
Dec 02 09:36:02 np0005541913.localdomain sudo[236834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:02 np0005541913.localdomain python3.9[236836]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668160.61259-1257-146607915792693/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:36:02 np0005541913.localdomain sudo[236834]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:36:03.018 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:36:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:36:03.018 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:36:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:36:03.020 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:36:03 np0005541913.localdomain sudo[236944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wizszkqpsqylntetxeojiofretrbtpqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668163.0795858-1341-62584817404954/AnsiballZ_container_config_data.py
Dec 02 09:36:03 np0005541913.localdomain sudo[236944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:03 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:03.639 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:03 np0005541913.localdomain python3.9[236946]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec 02 09:36:03 np0005541913.localdomain sudo[236944]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27093 DF PROTO=TCP SPT=53614 DPT=9102 SEQ=3585547451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479ADCDD0000000001030307) 
Dec 02 09:36:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39646 DF PROTO=TCP SPT=59352 DPT=9105 SEQ=3701199635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479ADD5F0000000001030307) 
Dec 02 09:36:04 np0005541913.localdomain sudo[237054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcnjxgeeycpbalozpuyljepsautedlde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668164.0308256-1368-266366283419463/AnsiballZ_container_config_hash.py
Dec 02 09:36:04 np0005541913.localdomain sudo[237054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:04.780 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:04.781 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:04.781 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:36:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:04.781 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:36:04 np0005541913.localdomain python3.9[237056]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:36:04 np0005541913.localdomain sudo[237054]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:05.970 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:36:06 np0005541913.localdomain systemd[1]: tmp-crun.muGbOk.mount: Deactivated successfully.
Dec 02 09:36:06 np0005541913.localdomain podman[237128]: 2025-12-02 09:36:06.445848665 +0000 UTC m=+0.077942014 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:36:06 np0005541913.localdomain podman[237128]: 2025-12-02 09:36:06.458975673 +0000 UTC m=+0.091069032 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 02 09:36:06 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:36:06 np0005541913.localdomain sudo[237183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhqwxqqdftworlsrygyjbkdhjgiujnte ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668165.3142514-1398-34125618955622/AnsiballZ_edpm_container_manage.py
Dec 02 09:36:06 np0005541913.localdomain sudo[237183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:06.957 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:36:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:06.957 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:36:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:06.957 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:36:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:06.957 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:36:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27095 DF PROTO=TCP SPT=53614 DPT=9102 SEQ=3585547451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479AE8E40000000001030307) 
Dec 02 09:36:07 np0005541913.localdomain python3[237185]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:36:07 np0005541913.localdomain python3[237185]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2",
                                                                    "Digest": "sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:21:53.58682213Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 505175293,
                                                                    "VirtualSize": 505175293,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:a47016624274f5ebad76019f5a2e465c1737f96caa539b36f90ab8e33592f415",
                                                                              "sha256:38a03f5e96658211fb28e2f87c11ffad531281d1797368f48e6cd4af7ac97c0e"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:14:56.244673147Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:14:56.960273159Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:37.588899909Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:41.197123864Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:19.680010224Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:53.584924649Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:56.278821402Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 02 09:36:07 np0005541913.localdomain podman[237234]: 2025-12-02 09:36:07.92202396 +0000 UTC m=+0.071424657 container remove 4dfd7f1c1fc63e05e05b9092f58ce77f66c4bad4f443866ab42107af804ab4be (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '72848ce4d815e5b4e89ff3e01c5f9f7e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Dec 02 09:36:07 np0005541913.localdomain python3[237185]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Dec 02 09:36:07 np0005541913.localdomain podman[237247]: 2025-12-02 09:36:07.974948541 +0000 UTC m=+0.037292600 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 02 09:36:08 np0005541913.localdomain podman[237247]: 
Dec 02 09:36:08 np0005541913.localdomain podman[237247]: 2025-12-02 09:36:08.012284011 +0000 UTC m=+0.074628070 container create 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 02 09:36:08 np0005541913.localdomain python3[237185]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Dec 02 09:36:08 np0005541913.localdomain sudo[237183]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:08.643 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:08 np0005541913.localdomain sudo[237390]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pngbiiexajlmevqrricidryjfjkezncr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668168.4335089-1422-98808382033227/AnsiballZ_stat.py
Dec 02 09:36:08 np0005541913.localdomain sudo[237390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:08 np0005541913.localdomain python3.9[237392]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:36:08 np0005541913.localdomain sudo[237390]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.242 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.267 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.267 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.267 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.268 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.268 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.268 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.268 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.269 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.269 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.269 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.285 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.286 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.286 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.286 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.286 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:36:09 np0005541913.localdomain sudo[237522]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jinjuxbnwuxebjujqxxwoprlbkdvjata ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668169.1904955-1449-111710315424544/AnsiballZ_file.py
Dec 02 09:36:09 np0005541913.localdomain sudo[237522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:09 np0005541913.localdomain python3.9[237524]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:36:09 np0005541913.localdomain sudo[237522]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.701 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.913 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:36:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:09.914 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:36:10 np0005541913.localdomain sudo[237633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aohzrudxqjdjdxnsfgozwgawhghgnaio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668169.6609342-1449-157646911716245/AnsiballZ_copy.py
Dec 02 09:36:10 np0005541913.localdomain sudo[237633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11273 DF PROTO=TCP SPT=60532 DPT=9100 SEQ=1810058814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479AF4E50000000001030307) 
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.122 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.123 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12928MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.123 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.123 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:36:10 np0005541913.localdomain python3.9[237635]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668169.6609342-1449-157646911716245/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:36:10 np0005541913.localdomain sudo[237633]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.427 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.428 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.428 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.483 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:36:10 np0005541913.localdomain sudo[237708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omfdkpjfdsltplybydzvkgflbkbiaolw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668169.6609342-1449-157646911716245/AnsiballZ_systemd.py
Dec 02 09:36:10 np0005541913.localdomain sudo[237708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.918 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.925 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.947 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.950 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:36:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:10.950 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:36:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:11.001 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:11 np0005541913.localdomain python3.9[237710]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:36:11 np0005541913.localdomain systemd-sysv-generator[237739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:36:11 np0005541913.localdomain systemd-rc-local-generator[237735]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:36:11 np0005541913.localdomain sudo[237708]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:11 np0005541913.localdomain podman[237749]: 2025-12-02 09:36:11.619315774 +0000 UTC m=+0.058448694 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:36:11 np0005541913.localdomain podman[237749]: 2025-12-02 09:36:11.654869008 +0000 UTC m=+0.094001928 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:36:11 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:36:11 np0005541913.localdomain sudo[237826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebtmsyeoflualxwjxxbqpmpdlqxbwevp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668169.6609342-1449-157646911716245/AnsiballZ_systemd.py
Dec 02 09:36:11 np0005541913.localdomain sudo[237826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:12 np0005541913.localdomain python3.9[237828]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:36:12 np0005541913.localdomain systemd-rc-local-generator[237856]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:36:12 np0005541913.localdomain systemd-sysv-generator[237860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:36:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ddd6aaec82e09d3b6ef171e2ac941eb72bf7461746145a1488501bf649c854/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 02 09:36:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ddd6aaec82e09d3b6ef171e2ac941eb72bf7461746145a1488501bf649c854/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:36:12 np0005541913.localdomain podman[237868]: 2025-12-02 09:36:12.660025333 +0000 UTC m=+0.142730791 container init 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + sudo -E kolla_set_configs
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: sudo: unable to send audit message: Operation not permitted
Dec 02 09:36:12 np0005541913.localdomain sudo[237887]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 09:36:12 np0005541913.localdomain sudo[237887]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 02 09:36:12 np0005541913.localdomain sudo[237887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:36:12 np0005541913.localdomain podman[237868]: 2025-12-02 09:36:12.705896482 +0000 UTC m=+0.188601890 container start 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 09:36:12 np0005541913.localdomain podman[237868]: ceilometer_agent_compute
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 02 09:36:12 np0005541913.localdomain sudo[237826]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Validating config file
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Copying service configuration files
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: INFO:__main__:Writing out command to execute
Dec 02 09:36:12 np0005541913.localdomain sudo[237887]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: ++ cat /run_command
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + ARGS=
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + sudo kolla_copy_cacerts
Dec 02 09:36:12 np0005541913.localdomain podman[237890]: 2025-12-02 09:36:12.78944525 +0000 UTC m=+0.079150626 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: sudo: unable to send audit message: Operation not permitted
Dec 02 09:36:12 np0005541913.localdomain sudo[237905]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 02 09:36:12 np0005541913.localdomain sudo[237905]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 02 09:36:12 np0005541913.localdomain sudo[237905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 02 09:36:12 np0005541913.localdomain sudo[237905]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + [[ ! -n '' ]]
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + . kolla_extend_start
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + umask 0022
Dec 02 09:36:12 np0005541913.localdomain ceilometer_agent_compute[237880]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 02 09:36:12 np0005541913.localdomain podman[237890]: 2025-12-02 09:36:12.822066858 +0000 UTC m=+0.111772254 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm)
Dec 02 09:36:12 np0005541913.localdomain podman[237890]: unhealthy
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:36:12 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:36:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45363 DF PROTO=TCP SPT=43532 DPT=9882 SEQ=1779290621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B00E40000000001030307) 
Dec 02 09:36:13 np0005541913.localdomain sudo[238019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmczpfxnmldxbbnetjeenujngmwlqblx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668173.0193079-1521-176923051421568/AnsiballZ_systemd.py
Dec 02 09:36:13 np0005541913.localdomain sudo[238019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.509 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.510 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.511 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.512 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.513 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.514 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.515 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.516 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.517 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.518 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.519 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.520 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.521 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.522 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.523 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.523 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.523 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.523 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.523 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.523 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.523 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.523 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.541 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.543 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.544 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 02 09:36:13 np0005541913.localdomain python3.9[238021]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:36:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:13.644 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.648 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 02 09:36:13 np0005541913.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.714 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.714 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.714 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.714 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.714 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.714 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.714 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.715 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.716 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.717 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.718 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.719 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.720 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.721 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.722 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.723 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.724 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.725 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.726 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.727 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.728 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.729 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.730 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.731 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.732 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.734 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.742 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.755 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.856 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.856 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Dec 02 09:36:13 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:13.857 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.116 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}732655fd94880f2cb79d6c2d7618e43553cd830a8505bfe543b3beb2043aad73" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.216 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Tue, 02 Dec 2025 09:36:14 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-8d02e764-7805-4f9c-ad21-a8ed3c0f0e19 x-openstack-request-id: req-8d02e764-7805-4f9c-ad21-a8ed3c0f0e19 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.216 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "45a99238-6f19-4f9e-be82-6ef3af1dcb31", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.216 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-8d02e764-7805-4f9c-ad21-a8ed3c0f0e19 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.219 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}732655fd94880f2cb79d6c2d7618e43553cd830a8505bfe543b3beb2043aad73" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.246 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Tue, 02 Dec 2025 09:36:14 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-b2cf5dcc-c772-4a60-8f04-859f032aad6b x-openstack-request-id: req-b2cf5dcc-c772-4a60-8f04-859f032aad6b _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.246 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "45a99238-6f19-4f9e-be82-6ef3af1dcb31", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.246 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31 used request id req-b2cf5dcc-c772-4a60-8f04-859f032aad6b request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.248 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.253 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b254bb7f-2891-4b37-9c44-9700e301ce16 / tap4a318f6a-b3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.254 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9671c9a-b35d-4eda-afa9-673561907c85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.249217', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '583bec56-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': 'c6c114e6a73922f4a5eb39892eb243d69261a422cf05918d8753520ecf3d0aca'}]}, 'timestamp': '2025-12-02 09:36:14.255748', '_unique_id': 'be1253fff72b4a00bb7995df37afc65d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.263 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.268 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.268 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8793f53-e9b8-4667-824c-aa9bdd5871b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.268266', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '583df91a-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': 'b5a6c02117e884f0c09694022f655b0cc212d3a49e4b4750e731fc966548e842'}]}, 'timestamp': '2025-12-02 09:36:14.268968', '_unique_id': '35e7b5badfd648768aad2f4e964e677f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.270 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.271 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 8783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1ee184a-c8cb-45d1-86d8-a35d3c3359cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8783, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.271870', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '583e8236-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': '93e03528d9f855dbbdf1d55efa508fcd6697b995b07bd2d3d5427ee38e368018'}]}, 'timestamp': '2025-12-02 09:36:14.272452', '_unique_id': 'b2857a52c8764f32bb14a2386624d152'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.273 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.275 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fb7cdab-2b60-42e8-a8bb-85430e9be17e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.275377', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '583f0b8e-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': 'b9528425c9ef1dd14184ab43125467d2c9851704db5b1295ae613b39e8f1d8d2'}]}, 'timestamp': '2025-12-02 09:36:14.275966', '_unique_id': 'eb6adac3ecfc49838ab8f0cea774339d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.277 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.278 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.316 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.317 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4de888c2-a866-41e4-bb60-d676f81e1dab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.279027', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '584562fe-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': 'a855d23d30179f25f95c660f00cfc09eab516cc760774d742470c0e138a350ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.279027', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '58456de4-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '539accf333fe0423ecf1b98e7f000883b356b1031fc6be33fa0e7035b3e66088'}]}, 'timestamp': '2025-12-02 09:36:14.317712', '_unique_id': '66fd777f82934b79ac8253fa10be1889'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.318 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.319 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44aafa2c-1012-4322-875a-8a40773442c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.319450', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '5845be7a-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': '8723a344d6a9dfeb09acb493b00a91e6f8ac9a9c85b26f6f3b829564e585f418'}]}, 'timestamp': '2025-12-02 09:36:14.319716', '_unique_id': '7ba6399589774e87ab5f1c1e3a20f120'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.320 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fd2d381-5e1d-4580-8363-820faa313817', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.320804', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5845f2aa-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '0b3446c7e73403be1f30dc118c9f826ed9e3d42543bc16679f68523744c42a3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.320804', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5845fa52-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '7bf67f16a7938318dc91529988feefacb323d12400c2d02088c73461b23ea76c'}]}, 'timestamp': '2025-12-02 09:36:14.321208', '_unique_id': 'd746a7cde64e49e7ac02f911dcd59498'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.321 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.322 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.322 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.322 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.345 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 51600000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96a9c44d-cab2-4f13-8944-dbd7e2a3c464', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51600000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:36:14.322796', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5849b0b6-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.564024255, 'message_signature': 'd2e2168b3899449cc6a224a3043d4085ffbff6061496903c5906a8b34d321457'}]}, 'timestamp': '2025-12-02 09:36:14.345572', '_unique_id': '6378ea32f7cb469f97e70959ee19e451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.346 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 286697561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 39228582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ac9181d-fa1b-475b-8fe0-8234c5910900', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 286697561, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.347007', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5849f242-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '4d157c954ca33884c02825786766e7c2af024cf2738bd4b760a3d5f4c2454be9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39228582, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.347007', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5849f9f4-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '3cb68ce76858c6ed043e24f5e745873a3529fe58b137608908db5f8afedc74b9'}]}, 'timestamp': '2025-12-02 09:36:14.347413', '_unique_id': '9e835ad899fa462187ad2bead531797c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.347 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.348 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.348 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1433516318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.348 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 164656012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ce0de5d-7010-46af-9cfa-8cba067d5191', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1433516318, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.348456', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '584a2b04-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '378eb934d99e728c62ae0742859a93c74bb438bdf2bb7a1e3c53ae8082c0a2ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164656012, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.348456', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '584a3360-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '41e031c0b2af089f8ac20fd4665e0d049a0fdcaf76cca5724459650753ea1649'}]}, 'timestamp': '2025-12-02 09:36:14.348881', '_unique_id': '5f0fe1606627428f8e9e2c05d5c3e603'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.349 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d18c086-77af-4ed6-9f9b-b2affbb440f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.349889', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '584a62f4-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': '141d66931daa5a03e22b15d05478820e67b9f0014228e6ea007c00f3f635fec2'}]}, 'timestamp': '2025-12-02 09:36:14.350109', '_unique_id': '84e7bd85bd4d4077adfd627889db059e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.350 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '743be9b0-e7bb-4741-bda3-c7c63b3984de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.351102', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '584a922e-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': '6566f74ba8c5e53a9b777c7c0aff812dc89edabec606246ec81a9619c829dc2e'}]}, 'timestamp': '2025-12-02 09:36:14.351319', '_unique_id': 'cd4b07197de74f498162bfddf3323e44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.351 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.352 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.352 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.352 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.352 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.352 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.352 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e07f67d4-faaa-4805-82a9-787783d64269', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.352839', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '584ad5d6-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': 'fe6b60d534f36fcffe11b80d581a8dd0a591f0dacd5799facfb9e5703c1d9108'}]}, 'timestamp': '2025-12-02 09:36:14.353053', '_unique_id': '724f1a5d109849e2a6fbba2bc5051598'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.353 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a6f9318-a7d0-49ed-aafb-2b2ccccdd049', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.354030', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '584b04b6-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': 'ec968e7dbb49b7b264302aa31fbbcee7ca8d59ddc2cc0c8bf913b3811039a662'}]}, 'timestamp': '2025-12-02 09:36:14.354251', '_unique_id': '1444bec25ce74c01830889731cb29254'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.354 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.355 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.355 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.355 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c63d7fbc-0978-4708-b545-d897fe85acdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.355229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '584b3332-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': 'dc15a66f408eb5d16497b26d78d8106993a53add8fcafbd82b338d949a6c0331'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.355229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '584b3b84-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '0ad61b9e30aca39bdf630802035f736ac9f3520b5312e8ccc647745b9112cb35'}]}, 'timestamp': '2025-12-02 09:36:14.355671', '_unique_id': 'fdf458f6b38b47e4992aec252ec65ea0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.356 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.368 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.368 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f7c4e22-8505-4101-aaf1-d031505a9f72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.356733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '584d4226-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.57587362, 'message_signature': '29aaf5a4943308fb512a44a06e3bb59b8d9616bd899adb2ad86697d0b1e284bb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.356733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '584d4a6e-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.57587362, 'message_signature': '8ea11760aee8a910272b152a233eed45232e6b64ad2762f90e99088cdea4e24d'}]}, 'timestamp': '2025-12-02 09:36:14.369133', '_unique_id': 'cb9f3e16ad6145f19457ed8e8e262b4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.369 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.370 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.370 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.370 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.370 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.370 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c93ec8f7-feca-4b0c-8a87-2b4bcc42b888', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.370532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '584d8a38-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.57587362, 'message_signature': '1388faf98aea55da9f5ba0ab52066175b1f245cb8e6662062644e8478f578bec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.370532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '584d91cc-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.57587362, 'message_signature': '77cea6be55432a8aae6a60b61b0bf1de7dd9d044edfe88085d5b50479a8a650d'}]}, 'timestamp': '2025-12-02 09:36:14.370958', '_unique_id': '4a01b6a131954990a29306246496e6b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.371 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f9bbe2b-ef74-49c0-8d31-c9608418cb86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:14.371948', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '584dc07a-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.468406627, 'message_signature': '9ae026c779e68295f69ff09229e8c8ddc7132603d94ca5527a5dd7148d430674'}]}, 'timestamp': '2025-12-02 09:36:14.372164', '_unique_id': 'be6444faa9504647b47f174564a03dbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.372 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94a252b6-412c-4608-b294-a394e584b81c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:36:14.373194', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '584df0e0-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.564024255, 'message_signature': '837757a18ffe39772b370aa4e522a56853b7ce7d5e252dfc50efcc579cb1a8ae'}]}, 'timestamp': '2025-12-02 09:36:14.373396', '_unique_id': '4b6e69c80b434de386d501d968a3be44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.373 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.374 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.374 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '660e8d6b-65ed-408f-8206-c30d826a7c15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.374367', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '584e1ea8-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '99c83c30a258af158d0ac242442be8bed2f19dad016e82ed346085061e8f6007'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.374367', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '584e2682-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.498296316, 'message_signature': '5c432f2693a9c6f887bfd41023af7f01c1a26d511d253fe79f26bdb64518980c'}]}, 'timestamp': '2025-12-02 09:36:14.374795', '_unique_id': '327e5b5709e5418fb1a12a5d4eb5cab4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.375 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad449b7b-f2b0-4cb3-8bc5-344f22238c60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:14.375775', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '584e55a8-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.57587362, 'message_signature': 'cac9e69306f15158c396aa1d57754d64543800ae7e8cd03cd291d77cce5c96ad'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:14.375775', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '584e5cd8-cf62-11f0-aa38-fa163e3f40cc', 'monotonic_time': 10336.57587362, 'message_signature': '941227a063d8f0c0a970b0d05821c936b4b2c910109dd999e7ec7085da039240'}]}, 'timestamp': '2025-12-02 09:36:14.376153', '_unique_id': '99fc5f802f3b458ca268f112ed0fdba1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.376 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[237880]: 2025-12-02 09:36:14.388 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Dec 02 09:36:14 np0005541913.localdomain virtqemud[203664]: End of file while reading data: Input/output error
Dec 02 09:36:14 np0005541913.localdomain virtqemud[203664]: End of file while reading data: Input/output error
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: libpod-31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.scope: Deactivated successfully.
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: libpod-31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.scope: Consumed 1.326s CPU time.
Dec 02 09:36:14 np0005541913.localdomain podman[238028]: 2025-12-02 09:36:14.555460967 +0000 UTC m=+0.881929697 container died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.timer: Deactivated successfully.
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563-userdata-shm.mount: Deactivated successfully.
Dec 02 09:36:14 np0005541913.localdomain podman[238052]: 2025-12-02 09:36:14.6574713 +0000 UTC m=+0.077088493 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:36:14 np0005541913.localdomain podman[238028]: 2025-12-02 09:36:14.679065025 +0000 UTC m=+1.005533725 container cleanup 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:36:14 np0005541913.localdomain podman[238028]: ceilometer_agent_compute
Dec 02 09:36:14 np0005541913.localdomain podman[238052]: 2025-12-02 09:36:14.698589487 +0000 UTC m=+0.118206730 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-56ddd6aaec82e09d3b6ef171e2ac941eb72bf7461746145a1488501bf649c854-merged.mount: Deactivated successfully.
Dec 02 09:36:14 np0005541913.localdomain podman[238075]: 2025-12-02 09:36:14.778148683 +0000 UTC m=+0.060427095 container cleanup 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 09:36:14 np0005541913.localdomain podman[238075]: ceilometer_agent_compute
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:36:14 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ddd6aaec82e09d3b6ef171e2ac941eb72bf7461746145a1488501bf649c854/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 02 09:36:14 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ddd6aaec82e09d3b6ef171e2ac941eb72bf7461746145a1488501bf649c854/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 02 09:36:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:36:14 np0005541913.localdomain podman[238086]: 2025-12-02 09:36:14.970015586 +0000 UTC m=+0.143859090 container init 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[238101]: + sudo -E kolla_set_configs
Dec 02 09:36:14 np0005541913.localdomain ceilometer_agent_compute[238101]: sudo: unable to send audit message: Operation not permitted
Dec 02 09:36:14 np0005541913.localdomain sudo[238107]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 09:36:15 np0005541913.localdomain sudo[238107]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 02 09:36:15 np0005541913.localdomain sudo[238107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 02 09:36:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:36:15 np0005541913.localdomain podman[238086]: 2025-12-02 09:36:15.015752442 +0000 UTC m=+0.189595956 container start 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 09:36:15 np0005541913.localdomain podman[238086]: ceilometer_agent_compute
Dec 02 09:36:15 np0005541913.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 02 09:36:15 np0005541913.localdomain sudo[238019]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Validating config file
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Copying service configuration files
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: INFO:__main__:Writing out command to execute
Dec 02 09:36:15 np0005541913.localdomain sudo[238107]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: ++ cat /run_command
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: + ARGS=
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: + sudo kolla_copy_cacerts
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: sudo: unable to send audit message: Operation not permitted
Dec 02 09:36:15 np0005541913.localdomain sudo[238123]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 02 09:36:15 np0005541913.localdomain sudo[238123]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 02 09:36:15 np0005541913.localdomain sudo[238123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 02 09:36:15 np0005541913.localdomain sudo[238123]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: + [[ ! -n '' ]]
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: + . kolla_extend_start
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: + umask 0022
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 02 09:36:15 np0005541913.localdomain podman[238110]: 2025-12-02 09:36:15.108733963 +0000 UTC m=+0.092252513 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 09:36:15 np0005541913.localdomain podman[238110]: 2025-12-02 09:36:15.115877316 +0000 UTC m=+0.099395856 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=edpm)
Dec 02 09:36:15 np0005541913.localdomain podman[238110]: unhealthy
Dec 02 09:36:15 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:36:15 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:36:15 np0005541913.localdomain sudo[238238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prrxhzghnelcronppyamnlhsftmigrpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668175.5398223-1545-119609008531271/AnsiballZ_stat.py
Dec 02 09:36:15 np0005541913.localdomain sudo[238238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.871 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.872 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.873 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.874 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.875 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.876 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.877 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.878 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.879 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.880 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.881 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.882 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.883 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.884 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.885 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.903 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.904 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.905 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 02 09:36:15 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:15.922 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 02 09:36:15 np0005541913.localdomain python3.9[238240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:36:15 np0005541913.localdomain sudo[238238]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:16.005 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.066 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.066 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.066 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.066 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.066 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.066 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.067 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.068 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.069 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.070 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.071 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.072 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.073 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.073 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.073 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.073 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.073 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.073 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.073 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.073 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.076 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.077 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.078 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.079 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.080 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.081 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.082 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.083 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.084 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.085 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.086 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.086 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.088 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.093 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 02 09:36:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11275 DF PROTO=TCP SPT=60532 DPT=9100 SEQ=1810058814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B0CA40000000001030307) 
Dec 02 09:36:16 np0005541913.localdomain sudo[238332]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdniibgjewshividuwkktaruxuroyqiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668175.5398223-1545-119609008531271/AnsiballZ_copy.py
Dec 02 09:36:16 np0005541913.localdomain sudo[238332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.438 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a966654efc63eb79f395da865ed495916856f318e31034e86d5a2b1abae24291" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 02 09:36:16 np0005541913.localdomain python3.9[238334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668175.5398223-1545-119609008531271/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:36:16 np0005541913.localdomain sudo[238332]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.504 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Tue, 02 Dec 2025 09:36:16 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a91009ba-9531-4b7d-9b7d-290113c0ab02 x-openstack-request-id: req-a91009ba-9531-4b7d-9b7d-290113c0ab02 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.505 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "45a99238-6f19-4f9e-be82-6ef3af1dcb31", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.505 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-a91009ba-9531-4b7d-9b7d-290113c0ab02 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.507 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a966654efc63eb79f395da865ed495916856f318e31034e86d5a2b1abae24291" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.523 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Tue, 02 Dec 2025 09:36:16 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-10f44dac-5fc2-4dc2-8cc6-106b849fc591 x-openstack-request-id: req-10f44dac-5fc2-4dc2-8cc6-106b849fc591 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.523 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "45a99238-6f19-4f9e-be82-6ef3af1dcb31", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.523 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31 used request id req-10f44dac-5fc2-4dc2-8cc6-106b849fc591 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.525 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.535 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.536 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd757fbc4-12ed-4a4d-bf88-8d8859e50ae3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.525935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '599801d4-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.745062044, 'message_signature': 'bed582c53f4d509a3545d68d326cd2b2e8fe67c20e369c64d2fbc271cd81e1e2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.525935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59981494-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.745062044, 'message_signature': 'd96303b3ad2e29ec426d0efac193f11597cfc3f63bc12b1e23e69e3f43473fc6'}]}, 'timestamp': '2025-12-02 09:36:16.537113', '_unique_id': '1cbafc776a9f4843a5f1632dacc7af3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.543 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.546 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.550 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b254bb7f-2891-4b37-9c44-9700e301ce16 / tap4a318f6a-b3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.550 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01d043d8-e05a-467d-b9bf-df3a9057bb3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.546734', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '599a3760-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': 'a237c727c7c22967c6aa24e0e067006572cc5bbce638e8d4a7ddac3345fd6c91'}]}, 'timestamp': '2025-12-02 09:36:16.551094', '_unique_id': '7b592cf786564f79b160d36147a6484c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.552 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.553 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.553 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b995e11-9850-4a4b-98ea-394ec3717d0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.553104', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '599a93c2-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': '6fe7a141641470b99c91595c27568eee954cb160054cf49eb15a69cb5a573865'}]}, 'timestamp': '2025-12-02 09:36:16.553420', '_unique_id': 'df668e9f63ad49e59ff5b4ed2046fc67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.554 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.555 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd9818b7-68c5-4da0-a17c-2e3ed5f2ac88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.555032', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '599adeea-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': '0a58079191405013352b35ddfe0ab5b52d1d62d29101de9dafd079eafd963274'}]}, 'timestamp': '2025-12-02 09:36:16.555403', '_unique_id': '6bc246a47bb047d98e19cba110eb3964'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.556 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.596 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.597 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc57f9fd-63ec-4ffe-9ba1-d3c4614376ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.556966', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59a1388a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': '9d4779421b4631cbaa8ef2623222eb11fea5694bf7e39c376d446ce94e9acd46'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.556966', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59a15248-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': '340fbaa0477497b16abaad2eed26998eacbdc4504471ce9a2145d27b2e3def9b'}]}, 'timestamp': '2025-12-02 09:36:16.597882', '_unique_id': '04b014c0b5d14487bddc4154d1a689d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.599 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.601 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.601 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.602 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.602 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.603 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1b9520b-87b1-4b2f-9406-1fc804124e93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.602656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59a22858-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': '349a810e20c2b952b2d1e35e45699bcfb7850bf418b1a68820485d335deef1ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.602656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59a239ce-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': 'db0e48a93a4b4802e65fffbd03c55b6ab5f46d80828556f18316925796c73bf7'}]}, 'timestamp': '2025-12-02 09:36:16.603604', '_unique_id': 'edb6e89cf5bb4cb58a7f2c11d27f1d73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.604 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.605 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.624 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 51630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9c0efdc-1fad-4513-bd7c-762b522f84f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51630000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:36:16.606088', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '59a5891c-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.843092954, 'message_signature': '4350d92703403e247b96d5de17ca595049ffd5e8c83e1de2a87c359308612944'}]}, 'timestamp': '2025-12-02 09:36:16.625473', '_unique_id': '571b5cede90148b5a0d85baf820a6506'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.627 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.629 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.629 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9075fc8-ab28-4a55-88f9-03c49667c62c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.629371', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '59a643fc-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': 'fd97341876c6905ef526b0b2bed4a6af99c155f56bcf43eb305c955708f2365e'}]}, 'timestamp': '2025-12-02 09:36:16.630127', '_unique_id': 'f49bdaa8d2a94caa980e6f1ba04d7ec3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.631 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.632 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.632 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd54f9db4-c351-4581-9e81-edb078da56b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.632605', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '59a6bb20-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': '6d124f840b12f9d88048393a5940d916b5ab5d3f72a379c721bbfae7311e82c5'}]}, 'timestamp': '2025-12-02 09:36:16.633161', '_unique_id': '26f4b7e6e9144c648cc87398c9b11bb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.634 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.635 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1433516318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.636 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 164656012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a025a389-96f2-4115-ba47-c85903074065', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1433516318, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.635694', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59a7323a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': 'dd31c0fc360e0ad3155c824232da4ad307abd625c306eef33057c899166818cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164656012, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.635694', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59a74388-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': 'a50d9d1b5ad6581cb7b0590537a4df4bd8ca60c164d254d1e44825a75e7ecf59'}]}, 'timestamp': '2025-12-02 09:36:16.636649', '_unique_id': '3a53ee5a26e84252801efbe61b53d789'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.637 12 ERROR oslo_messaging.notify.messaging 
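[analysis note] The root cause in the traceback above is `ConnectionRefusedError: [Errno 111]`: the TCP connect to the RabbitMQ endpoint was actively refused, meaning nothing is listening on the broker port (kombu then re-raises it as `OperationalError`). A minimal sketch of a reachability probe for triage; the host and port here (`127.0.0.1:5672`) are placeholders, since the real broker address comes from the `transport_url` in the agent's configuration and is not shown in this log:

```python
import socket

def broker_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    A False result caused by ECONNREFUSED corresponds to the
    '[Errno 111] Connection refused' in the traceback: no listener
    on the broker port, or the connection is being rejected.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 5672 is the default AMQP port; substitute the host/port from
    # the actual transport_url setting (placeholder values here).
    print(broker_reachable("127.0.0.1", 5672))
```

If this returns False on the compute node while the broker host believes RabbitMQ is up, the problem is likely the service being down, bound to a different interface, or a firewall rejecting the port.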
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.638 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.639 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b903ee4-3e51-4c30-9167-c7c86cc3bb5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.639074', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '59a7b566-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': 'c9ea90561b87630844b572c36accd676716f4003eeab2000cc09fcabbb85450a'}]}, 'timestamp': '2025-12-02 09:36:16.639560', '_unique_id': 'c03f582f6a054bb6b5a429f26fcd8be0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.640 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.641 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.642 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 286697561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.642 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 39228582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f4a882b-224e-4367-b94a-c77bdc725cc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 286697561, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.642030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59a8287a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': '603866ff0da33aed7f0b0870302643506b7c818fc1f806ddccc8f066ff50d992'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39228582, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.642030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59a83acc-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': '3da80195bc834d3c0aea18dafea103fe5b4f8275436c72f43562d381528bce9d'}]}, 'timestamp': '2025-12-02 09:36:16.642965', '_unique_id': '6d83e53a506b41d99eac096a3f9de38f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.643 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.645 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82168af6-5fd3-40dd-b37c-4dc9f8c552f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.645297', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '59a8a84a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': 'e975020149ed277add29508d6163b1a80f0435197f2ee6dae28ad54641e438cd'}]}, 'timestamp': '2025-12-02 09:36:16.645839', '_unique_id': '1d825a3d67f34e9880c919b1fa94b49a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.646 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.647 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.648 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.648 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6ffa6c8-6402-41b5-892f-4756eaa43236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.648066', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59a9142e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.745062044, 'message_signature': 'b5132f9a36502d092975ab7e72e67de563b4992a10b554abebce9f0537a2cf13'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.648066', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59a92752-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.745062044, 'message_signature': '41761297292e5053ea8d48582e8b029e1f6a38ec4b81fefd8bfbee1129cd911d'}]}, 'timestamp': '2025-12-02 09:36:16.649084', '_unique_id': '2174d1fd65fb40bfb6383c964a5510c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.650 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.652 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.652 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e743d1f-7f0e-4eb5-b4da-395f394e2806', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.652401', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '59a9cb62-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': '1957cae905b96d1039b68a5ac6c341280872cb3d6e388ee230c19cc4a9969387'}]}, 'timestamp': '2025-12-02 09:36:16.653340', '_unique_id': '11692eb1f4464711a608da4a0b6ffb87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.654 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.655 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.655 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.655 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b52649ca-9581-465f-a735-ca30ac2e70aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.655553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59aa385e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.745062044, 'message_signature': 'c53ca9ec4336f02ced723e7af085ffdcd2f7a7616a9fe7c58a7379f45963affd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.655553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59aa461e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.745062044, 'message_signature': '590606e95ec970d5aef66a896bf238383804dfe089de7df26249277e1fd2426d'}]}, 'timestamp': '2025-12-02 09:36:16.656264', '_unique_id': 'a6a2328f2ee14eb486a602433e7de811'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.656 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.657 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.657 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.658 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f5dbcaf-01d6-493f-bfa7-118537f28aa3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.657813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59aa8dfe-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': 'f92b38957eda90482edfdb49ae8a5c67f611225744ea2cf89aa2f5062d7a22ea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.657813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59aa9894-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': '9e0e18eb09947a89783e80b448effd48a3a621d5ab2e7ef754244afc13ec8c43'}]}, 'timestamp': '2025-12-02 09:36:16.658373', '_unique_id': '1415e9215b9f4cc5bceeb3345572131c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.659 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.660 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.660 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6461f87-e100-46b4-9328-53468585c35b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:36:16.660209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '59aaeb82-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.843092954, 'message_signature': '68081e69af8915757b98677613389d88a7bd48600d5ab65cf72f90c6accfe756'}]}, 'timestamp': '2025-12-02 09:36:16.660505', '_unique_id': 'c903b63704694042b6a77e25276aac6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.661 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 8783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5db2c948-44a7-46e4-bb56-3639177f15a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8783, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.661975', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '59ab3042-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': '49bc20e93582777eceb8b271e7485a93c6076915a3ac81b87db28dfcbbd12450'}]}, 'timestamp': '2025-12-02 09:36:16.662274', '_unique_id': '2be1aaabee064a87bc81724c1311cc67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.662 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.663 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.663 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.663 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.663 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.664 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.664 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.664 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.664 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.664 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a427cb59-f031-4528-b5d4-e9258ad277b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:36:16.664401', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59ab8f1a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': '796a67ec424b7516d2beba3b2ebb1c65b1a95f170c6b7f87025abc44ba56b42c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:36:16.664401', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59ab9b90-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.776104782, 'message_signature': 'b9d96f045d45b495c0e5376544d8cb5a1645ead5d211dc3477d27b0e9192ef84'}]}, 'timestamp': '2025-12-02 09:36:16.665008', '_unique_id': 'ce1a566e08314be9800173c7b820b918'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.665 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.666 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.666 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b3ed643-4ee2-4b4c-8dce-407038f5030d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:36:16.666663', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '59abe852-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10338.76591754, 'message_signature': '82aa8c0d0998c010644d7dbcebd29da248c372507fe54f3408a561128d69578e'}]}, 'timestamp': '2025-12-02 09:36:16.666994', '_unique_id': 'd4308881449c4f619cd8fd316fd816a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:36:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:36:16.667 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:36:17 np0005541913.localdomain sudo[238442]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdirxcsbvgwfvkdoqzjhlqtwljwvqosy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668176.949718-1596-78731226860466/AnsiballZ_container_config_data.py
Dec 02 09:36:17 np0005541913.localdomain sudo[238442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:17 np0005541913.localdomain python3.9[238444]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec 02 09:36:17 np0005541913.localdomain sudo[238442]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:17 np0005541913.localdomain sudo[238552]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kesjpeccbqfujrptsmmtfezwqnxiixvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668177.7070172-1623-172035737318853/AnsiballZ_container_config_hash.py
Dec 02 09:36:17 np0005541913.localdomain sudo[238552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:18 np0005541913.localdomain python3.9[238554]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:36:18 np0005541913.localdomain sudo[238552]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:18.646 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:18 np0005541913.localdomain sudo[238662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdsltjhbqsxaoaucqquxtylunlhqlazr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668178.5346158-1653-128743938323882/AnsiballZ_edpm_container_manage.py
Dec 02 09:36:18 np0005541913.localdomain sudo[238662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:19 np0005541913.localdomain python3[238664]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:36:19 np0005541913.localdomain podman[238701]: 
Dec 02 09:36:19 np0005541913.localdomain podman[238701]: 2025-12-02 09:36:19.326311563 +0000 UTC m=+0.073070009 container create 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm)
Dec 02 09:36:19 np0005541913.localdomain podman[238701]: 2025-12-02 09:36:19.289553968 +0000 UTC m=+0.036312414 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 02 09:36:19 np0005541913.localdomain python3[238664]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 02 09:36:19 np0005541913.localdomain sudo[238662]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27097 DF PROTO=TCP SPT=53614 DPT=9102 SEQ=3585547451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B19E50000000001030307) 
Dec 02 09:36:20 np0005541913.localdomain sudo[238847]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnyqzbwjlxddawasddejnduynuyqzqns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668179.8817492-1677-242920701254675/AnsiballZ_stat.py
Dec 02 09:36:20 np0005541913.localdomain sudo[238847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:20 np0005541913.localdomain python3.9[238849]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:36:20 np0005541913.localdomain sudo[238847]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:21 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:21.009 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:21 np0005541913.localdomain sudo[238959]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wugtvducimyedemxeclbizvrkdszcfrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668180.7111173-1704-189909562589634/AnsiballZ_file.py
Dec 02 09:36:21 np0005541913.localdomain sudo[238959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:21 np0005541913.localdomain sudo[238961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:36:21 np0005541913.localdomain sudo[238961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:36:21 np0005541913.localdomain sudo[238961]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:21 np0005541913.localdomain sudo[238980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:36:21 np0005541913.localdomain sudo[238980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:36:21 np0005541913.localdomain python3.9[238972]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:36:21 np0005541913.localdomain sudo[238959]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:22 np0005541913.localdomain sudo[239124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbrhdqukrxaswnxhyifynzggkgzbttsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668181.720109-1704-58902281780546/AnsiballZ_copy.py
Dec 02 09:36:22 np0005541913.localdomain sudo[239124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:22 np0005541913.localdomain sudo[238980]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16874 DF PROTO=TCP SPT=51228 DPT=9101 SEQ=2887775470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B24500000000001030307) 
Dec 02 09:36:22 np0005541913.localdomain python3.9[239127]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668181.720109-1704-58902281780546/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:36:22 np0005541913.localdomain sudo[239124]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:22 np0005541913.localdomain sudo[239192]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vizzxrnwuqmggaftlwekntqpofflfgys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668181.720109-1704-58902281780546/AnsiballZ_systemd.py
Dec 02 09:36:22 np0005541913.localdomain sudo[239192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:22 np0005541913.localdomain python3.9[239194]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:36:22 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:36:22 np0005541913.localdomain systemd-rc-local-generator[239220]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:36:22 np0005541913.localdomain systemd-sysv-generator[239223]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:36:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:22 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:23 np0005541913.localdomain sudo[239231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:36:23 np0005541913.localdomain sudo[239231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:36:23 np0005541913.localdomain sudo[239231]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:23 np0005541913.localdomain sudo[239192]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:23 np0005541913.localdomain sudo[239301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnrykptgfxksrsdvrtyknlovlbssoohc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668181.720109-1704-58902281780546/AnsiballZ_systemd.py
Dec 02 09:36:23 np0005541913.localdomain sudo[239301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:23 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:23.661 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:23 np0005541913.localdomain python3.9[239303]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:36:23 np0005541913.localdomain systemd-sysv-generator[239328]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:36:23 np0005541913.localdomain systemd-rc-local-generator[239324]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:23 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: Starting node_exporter container...
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:36:24 np0005541913.localdomain podman[239344]: 2025-12-02 09:36:24.391330584 +0000 UTC m=+0.154455923 container init 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.407Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.407Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.407Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.408Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.408Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.408Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.408Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.408Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.408Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=arp
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=bcache
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=bonding
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=cpu
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=edac
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=filefd
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=netclass
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=netdev
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=netstat
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=nfs
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=nvme
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=softnet
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=systemd
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=xfs
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.409Z caller=node_exporter.go:117 level=info collector=zfs
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.410Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 02 09:36:24 np0005541913.localdomain node_exporter[239357]: ts=2025-12-02T09:36:24.410Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:36:24 np0005541913.localdomain podman[239344]: 2025-12-02 09:36:24.421808747 +0000 UTC m=+0.184934106 container start 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:36:24 np0005541913.localdomain podman[239344]: node_exporter
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: Started node_exporter container.
Dec 02 09:36:24 np0005541913.localdomain sudo[239301]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:24 np0005541913.localdomain podman[239366]: 2025-12-02 09:36:24.494474886 +0000 UTC m=+0.070110164 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:36:24 np0005541913.localdomain podman[239366]: 2025-12-02 09:36:24.508042125 +0000 UTC m=+0.083677383 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:36:24 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:36:24 np0005541913.localdomain sudo[239496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjyoagxjdsfbxfqegtgkwjmzcznefsba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668184.6334226-1776-277412494363138/AnsiballZ_systemd.py
Dec 02 09:36:24 np0005541913.localdomain sudo[239496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:25 np0005541913.localdomain python3.9[239498]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: Stopping node_exporter container...
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: libpod-89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.scope: Deactivated successfully.
Dec 02 09:36:25 np0005541913.localdomain podman[239502]: 2025-12-02 09:36:25.269014901 +0000 UTC m=+0.078896820 container died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.timer: Deactivated successfully.
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:36:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16876 DF PROTO=TCP SPT=51228 DPT=9101 SEQ=2887775470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B30640000000001030307) 
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e-userdata-shm.mount: Deactivated successfully.
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-dfdf2ab7fe5ce6537ec1c19b07cda773ff79d984ef31b505b40a5e19ca784be0-merged.mount: Deactivated successfully.
Dec 02 09:36:25 np0005541913.localdomain podman[239502]: 2025-12-02 09:36:25.371736761 +0000 UTC m=+0.181618680 container cleanup 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:36:25 np0005541913.localdomain podman[239502]: node_exporter
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 02 09:36:25 np0005541913.localdomain podman[239531]: 2025-12-02 09:36:25.472956374 +0000 UTC m=+0.066658504 container cleanup 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:36:25 np0005541913.localdomain podman[239531]: node_exporter
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: Stopped node_exporter container.
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: Starting node_exporter container...
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:36:25 np0005541913.localdomain podman[239544]: 2025-12-02 09:36:25.599346034 +0000 UTC m=+0.102094366 container init 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.611Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.611Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.611Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.611Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.611Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.612Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.612Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.612Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.612Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=arp
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=bcache
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=bonding
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=cpu
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=edac
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=filefd
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=netclass
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=netdev
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=netstat
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=nfs
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=nvme
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=softnet
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=systemd
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=xfs
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.613Z caller=node_exporter.go:117 level=info collector=zfs
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.614Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 02 09:36:25 np0005541913.localdomain node_exporter[239558]: ts=2025-12-02T09:36:25.614Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:36:25 np0005541913.localdomain podman[239544]: 2025-12-02 09:36:25.62445732 +0000 UTC m=+0.127205632 container start 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:36:25 np0005541913.localdomain podman[239544]: node_exporter
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: Started node_exporter container.
Dec 02 09:36:25 np0005541913.localdomain sudo[239496]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:25 np0005541913.localdomain podman[239567]: 2025-12-02 09:36:25.678113279 +0000 UTC m=+0.050390376 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:36:25 np0005541913.localdomain podman[239567]: 2025-12-02 09:36:25.708258504 +0000 UTC m=+0.080535581 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:36:25 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:36:25 np0005541913.localdomain rsyslogd[754]: imjournal from <localhost:node_exporter>: begin to drop messages due to rate-limiting
Dec 02 09:36:26 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:26.013 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:26 np0005541913.localdomain sudo[239697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plvtpczhhrqgrejzvjmwawfzazgaethp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668185.8761094-1800-245438140519361/AnsiballZ_stat.py
Dec 02 09:36:26 np0005541913.localdomain sudo[239697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:26 np0005541913.localdomain python3.9[239699]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:36:26 np0005541913.localdomain sudo[239697]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:26 np0005541913.localdomain sudo[239785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vceudgeszvvrrsxkynuwevfgkvjjbnyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668185.8761094-1800-245438140519361/AnsiballZ_copy.py
Dec 02 09:36:26 np0005541913.localdomain sudo[239785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:26 np0005541913.localdomain python3.9[239787]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668185.8761094-1800-245438140519361/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:36:26 np0005541913.localdomain sudo[239785]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:27 np0005541913.localdomain sudo[239895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwobtbfcobakqqgblubnkbjfpnaurvdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668187.254716-1851-15573638891217/AnsiballZ_container_config_data.py
Dec 02 09:36:27 np0005541913.localdomain sudo[239895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:27 np0005541913.localdomain python3.9[239897]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 02 09:36:27 np0005541913.localdomain sudo[239895]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:28 np0005541913.localdomain sudo[240005]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbkyhcgpzshggdiwhhpfpbrstwcwntag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668187.995023-1878-102656424417350/AnsiballZ_container_config_hash.py
Dec 02 09:36:28 np0005541913.localdomain sudo[240005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:28 np0005541913.localdomain python3.9[240007]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:36:28 np0005541913.localdomain sudo[240005]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:28 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:28.700 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:29 np0005541913.localdomain sudo[240115]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flcclyxzjwkncxzcustplbqltwaifeir ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668188.8340604-1908-35096746574118/AnsiballZ_edpm_container_manage.py
Dec 02 09:36:29 np0005541913.localdomain sudo[240115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16877 DF PROTO=TCP SPT=51228 DPT=9101 SEQ=2887775470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B40250000000001030307) 
Dec 02 09:36:29 np0005541913.localdomain python3[240117]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:36:31 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:31.058 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:31 np0005541913.localdomain podman[240131]: 2025-12-02 09:36:29.521724346 +0000 UTC m=+0.046749254 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 02 09:36:31 np0005541913.localdomain podman[240202]: 
Dec 02 09:36:31 np0005541913.localdomain podman[240202]: 2025-12-02 09:36:31.754539695 +0000 UTC m=+0.048188750 container create 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Dec 02 09:36:31 np0005541913.localdomain podman[240202]: 2025-12-02 09:36:31.730737302 +0000 UTC m=+0.024386367 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 02 09:36:31 np0005541913.localdomain python3[240117]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 02 09:36:31 np0005541913.localdomain sudo[240115]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:33 np0005541913.localdomain sudo[240345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfwdcbwfsnsjmqhorsxidqyvwibrmvfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668193.374285-1933-274781030033432/AnsiballZ_stat.py
Dec 02 09:36:33 np0005541913.localdomain sudo[240345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:33 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:33.708 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:33 np0005541913.localdomain python3.9[240347]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:36:33 np0005541913.localdomain sudo[240345]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3123 DF PROTO=TCP SPT=50206 DPT=9102 SEQ=291053170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B520D0000000001030307) 
Dec 02 09:36:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56362 DF PROTO=TCP SPT=42470 DPT=9105 SEQ=1452460053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B528F0000000001030307) 
Dec 02 09:36:34 np0005541913.localdomain sudo[240457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyavjlbwxygqdfgbsrhwiwcqzhiehhfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668194.1188424-1959-87992114069419/AnsiballZ_file.py
Dec 02 09:36:34 np0005541913.localdomain sudo[240457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:34 np0005541913.localdomain python3.9[240459]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:36:34 np0005541913.localdomain sudo[240457]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:35 np0005541913.localdomain sudo[240566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qobgdwjrjqbawmydxbymhxjrcimsqcix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668194.611698-1959-38462321998559/AnsiballZ_copy.py
Dec 02 09:36:35 np0005541913.localdomain sudo[240566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:35 np0005541913.localdomain python3.9[240568]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668194.611698-1959-38462321998559/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:36:35 np0005541913.localdomain sudo[240566]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:35 np0005541913.localdomain sudo[240621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toplnhyqareuuwytbgvetpvrvflensph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668194.611698-1959-38462321998559/AnsiballZ_systemd.py
Dec 02 09:36:35 np0005541913.localdomain sudo[240621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:35 np0005541913.localdomain python3.9[240623]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:36:35 np0005541913.localdomain systemd-rc-local-generator[240650]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:36:35 np0005541913.localdomain systemd-sysv-generator[240655]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:35 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:36 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:36.062 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:36 np0005541913.localdomain sudo[240621]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:36 np0005541913.localdomain sudo[240713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rporbktyqwvulwtuqupipadllyihufmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668194.611698-1959-38462321998559/AnsiballZ_systemd.py
Dec 02 09:36:36 np0005541913.localdomain sudo[240713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:36 np0005541913.localdomain python3.9[240715]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:36:36 np0005541913.localdomain podman[240717]: 2025-12-02 09:36:36.86295006 +0000 UTC m=+0.093378802 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:36:36 np0005541913.localdomain podman[240717]: 2025-12-02 09:36:36.874881677 +0000 UTC m=+0.105310409 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:36:36 np0005541913.localdomain systemd-rc-local-generator[240759]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:36:36 np0005541913.localdomain systemd-sysv-generator[240765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:36 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:36:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3125 DF PROTO=TCP SPT=50206 DPT=9102 SEQ=291053170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B5E240000000001030307) 
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: Starting podman_exporter container...
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:36:37 np0005541913.localdomain podman[240774]: 2025-12-02 09:36:37.305334474 +0000 UTC m=+0.147414160 container init 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:36:37 np0005541913.localdomain podman_exporter[240787]: ts=2025-12-02T09:36:37.328Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 02 09:36:37 np0005541913.localdomain podman_exporter[240787]: ts=2025-12-02T09:36:37.328Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 02 09:36:37 np0005541913.localdomain podman_exporter[240787]: ts=2025-12-02T09:36:37.329Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 02 09:36:37 np0005541913.localdomain podman_exporter[240787]: ts=2025-12-02T09:36:37.329Z caller=handler.go:105 level=info collector=container
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: Starting Podman API Service...
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: Started Podman API Service.
Dec 02 09:36:37 np0005541913.localdomain podman[240774]: 2025-12-02 09:36:37.358092291 +0000 UTC m=+0.200171977 container start 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:36:37 np0005541913.localdomain podman[240774]: podman_exporter
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: Started podman_exporter container.
Dec 02 09:36:37 np0005541913.localdomain sudo[240713]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:37 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:37Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 02 09:36:37 np0005541913.localdomain podman[240798]: 2025-12-02 09:36:37.453139685 +0000 UTC m=+0.101789768 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:36:37 np0005541913.localdomain podman[240798]: 2025-12-02 09:36:37.460948056 +0000 UTC m=+0.109598239 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:36:37 np0005541913.localdomain podman[240798]: unhealthy
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:36:37 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:36:37 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:37Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 02 09:36:37 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:37Z" level=info msg="Setting parallel job count to 25"
Dec 02 09:36:37 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:37Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 02 09:36:37 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:37Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Dec 02 09:36:37 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:36:37 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 02 09:36:37 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:37Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:36:37 np0005541913.localdomain sudo[240942]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqfcqxzkepvenfisozvekmrbyquxqpbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668197.67451-2031-81398535241747/AnsiballZ_systemd.py
Dec 02 09:36:37 np0005541913.localdomain sudo[240942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:38 np0005541913.localdomain python3.9[240944]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:36:38 np0005541913.localdomain systemd[1]: Stopping podman_exporter container...
Dec 02 09:36:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:38 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:36:37 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 0 "" "Go-http-client/1.1"
Dec 02 09:36:38 np0005541913.localdomain systemd[1]: libpod-53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.scope: Deactivated successfully.
Dec 02 09:36:38 np0005541913.localdomain podman[240949]: 2025-12-02 09:36:38.407900504 +0000 UTC m=+0.093776783 container died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:36:38 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.timer: Deactivated successfully.
Dec 02 09:36:38 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:36:38 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:38.742 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:39 np0005541913.localdomain systemd[1]: tmp-crun.xbNyoV.mount: Deactivated successfully.
Dec 02 09:36:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709-userdata-shm.mount: Deactivated successfully.
Dec 02 09:36:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:39 np0005541913.localdomain podman[240949]: 2025-12-02 09:36:39.391803381 +0000 UTC m=+1.077679660 container cleanup 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:36:39 np0005541913.localdomain podman[240949]: podman_exporter
Dec 02 09:36:39 np0005541913.localdomain podman[240963]: 2025-12-02 09:36:39.407670449 +0000 UTC m=+0.989360759 container cleanup 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:36:39 np0005541913.localdomain systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 02 09:36:39 np0005541913.localdomain podman[240977]: 2025-12-02 09:36:39.90241641 +0000 UTC m=+0.057672104 container cleanup 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:36:39 np0005541913.localdomain podman[240977]: podman_exporter
Dec 02 09:36:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63250 DF PROTO=TCP SPT=49968 DPT=9882 SEQ=4282664875 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B69E40000000001030307) 
Dec 02 09:36:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d798324d721791a611107213f65527bdd04968a53e63105834972eb522cf6a99-merged.mount: Deactivated successfully.
Dec 02 09:36:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b388412fca905b307e07ab1555f64621018b9abe733ff2c7e7266decb6c12c8d-merged.mount: Deactivated successfully.
Dec 02 09:36:40 np0005541913.localdomain systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 02 09:36:40 np0005541913.localdomain systemd[1]: Stopped podman_exporter container.
Dec 02 09:36:40 np0005541913.localdomain systemd[1]: Starting podman_exporter container...
Dec 02 09:36:41 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:41.096 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:36:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 02 09:36:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:36:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:36:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:36:41 np0005541913.localdomain podman[240989]: 2025-12-02 09:36:41.699180268 +0000 UTC m=+1.093954769 container init 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:36:41 np0005541913.localdomain podman_exporter[241003]: ts=2025-12-02T09:36:41.723Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 02 09:36:41 np0005541913.localdomain podman_exporter[241003]: ts=2025-12-02T09:36:41.723Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 02 09:36:41 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:36:41 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 02 09:36:41 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:36:41 np0005541913.localdomain podman_exporter[241003]: ts=2025-12-02T09:36:41.724Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 02 09:36:41 np0005541913.localdomain podman_exporter[241003]: ts=2025-12-02T09:36:41.724Z caller=handler.go:105 level=info collector=container
Dec 02 09:36:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:36:41 np0005541913.localdomain podman[240989]: 2025-12-02 09:36:41.846377402 +0000 UTC m=+1.241151923 container start 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:36:41 np0005541913.localdomain podman[240989]: podman_exporter
Dec 02 09:36:41 np0005541913.localdomain podman[241006]: 2025-12-02 09:36:41.853747522 +0000 UTC m=+0.168347249 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 02 09:36:41 np0005541913.localdomain podman[241021]: 2025-12-02 09:36:41.908208582 +0000 UTC m=+0.163030333 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:36:41 np0005541913.localdomain podman[241006]: 2025-12-02 09:36:41.933163844 +0000 UTC m=+0.247763501 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:36:41 np0005541913.localdomain podman[241021]: 2025-12-02 09:36:41.940143794 +0000 UTC m=+0.194965585 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:36:41 np0005541913.localdomain podman[241021]: unhealthy
Dec 02 09:36:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:42 np0005541913.localdomain systemd[1]: Started podman_exporter container.
Dec 02 09:36:42 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:36:42 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:36:42 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:36:42 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:42 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:42 np0005541913.localdomain sudo[240942]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34921 DF PROTO=TCP SPT=42940 DPT=9100 SEQ=3703420671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B75E40000000001030307) 
Dec 02 09:36:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:36:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:43 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:43 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:43 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:43Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged: device or resource busy"
Dec 02 09:36:43 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:43Z" level=error msg="Getting root fs size for \"028da319cb862756f8deb29d192de17dcf54c8ccbdc71829a48e764c124ec0bf\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": creating overlay mount to /var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/QKXJKO4MENC2JONUN57ZG7NBIN,upperdir=/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/diff,workdir=/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/work,nodev,metacopy=on\": no such file or directory"
Dec 02 09:36:43 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:43.785 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b388412fca905b307e07ab1555f64621018b9abe733ff2c7e7266decb6c12c8d-merged.mount: Deactivated successfully.
Dec 02 09:36:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b388412fca905b307e07ab1555f64621018b9abe733ff2c7e7266decb6c12c8d-merged.mount: Deactivated successfully.
Dec 02 09:36:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:36:45 np0005541913.localdomain podman[241075]: 2025-12-02 09:36:45.07945993 +0000 UTC m=+0.087537851 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:36:45 np0005541913.localdomain podman[241075]: 2025-12-02 09:36:45.108580449 +0000 UTC m=+0.116658300 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 09:36:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-86f9ede822be11b60c0a1703a4ec9607dd292d56847ed8465c37bae8fb9e0d08-merged.mount: Deactivated successfully.
Dec 02 09:36:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:36:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:36:46 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:46.144 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5602 DF PROTO=TCP SPT=40590 DPT=9100 SEQ=1021608642 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B81E40000000001030307) 
Dec 02 09:36:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 02 09:36:46 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:36:46 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:46 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:46 np0005541913.localdomain podman[241092]: 2025-12-02 09:36:46.284536655 +0000 UTC m=+1.115555274 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 09:36:46 np0005541913.localdomain podman[241092]: 2025-12-02 09:36:46.317167354 +0000 UTC m=+1.148185983 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:36:46 np0005541913.localdomain podman[241092]: unhealthy
Dec 02 09:36:46 np0005541913.localdomain sudo[241199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apikrbgmupmzmdvgtbneofsfahtgyvpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668206.7334676-2055-20058821035548/AnsiballZ_stat.py
Dec 02 09:36:46 np0005541913.localdomain podman[240799]: time="2025-12-02T09:36:46Z" level=error msg="Getting root fs size for \"0d5eefb6ce98c7322a99b5b5705f907e8a9ccd858f2a9e4deb8757260d2c605b\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 02 09:36:47 np0005541913.localdomain sudo[241199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:36:47 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:47 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:47 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:36:47 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:36:47 np0005541913.localdomain python3.9[241201]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:36:47 np0005541913.localdomain sudo[241199]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:47 np0005541913.localdomain sudo[241287]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjlyzkfeoiwytctolruioezqpfrhfhgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668206.7334676-2055-20058821035548/AnsiballZ_copy.py
Dec 02 09:36:47 np0005541913.localdomain sudo[241287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:47 np0005541913.localdomain python3.9[241289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668206.7334676-2055-20058821035548/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:36:47 np0005541913.localdomain sudo[241287]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:47 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:47 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:47 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d5dc9262725001f2f73a799452ce705d444359a7e34fc5a93c05c8a39696c355-merged.mount: Deactivated successfully.
Dec 02 09:36:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:48 np0005541913.localdomain sudo[241397]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktqkkxbbivjvbmbojgkzecblvljccfkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668208.2085028-2106-86195988031994/AnsiballZ_container_config_data.py
Dec 02 09:36:48 np0005541913.localdomain sudo[241397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:48 np0005541913.localdomain python3.9[241399]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 02 09:36:48 np0005541913.localdomain sudo[241397]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:48 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:48.837 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 02 09:36:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-86f9ede822be11b60c0a1703a4ec9607dd292d56847ed8465c37bae8fb9e0d08-merged.mount: Deactivated successfully.
Dec 02 09:36:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-86f9ede822be11b60c0a1703a4ec9607dd292d56847ed8465c37bae8fb9e0d08-merged.mount: Deactivated successfully.
Dec 02 09:36:49 np0005541913.localdomain sudo[241507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvxrosaxyrsucbgtjcnyzywxspsczeur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668208.9786806-2133-263509461995676/AnsiballZ_container_config_hash.py
Dec 02 09:36:49 np0005541913.localdomain sudo[241507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3127 DF PROTO=TCP SPT=50206 DPT=9102 SEQ=291053170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B8DE40000000001030307) 
Dec 02 09:36:49 np0005541913.localdomain python3.9[241509]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:36:49 np0005541913.localdomain sudo[241507]: pam_unix(sudo:session): session closed for user root
Dec 02 09:36:50 np0005541913.localdomain sudo[241617]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlpmhazhxwtyjmhgreypjajhtnbehauh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668209.860763-2163-184776366953957/AnsiballZ_edpm_container_manage.py
Dec 02 09:36:50 np0005541913.localdomain sudo[241617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:36:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:50 np0005541913.localdomain python3[241619]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:36:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:36:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:36:50 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:36:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:36:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:51 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:51.206 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 02 09:36:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d5dc9262725001f2f73a799452ce705d444359a7e34fc5a93c05c8a39696c355-merged.mount: Deactivated successfully.
Dec 02 09:36:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13417 DF PROTO=TCP SPT=46578 DPT=9101 SEQ=125172175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479B99800000000001030307) 
Dec 02 09:36:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:36:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:36:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:36:53 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:53.869 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:36:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:36:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13419 DF PROTO=TCP SPT=46578 DPT=9101 SEQ=125172175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479BA5A50000000001030307) 
Dec 02 09:36:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:36:56 np0005541913.localdomain podman[241670]: 2025-12-02 09:36:56.050599001 +0000 UTC m=+0.120058092 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:36:56 np0005541913.localdomain podman[241670]: 2025-12-02 09:36:56.061885297 +0000 UTC m=+0.131344418 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:36:56 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:56.240 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:36:57 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:36:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:58 np0005541913.localdomain podman[241632]: 2025-12-02 09:36:50.99776933 +0000 UTC m=+0.061539693 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 02 09:36:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:36:58 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:36:58.872 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:36:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13420 DF PROTO=TCP SPT=46578 DPT=9101 SEQ=125172175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479BB5640000000001030307) 
Dec 02 09:37:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699-merged.mount: Deactivated successfully.
Dec 02 09:37:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699-merged.mount: Deactivated successfully.
Dec 02 09:37:01 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:01.281 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:37:03.019 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:37:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:37:03.020 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:37:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:37:03.021 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:37:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:37:03 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:03.873 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14500 DF PROTO=TCP SPT=55096 DPT=9102 SEQ=3060013710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479BC73D0000000001030307) 
Dec 02 09:37:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56855 DF PROTO=TCP SPT=47846 DPT=9105 SEQ=3161961171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479BC7BE0000000001030307) 
Dec 02 09:37:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:06 np0005541913.localdomain podman[241728]: 2025-12-02 09:37:03.712074098 +0000 UTC m=+0.046945603 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 02 09:37:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:06.336 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14502 DF PROTO=TCP SPT=55096 DPT=9102 SEQ=3060013710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479BD3650000000001030307) 
Dec 02 09:37:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:37:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:07 np0005541913.localdomain podman[241739]: 2025-12-02 09:37:07.692775906 +0000 UTC m=+0.438644487 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:37:07 np0005541913.localdomain podman[241739]: 2025-12-02 09:37:07.735929925 +0000 UTC m=+0.481798426 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec 02 09:37:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:08 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:37:08 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:08 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:08 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:08 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:08.947 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:09 np0005541913.localdomain podman[240799]: time="2025-12-02T09:37:09Z" level=error msg="Getting root fs size for \"17e1761fa5bc0f54a6d8d8ad155d5bbe472ffac96dffb76c1739ddef9d187159\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": no such file or directory"
Dec 02 09:37:09 np0005541913.localdomain podman[241728]: 
Dec 02 09:37:09 np0005541913.localdomain podman[241728]: 2025-12-02 09:37:09.077678923 +0000 UTC m=+5.412550428 container create 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Dec 02 09:37:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25875 DF PROTO=TCP SPT=56198 DPT=9100 SEQ=770833956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479BDF250000000001030307) 
Dec 02 09:37:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:37:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 4776 writes, 21K keys, 4776 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4776 writes, 569 syncs, 8.39 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:37:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:10.888 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:10.888 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:10.909 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:10.910 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:37:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:10.910 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:37:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:11.383 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:37:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-aed02a8eef27d7fad5076c16a3501516599cfd6963ae4f4d75e8f0b164242bc5-merged.mount: Deactivated successfully.
Dec 02 09:37:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8867bb5d598ed9b36fb1a635e0a2434418503870d99edcec1a2221aec233d699-merged.mount: Deactivated successfully.
Dec 02 09:37:11 np0005541913.localdomain python3[241619]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 02 09:37:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:12.001 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:37:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:12.001 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:37:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:12.001 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:37:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:12.002 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:37:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-aed02a8eef27d7fad5076c16a3501516599cfd6963ae4f4d75e8f0b164242bc5-merged.mount: Deactivated successfully.
Dec 02 09:37:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21333 DF PROTO=TCP SPT=48768 DPT=9882 SEQ=3266517374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479BEB640000000001030307) 
Dec 02 09:37:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:37:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:37:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:13.958 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.058 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.079 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.080 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.081 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.081 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.081 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.082 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.082 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.082 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.083 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.083 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.108 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.108 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.109 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.109 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.110 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:37:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:37:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.777 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.667s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.826 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:37:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:14.826 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:37:15 np0005541913.localdomain podman[241773]: 2025-12-02 09:37:15.001549843 +0000 UTC m=+1.625273066 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:37:15 np0005541913.localdomain podman[241773]: 2025-12-02 09:37:15.007999548 +0000 UTC m=+1.631722801 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:37:15 np0005541913.localdomain podman[241773]: unhealthy
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.032 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.033 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12490MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.034 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.034 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:37:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:37:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.2 total, 600.0 interval
                                                          Cumulative writes: 5722 writes, 25K keys, 5722 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5722 writes, 780 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:37:15 np0005541913.localdomain podman[241774]: 2025-12-02 09:37:15.080180253 +0000 UTC m=+1.697141692 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.094 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.094 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.094 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.155 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:37:15 np0005541913.localdomain podman[241774]: 2025-12-02 09:37:15.164972838 +0000 UTC m=+1.781934247 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.608 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.613 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.630 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.632 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:37:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:15.632 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:37:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25877 DF PROTO=TCP SPT=56198 DPT=9100 SEQ=770833956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479BF6E40000000001030307) 
Dec 02 09:37:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:37:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:16.404 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:37:17 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:37:17 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:37:17 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:37:17 np0005541913.localdomain podman[241875]: 2025-12-02 09:37:17.450015866 +0000 UTC m=+1.089853389 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 09:37:17 np0005541913.localdomain podman[241875]: 2025-12-02 09:37:17.455581687 +0000 UTC m=+1.095419290 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 02 09:37:17 np0005541913.localdomain podman[241887]: 2025-12-02 09:37:17.494228973 +0000 UTC m=+0.298600485 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 02 09:37:17 np0005541913.localdomain podman[241887]: 2025-12-02 09:37:17.524207625 +0000 UTC m=+0.328579127 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:37:17 np0005541913.localdomain podman[241887]: unhealthy
Dec 02 09:37:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:18 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:18 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:37:18 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:37:18 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:37:18 np0005541913.localdomain sudo[241617]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:18.986 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56859 DF PROTO=TCP SPT=47846 DPT=9105 SEQ=3161961171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C03E40000000001030307) 
Dec 02 09:37:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:19 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:19 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:21 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:21.408 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2462 DF PROTO=TCP SPT=50564 DPT=9101 SEQ=2429369611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C0EB00000000001030307) 
Dec 02 09:37:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261-merged.mount: Deactivated successfully.
Dec 02 09:37:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:37:23 np0005541913.localdomain sudo[241929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:37:23 np0005541913.localdomain sudo[241929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:37:23 np0005541913.localdomain sudo[241929]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:37:23 np0005541913.localdomain sudo[241947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:37:23 np0005541913.localdomain sudo[241947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:37:24 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:24.039 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:24 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:37:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-aed02a8eef27d7fad5076c16a3501516599cfd6963ae4f4d75e8f0b164242bc5-merged.mount: Deactivated successfully.
Dec 02 09:37:24 np0005541913.localdomain sudo[241947]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2464 DF PROTO=TCP SPT=50564 DPT=9101 SEQ=2429369611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C1AA40000000001030307) 
Dec 02 09:37:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:37:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-082042a751b48593af3e4b42b09156dbc115dd133d7891319f3ff1ad0b672b0b-merged.mount: Deactivated successfully.
Dec 02 09:37:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-082042a751b48593af3e4b42b09156dbc115dd133d7891319f3ff1ad0b672b0b-merged.mount: Deactivated successfully.
Dec 02 09:37:26 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:26.413 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:26 np0005541913.localdomain sudo[241998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:37:26 np0005541913.localdomain sudo[241998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:37:26 np0005541913.localdomain sudo[241998]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 02 09:37:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 02 09:37:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:37:28 np0005541913.localdomain systemd[1]: tmp-crun.z7GZkF.mount: Deactivated successfully.
Dec 02 09:37:28 np0005541913.localdomain podman[242016]: 2025-12-02 09:37:28.471958349 +0000 UTC m=+0.104206152 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:37:28 np0005541913.localdomain podman[242016]: 2025-12-02 09:37:28.4808727 +0000 UTC m=+0.113120513 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:37:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:37:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 02 09:37:29 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:29.077 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2465 DF PROTO=TCP SPT=50564 DPT=9101 SEQ=2429369611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C2A640000000001030307) 
Dec 02 09:37:29 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:37:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:37:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:37:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:37:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:37:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:37:31 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:31.464 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 02 09:37:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47082 DF PROTO=TCP SPT=41114 DPT=9102 SEQ=3624587672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C3C6E0000000001030307) 
Dec 02 09:37:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15497 DF PROTO=TCP SPT=54356 DPT=9105 SEQ=1961061869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C3CEF0000000001030307) 
Dec 02 09:37:34 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:34.126 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 02 09:37:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 02 09:37:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 02 09:37:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-28a9a64287106d93235dbfecf490680361e8b3523afa4b3bf3ce2b25f0636261-merged.mount: Deactivated successfully.
Dec 02 09:37:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:37:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:37:36 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:36.469 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 02 09:37:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47084 DF PROTO=TCP SPT=41114 DPT=9102 SEQ=3624587672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C48640000000001030307) 
Dec 02 09:37:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 02 09:37:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:37:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-082042a751b48593af3e4b42b09156dbc115dd133d7891319f3ff1ad0b672b0b-merged.mount: Deactivated successfully.
Dec 02 09:37:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:37:38 np0005541913.localdomain systemd[1]: tmp-crun.x4JZaA.mount: Deactivated successfully.
Dec 02 09:37:38 np0005541913.localdomain podman[242037]: 2025-12-02 09:37:38.570666174 +0000 UTC m=+0.113658979 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Dec 02 09:37:38 np0005541913.localdomain podman[242037]: 2025-12-02 09:37:38.584215491 +0000 UTC m=+0.127208316 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd)
Dec 02 09:37:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:37:38 np0005541913.localdomain sudo[242148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwxsejzcnuzqisozkgirkvobkevtmhal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668258.7133756-2187-141858611970886/AnsiballZ_stat.py
Dec 02 09:37:38 np0005541913.localdomain sudo[242148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:37:39 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:39.187 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:39 np0005541913.localdomain python3.9[242150]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:37:39 np0005541913.localdomain sudo[242148]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 02 09:37:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 02 09:37:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 02 09:37:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 02 09:37:39 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:37:39 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:39 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24139 DF PROTO=TCP SPT=47806 DPT=9882 SEQ=3832826310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C53E50000000001030307) 
Dec 02 09:37:40 np0005541913.localdomain sudo[242260]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tprroofrckuyzxtbafglgicngtpssfrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668260.0119731-2214-226725802044309/AnsiballZ_file.py
Dec 02 09:37:40 np0005541913.localdomain sudo[242260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:37:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:37:40 np0005541913.localdomain python3.9[242262]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:37:40 np0005541913.localdomain sudo[242260]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:40 np0005541913.localdomain sudo[242369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxopczjbkljhiadkwvgjimshhxuailof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668260.522813-2214-223711509750356/AnsiballZ_copy.py
Dec 02 09:37:40 np0005541913.localdomain sudo[242369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:37:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:37:41 np0005541913.localdomain python3.9[242371]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668260.522813-2214-223711509750356/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:37:41 np0005541913.localdomain sudo[242369]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:41 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:41 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:41 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:41 np0005541913.localdomain sudo[242424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmnpymxegfbxjntuwonvcgohqmdhmlaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668260.522813-2214-223711509750356/AnsiballZ_systemd.py
Dec 02 09:37:41 np0005541913.localdomain sudo[242424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:37:41 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 02 09:37:41 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:41.490 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:41 np0005541913.localdomain python3.9[242426]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:37:41 np0005541913.localdomain systemd-rc-local-generator[242450]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:37:41 np0005541913.localdomain systemd-sysv-generator[242455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:37:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:37:41 np0005541913.localdomain sudo[242424]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:37:42 np0005541913.localdomain sudo[242514]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnresfpjtzbepugsppkoitrbbkcoesnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668260.522813-2214-223711509750356/AnsiballZ_systemd.py
Dec 02 09:37:42 np0005541913.localdomain sudo[242514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:37:42 np0005541913.localdomain python3.9[242516]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:37:42 np0005541913.localdomain systemd-sysv-generator[242551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:37:42 np0005541913.localdomain systemd-rc-local-generator[242545]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: Starting openstack_network_exporter container...
Dec 02 09:37:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:37:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5605 DF PROTO=TCP SPT=40590 DPT=9100 SEQ=1021608642 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C5FE50000000001030307) 
Dec 02 09:37:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:37:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 02 09:37:43 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:37:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e31213169be31eb4c808999caa7cbdd96c7b827b617cb9188c92117140ee895/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 09:37:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e31213169be31eb4c808999caa7cbdd96c7b827b617cb9188c92117140ee895/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 09:37:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:37:43 np0005541913.localdomain podman[242557]: 2025-12-02 09:37:43.493993774 +0000 UTC m=+0.616747069 container init 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *bridge.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *coverage.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *datapath.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *iface.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *memory.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *ovnnorthd.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *ovn.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *ovsdbserver.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *pmd_perf.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *pmd_rxq.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: INFO    09:37:43 main.go:48: registering *vswitch.Collector
Dec 02 09:37:43 np0005541913.localdomain openstack_network_exporter[242570]: NOTICE  09:37:43 main.go:82: listening on http://:9105/metrics
Dec 02 09:37:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:37:43 np0005541913.localdomain podman[242557]: 2025-12-02 09:37:43.545393147 +0000 UTC m=+0.668146422 container start 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350)
Dec 02 09:37:43 np0005541913.localdomain podman[242557]: openstack_network_exporter
Dec 02 09:37:44 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:44.221 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:37:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 02 09:37:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:37:44 np0005541913.localdomain systemd[1]: Started openstack_network_exporter container.
Dec 02 09:37:44 np0005541913.localdomain sudo[242514]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:44 np0005541913.localdomain podman[242580]: 2025-12-02 09:37:44.768226665 +0000 UTC m=+1.222918622 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, io.buildah.version=1.33.7, container_name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the 
minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public)
Dec 02 09:37:44 np0005541913.localdomain podman[242580]: 2025-12-02 09:37:44.803981273 +0000 UTC m=+1.258673200 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 02 09:37:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:37:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28637 DF PROTO=TCP SPT=56124 DPT=9100 SEQ=855657760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C6C250000000001030307) 
Dec 02 09:37:46 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:46.520 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 02 09:37:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 02 09:37:47 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:37:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:37:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:37:47 np0005541913.localdomain sudo[242709]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cutvgercsaiighbfkuaaltkixmvmrfws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668267.378147-2286-37862067678757/AnsiballZ_systemd.py
Dec 02 09:37:47 np0005541913.localdomain sudo[242709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:37:47 np0005541913.localdomain systemd[1]: tmp-crun.91n5DI.mount: Deactivated successfully.
Dec 02 09:37:47 np0005541913.localdomain podman[242712]: 2025-12-02 09:37:47.718467313 +0000 UTC m=+0.104927941 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:37:47 np0005541913.localdomain podman[242711]: 2025-12-02 09:37:47.69325358 +0000 UTC m=+0.085206327 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:37:47 np0005541913.localdomain podman[242711]: 2025-12-02 09:37:47.77594598 +0000 UTC m=+0.167898727 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:37:47 np0005541913.localdomain podman[242711]: unhealthy
Dec 02 09:37:47 np0005541913.localdomain podman[242712]: 2025-12-02 09:37:47.796370763 +0000 UTC m=+0.182831361 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:37:47 np0005541913.localdomain python3.9[242726]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:37:47 np0005541913.localdomain systemd[1]: Stopping openstack_network_exporter container...
Dec 02 09:37:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:37:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47086 DF PROTO=TCP SPT=41114 DPT=9102 SEQ=3624587672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C77E40000000001030307) 
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:37:49 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:49.225 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: libpod-6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.scope: Deactivated successfully.
Dec 02 09:37:49 np0005541913.localdomain podman[242759]: 2025-12-02 09:37:49.455861244 +0000 UTC m=+1.476709484 container died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.timer: Deactivated successfully.
Dec 02 09:37:49 np0005541913.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:37:49 np0005541913.localdomain podman[242770]: 2025-12-02 09:37:49.5203487 +0000 UTC m=+0.365607520 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 09:37:49 np0005541913.localdomain podman[242770]: 2025-12-02 09:37:49.600475609 +0000 UTC m=+0.445734419 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:37:49 np0005541913.localdomain podman[242770]: unhealthy
Dec 02 09:37:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2-userdata-shm.mount: Deactivated successfully.
Dec 02 09:37:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:37:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:37:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 02 09:37:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3e31213169be31eb4c808999caa7cbdd96c7b827b617cb9188c92117140ee895-merged.mount: Deactivated successfully.
Dec 02 09:37:51 np0005541913.localdomain podman[242759]: 2025-12-02 09:37:51.502052915 +0000 UTC m=+3.522901135 container cleanup 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:37:51 np0005541913.localdomain podman[242759]: openstack_network_exporter
Dec 02 09:37:51 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:37:51 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:37:51 np0005541913.localdomain podman[242797]: 2025-12-02 09:37:51.523016422 +0000 UTC m=+2.067164520 container cleanup 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 09:37:51 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:51.550 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33066 DF PROTO=TCP SPT=39396 DPT=9101 SEQ=4054965690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C83E00000000001030307) 
Dec 02 09:37:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:37:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:37:52 np0005541913.localdomain systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 02 09:37:52 np0005541913.localdomain podman[242817]: 2025-12-02 09:37:52.673310327 +0000 UTC m=+0.077359426 container cleanup 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, managed_by=edpm_ansible, distribution-scope=public, version=9.6, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 09:37:52 np0005541913.localdomain podman[242817]: openstack_network_exporter
Dec 02 09:37:53 np0005541913.localdomain systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 02 09:37:53 np0005541913.localdomain systemd[1]: Stopped openstack_network_exporter container.
Dec 02 09:37:53 np0005541913.localdomain systemd[1]: Starting openstack_network_exporter container...
Dec 02 09:37:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:37:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:37:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:37:54 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:37:54 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e31213169be31eb4c808999caa7cbdd96c7b827b617cb9188c92117140ee895/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 09:37:54 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e31213169be31eb4c808999caa7cbdd96c7b827b617cb9188c92117140ee895/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 09:37:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:37:54 np0005541913.localdomain podman[242830]: 2025-12-02 09:37:54.125774673 +0000 UTC m=+0.987835737 container init 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Dec 02 09:37:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:37:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b063472ae149eb518ac7d99c3a97d11dcdfc09eaeb34ff91e9c6e02d02ccc47e-merged.mount: Deactivated successfully.
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *bridge.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *coverage.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *datapath.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *iface.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *memory.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *ovnnorthd.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *ovn.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *ovsdbserver.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *pmd_perf.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *pmd_rxq.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: INFO    09:37:54 main.go:48: registering *vswitch.Collector
Dec 02 09:37:54 np0005541913.localdomain openstack_network_exporter[242845]: NOTICE  09:37:54 main.go:82: listening on http://:9105/metrics
Dec 02 09:37:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:37:54 np0005541913.localdomain podman[242830]: 2025-12-02 09:37:54.191102062 +0000 UTC m=+1.053163106 container start 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Dec 02 09:37:54 np0005541913.localdomain podman[242830]: openstack_network_exporter
Dec 02 09:37:54 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:54.263 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33068 DF PROTO=TCP SPT=39396 DPT=9101 SEQ=4054965690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C8FE40000000001030307) 
Dec 02 09:37:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 02 09:37:56 np0005541913.localdomain podman[242771]: 2025-12-02 09:37:56.012693561 +0000 UTC m=+6.855291749 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:37:56 np0005541913.localdomain podman[242771]: 2025-12-02 09:37:56.020975926 +0000 UTC m=+6.863574104 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 09:37:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:56 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:56.594 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:56 np0005541913.localdomain systemd[1]: Started openstack_network_exporter container.
Dec 02 09:37:56 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:37:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:37:56 np0005541913.localdomain podman[242855]: 2025-12-02 09:37:56.900830338 +0000 UTC m=+2.715346539 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter)
Dec 02 09:37:56 np0005541913.localdomain sudo[242709]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:56 np0005541913.localdomain podman[242855]: 2025-12-02 09:37:56.945116287 +0000 UTC m=+2.759632478 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 09:37:57 np0005541913.localdomain sudo[242987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qftdozwaxtfoecaanwfivbfwunyvhfyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668277.0786703-2310-218440532009312/AnsiballZ_find.py
Dec 02 09:37:57 np0005541913.localdomain sudo[242987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:37:57 np0005541913.localdomain python3.9[242989]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:37:57 np0005541913.localdomain sudo[242987]: pam_unix(sudo:session): session closed for user root
Dec 02 09:37:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:37:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:37:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:37:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:37:59 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:37:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:37:59 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:37:59.270 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:37:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33069 DF PROTO=TCP SPT=39396 DPT=9101 SEQ=4054965690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479C9FA40000000001030307) 
Dec 02 09:37:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:37:59 np0005541913.localdomain podman[243007]: 2025-12-02 09:37:59.968249984 +0000 UTC m=+0.105950884 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:37:59 np0005541913.localdomain podman[243007]: 2025-12-02 09:37:59.977900059 +0000 UTC m=+0.115600989 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:38:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:38:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:00 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:38:00 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:38:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:38:01 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:01 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:01 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:01.643 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:01 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:01 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:01 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:01 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:38:03.021 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:38:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:38:03.022 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:38:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:38:03.024 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:38:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 02 09:38:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b063472ae149eb518ac7d99c3a97d11dcdfc09eaeb34ff91e9c6e02d02ccc47e-merged.mount: Deactivated successfully.
Dec 02 09:38:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8519 DF PROTO=TCP SPT=48640 DPT=9102 SEQ=1433747981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479CB19E0000000001030307) 
Dec 02 09:38:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52751 DF PROTO=TCP SPT=36238 DPT=9105 SEQ=3035910122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479CB21F0000000001030307) 
Dec 02 09:38:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:04.273 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b-merged.mount: Deactivated successfully.
Dec 02 09:38:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Dec 02 09:38:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Dec 02 09:38:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:06.687 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8521 DF PROTO=TCP SPT=48640 DPT=9102 SEQ=1433747981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479CBDA40000000001030307) 
Dec 02 09:38:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:38:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 02 09:38:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:09.277 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:38:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:38:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:09 np0005541913.localdomain podman[243030]: 2025-12-02 09:38:09.845358335 +0000 UTC m=+0.077982687 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:38:09 np0005541913.localdomain podman[243030]: 2025-12-02 09:38:09.852806181 +0000 UTC m=+0.085430513 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:38:09 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51048 DF PROTO=TCP SPT=52238 DPT=9100 SEQ=4266852627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479CC9A40000000001030307) 
Dec 02 09:38:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:11 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:38:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:11.727 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:12 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:12 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:12 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44391 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1999135450 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479CD5A40000000001030307) 
Dec 02 09:38:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:14.316 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-236dcb696b3f9eafb040a89b33b195e1413d367dc5825ca1d6228a54c8e9179b-merged.mount: Deactivated successfully.
Dec 02 09:38:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:15.633 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:15.633 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:15.634 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:38:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:15.634 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:38:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Dec 02 09:38:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Dec 02 09:38:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-73f5af374b13f82b9f4d3d5847d5882ab5c5f129a64a44d0b3384933c5aad231-merged.mount: Deactivated successfully.
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.097 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.098 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.110 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.111 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.111 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.112 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.121 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6ecce42-5a10-4f87-aa35-cc847196b580', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:38:16.098858', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a0df5ae2-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.340538376, 'message_signature': 'd2774dc9a8107cfcb993b9e260c07ad57a81a1e4b3bbe8c2cbdb2f8ebe8c5654'}]}, 'timestamp': '2025-12-02 09:38:16.122542', '_unique_id': 'c8afbc06b28b42bca15bf76ff23a7bdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.141 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.142 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b81d2204-8ffa-4252-981e-17813fc3efec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.125899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0e25d8c-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.344987174, 'message_signature': 'a15b9a4f44b49d6b900853d33c4ae1287135409805c5116511f719e910bfc973'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.125899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0e270a6-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.344987174, 'message_signature': '8ef6e14b2a98d9f589950c661890cb5eaf816a022c6a351ab0bf0ab73b682bb2'}]}, 'timestamp': '2025-12-02 09:38:16.142746', '_unique_id': '550da06f79114e32ae25a701ccb796ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51050 DF PROTO=TCP SPT=52238 DPT=9100 SEQ=4266852627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479CE1650000000001030307) 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.149 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5b8839d-d38f-4d4d-a9cc-dc92ab90fee6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.145552', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0e38e64-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': '0a5250fcb0bc924861c12d3f550b8cdf7912832784e53154b92b43422579dd83'}]}, 'timestamp': '2025-12-02 09:38:16.150012', '_unique_id': '14a777fcece64f2493be2aa179698d5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.152 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.152 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e797a03-37cd-4ec8-aed4-f07a6e76ac91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.152280', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0e3f962-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.344987174, 'message_signature': 'bcae3847209586fbf28a3b044df8e99590ec16c103e2e0d38f7874b4fcee501a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.152280', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0e40be6-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.344987174, 'message_signature': 'f605037c6324293753e529acb402b879ebace42bf1b54d2c19b01044d4429600'}]}, 'timestamp': '2025-12-02 09:38:16.153188', '_unique_id': '268e3b2cc9924b90ae89d0bd38b3faf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.154 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.155 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.155 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 52850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3566c68d-6302-44be-a8a7-8cbdf26631d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52850000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:38:16.155435', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a0e47900-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.340538376, 'message_signature': 'b44dcf4d989b8902fd48aa34e117a1e9eb0cd40e13b70e868de79ee71dfea00f'}]}, 'timestamp': '2025-12-02 09:38:16.156027', '_unique_id': '8b194e450faf45618f10b849a91d69cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.158 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 8783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6028be7-a336-4493-8717-f7b87270ce01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8783, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.158207', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0e4e110-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': 'b937e98f7aa15fbeef6571e18693e39f0ed6bf9c83efacf58d4f3aeb4c88dfa9'}]}, 'timestamp': '2025-12-02 09:38:16.158723', '_unique_id': 'a644f847882b4ff4a9f30e8ece33872d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.162 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.164 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.164 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee025bdc-cac3-4274-833b-37a37c67ec82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.164166', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0e5c9d6-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': '81c4d780a5db47b985e4628343c4db746ffd945103c291f4c803ecbf3e7f32cf'}]}, 'timestamp': '2025-12-02 09:38:16.164674', '_unique_id': '5d6dc20003cc4390b53befedcf6d298a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.166 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1c9fc3d-8686-49c3-8a8c-d061abdb4976', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.166765', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0e62f34-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': '2e5e094a935fcfc6dc9a26b1f41920b4833bfe040c677c8b16c85e3346c798ae'}]}, 'timestamp': '2025-12-02 09:38:16.167222', '_unique_id': '73721932998a4078adb38e59ffa8657c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.169 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1433516318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 164656012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cf872b0-eff8-4a87-9b44-0db5fd6e194f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1433516318, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.169500', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0eb2caa-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '1f99dea48cb33d5bea98f8d27bab12e106f8d5312c36e3a7c3ab06535f8246b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164656012, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.169500', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0eb3f42-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '6c857372621062078efbafa0847c5037d20e97269f851a4b3dc15d307e182c2a'}]}, 'timestamp': '2025-12-02 09:38:16.200399', '_unique_id': '51bd9e41dc78460c96dd64d15e5f9d91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b8d5504-8280-486f-bceb-1fb8d725bfaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.203168', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0ebbe5e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': '5e09d59be23178776e005d962ea8761d79ef1359c12c5e40378fdcdb5d1ce6fd'}]}, 'timestamp': '2025-12-02 09:38:16.203730', '_unique_id': 'c2c4ac5d457444918156bae6c42fbe88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.206 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d780c92-ee5f-4eec-a2b2-aef5e646b499', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.206371', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0ec3c8a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': 'ffd195a38d367142af8606971ea2244a77c2a167127e5abe8bd49c410cab1803'}]}, 'timestamp': '2025-12-02 09:38:16.206918', '_unique_id': 'c25096b139d742d3b38368d51b72cf3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0206859-30d5-4238-945a-e8740da2cea8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.209177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0ecaa26-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': 'e9dc2de71091d198805e71eaed039e763b9a7df4ff5aae83f4dc24374c89a1aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.209177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0ecbef8-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '47514a3658d3ce30eb542e496835af3f4600ec5281408a0872cc4b0a0a10afcf'}]}, 'timestamp': '2025-12-02 09:38:16.210219', '_unique_id': '34756de4d79f4a6c928c7c892ce1c4a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '367858e2-aed3-4946-975d-28fd3a6ff3c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.212523', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0ed2d52-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.344987174, 'message_signature': 'bcc1607ee252fe4b46904790aea8eda822df0bcc41a031e4c09164bfc092aac0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.212523', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0ed3e8c-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.344987174, 'message_signature': 'e6110306fe1d2c187dff0876211bdfec65657dd6935ad2a211939ff86a5053ca'}]}, 'timestamp': '2025-12-02 09:38:16.213538', '_unique_id': '9ea4dc9cc793430a8dd26f849d68c805'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6446232f-b321-4ae7-bdac-51a6fbe278f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.215951', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0edb164-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': '9aeaf33f773c127fad6c49ff6c949f4e2ff6bcce54eab17a18896a2a25ee38e3'}]}, 'timestamp': '2025-12-02 09:38:16.216458', '_unique_id': '0662b3e7dc364ecda45c989d09b4ef8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.218 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e120c79f-84e6-47aa-b75e-4c323f130e85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.218861', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0ee2306-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': '1473138fa23fdded0c0d3d69b838f33cbfbae91ea3e9f5082ac183837bd32627'}]}, 'timestamp': '2025-12-02 09:38:16.219366', '_unique_id': '044c5e65bbd44f2f9b84b3d9264f0195'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.221 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.222 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '350959b4-e7f6-488b-9740-35a1662e0bfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.221765', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0ee9444-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '84a91c93e03c3c74dab4133fd65f5e104eefb4ffa2445c2ec24909bd3943c347'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.221765', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0eea560-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '224b32855001f0fed435dc4e1d0b3213a59b457fe79fef0cc0f633aad0e86e0d'}]}, 'timestamp': '2025-12-02 09:38:16.222732', '_unique_id': 'd60f75e4608547c0895f19ca5a097ec6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bd91006-1843-4721-a508-9fcbdbb96a4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.225063', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0ef14f0-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': 'bff7422d84f0d7016ce33323c1a75a861ac55b7adeabdf2e1acabc5405ffee0c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.225063', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0ef276a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '1903c9cb936b60f4d61fd5f427c76be8e555a99c694a586a17fee637b28f8fa8'}]}, 'timestamp': '2025-12-02 09:38:16.225999', '_unique_id': 'a81dcbbe10bd4b24be1ef3f01ded32f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.228 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.228 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.228 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.228 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.229 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fc941b0-1d16-42d0-bfac-8eb53971acc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.228661', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0efa1ae-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '8997bd0d6bc152ee3869f2da151fcbda0673609502a045094808de208f683ed7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.228661', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0efb28e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '7e51d1e798f6ab9899c794dbf0a51c9d83a1d72f3c63aead3c3f02ae2e34fd78'}]}, 'timestamp': '2025-12-02 09:38:16.229560', '_unique_id': '3376c75c9f2a41b39dac1cc45c05cba5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.230 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.232 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 286697561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.232 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 39228582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eeea6d2d-07d6-4212-aa49-310faa2203e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 286697561, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:38:16.232163', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0f02a48-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': 'fd8033c5980393e43fd8cfedf8d74fe5829f98a49d7e056ff516425288712707'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39228582, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:38:16.232163', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0f03c68-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.388594143, 'message_signature': '3912041db8fcc142e6f1a91143a24bd9ccb7a45ba23d5dc985ee87fcdfce82c7'}]}, 'timestamp': '2025-12-02 09:38:16.233089', '_unique_id': '5e692d70b0ce44188a3c7c9608fda5b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.235 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07405a4e-0390-42c6-9e20-f76ba7b12735', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.235390', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0f0a9aa-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': '9db99c1ad784ff8de34c6e2e7a999344a0e53eb5cc98774a12fd40777c3a9b12'}]}, 'timestamp': '2025-12-02 09:38:16.235920', '_unique_id': 'ce284444a61c47668d9b674a8ae0b5f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.238 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b47c8494-62c3-4a24-bafa-2f6e87c10474', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:38:16.238135', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a0f113b8-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10458.364670103, 'message_signature': '36823e4ea014bcd88c9639414260950e49a9be4a7de55689d661d59ea8774183'}]}, 'timestamp': '2025-12-02 09:38:16.238692', '_unique_id': '2315d6d721c345748875c4ec6f85d724'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:38:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:38:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.569 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.591 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.591 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.592 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.592 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.593 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.593 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.593 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.594 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.594 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.594 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.616 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.616 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.617 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.617 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.618 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:38:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:16.759 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.047 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.112 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.113 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.260 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.261 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12467MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.261 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.262 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.323 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.323 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.323 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.354 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.789 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.794 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.809 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.811 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:38:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:17.811 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:38:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:38:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 02 09:38:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52755 DF PROTO=TCP SPT=36238 DPT=9105 SEQ=3035910122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479CEDE50000000001030307) 
Dec 02 09:38:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 02 09:38:19 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:19.355 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:38:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:38:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:38:20 np0005541913.localdomain podman[243094]: 2025-12-02 09:38:20.449125542 +0000 UTC m=+0.088989888 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:38:20 np0005541913.localdomain podman[243094]: 2025-12-02 09:38:20.45400635 +0000 UTC m=+0.093870726 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:38:20 np0005541913.localdomain podman[243094]: unhealthy
Dec 02 09:38:20 np0005541913.localdomain systemd[1]: tmp-crun.BjJN6t.mount: Deactivated successfully.
Dec 02 09:38:20 np0005541913.localdomain podman[243095]: 2025-12-02 09:38:20.511201158 +0000 UTC m=+0.151667389 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:38:20 np0005541913.localdomain podman[243095]: 2025-12-02 09:38:20.594078263 +0000 UTC m=+0.234544454 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:38:21 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:38:21 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:38:21 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:38:21 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:21 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:21 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:21.793 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16028 DF PROTO=TCP SPT=60374 DPT=9101 SEQ=1975283777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479CF9100000000001030307) 
Dec 02 09:38:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:38:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:22 np0005541913.localdomain podman[240799]: time="2025-12-02T09:38:22Z" level=error msg="Getting root fs size for \"53e24d9c63e51791534ba5dbbc87c43b7936a07e3fdb5942196f097973e8e4db\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Dec 02 09:38:22 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:22 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:22 np0005541913.localdomain podman[243140]: 2025-12-02 09:38:22.446294671 +0000 UTC m=+0.113566125 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 09:38:22 np0005541913.localdomain podman[243140]: 2025-12-02 09:38:22.47811934 +0000 UTC m=+0.145390824 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Dec 02 09:38:22 np0005541913.localdomain podman[243140]: unhealthy
Dec 02 09:38:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:24 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:24.352 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 02 09:38:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-73f5af374b13f82b9f4d3d5847d5882ab5c5f129a64a44d0b3384933c5aad231-merged.mount: Deactivated successfully.
Dec 02 09:38:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-73f5af374b13f82b9f4d3d5847d5882ab5c5f129a64a44d0b3384933c5aad231-merged.mount: Deactivated successfully.
Dec 02 09:38:24 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:38:24 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:38:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16030 DF PROTO=TCP SPT=60374 DPT=9101 SEQ=1975283777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D05240000000001030307) 
Dec 02 09:38:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:26 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:26.795 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:27 np0005541913.localdomain sudo[243157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:38:27 np0005541913.localdomain sudo[243157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:38:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:38:27 np0005541913.localdomain sudo[243157]: pam_unix(sudo:session): session closed for user root
Dec 02 09:38:27 np0005541913.localdomain sudo[243176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:38:27 np0005541913.localdomain sudo[243176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:38:27 np0005541913.localdomain podman[243175]: 2025-12-02 09:38:27.160691422 +0000 UTC m=+0.092948591 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 02 09:38:27 np0005541913.localdomain podman[243175]: 2025-12-02 09:38:27.192047089 +0000 UTC m=+0.124304258 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 02 09:38:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0b94aabaeee0e41c77050836c47e281aafe3b0b49cec59de508354f5d2967adc-merged.mount: Deactivated successfully.
Dec 02 09:38:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0b94aabaeee0e41c77050836c47e281aafe3b0b49cec59de508354f5d2967adc-merged.mount: Deactivated successfully.
Dec 02 09:38:28 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:38:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16031 DF PROTO=TCP SPT=60374 DPT=9101 SEQ=1975283777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D14E40000000001030307) 
Dec 02 09:38:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:38:29 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:29.394 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:38:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:38:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:38:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:31 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:31 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:31 np0005541913.localdomain podman[243226]: 2025-12-02 09:38:31.729117165 +0000 UTC m=+2.322298141 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public)
Dec 02 09:38:31 np0005541913.localdomain podman[243226]: 2025-12-02 09:38:31.832876801 +0000 UTC m=+2.426057807 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, version=9.6)
Dec 02 09:38:31 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:31.840 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:31 np0005541913.localdomain podman[243239]: 2025-12-02 09:38:31.850756272 +0000 UTC m=+0.592821190 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:38:31 np0005541913.localdomain podman[243239]: 2025-12-02 09:38:31.858236589 +0000 UTC m=+0.600301517 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:38:32 np0005541913.localdomain sudo[243176]: pam_unix(sudo:session): session closed for user root
Dec 02 09:38:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:32 np0005541913.localdomain sudo[243288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:38:32 np0005541913.localdomain sudo[243288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:38:32 np0005541913.localdomain sudo[243288]: pam_unix(sudo:session): session closed for user root
Dec 02 09:38:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7694 DF PROTO=TCP SPT=47384 DPT=9102 SEQ=1021865195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D26CD0000000001030307) 
Dec 02 09:38:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:34 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:34 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:34 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:38:34 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:38:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20598 DF PROTO=TCP SPT=46078 DPT=9105 SEQ=1180032002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D274F0000000001030307) 
Dec 02 09:38:34 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:34.448 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:38:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-804deff8dacbfc312114476fef5e5066b58626df118d8072d88e0a05fadba7d2-merged.mount: Deactivated successfully.
Dec 02 09:38:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:36 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:36 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:36 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 02 09:38:36 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:36.867 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7696 DF PROTO=TCP SPT=47384 DPT=9102 SEQ=1021865195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D32E50000000001030307) 
Dec 02 09:38:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:38:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:38:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0b94aabaeee0e41c77050836c47e281aafe3b0b49cec59de508354f5d2967adc-merged.mount: Deactivated successfully.
Dec 02 09:38:39 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:39.449 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 02 09:38:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a7c14a8989e4d415fd166e88f713e89b4166ed5e691e2325b8968269ca1a9aa5-merged.mount: Deactivated successfully.
Dec 02 09:38:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:38:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64456 DF PROTO=TCP SPT=37874 DPT=9100 SEQ=3955781839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D3EA50000000001030307) 
Dec 02 09:38:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:38:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:38:41 np0005541913.localdomain podman[243306]: 2025-12-02 09:38:41.465237109 +0000 UTC m=+0.105218834 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 09:38:41 np0005541913.localdomain podman[243306]: 2025-12-02 09:38:41.482188226 +0000 UTC m=+0.122169951 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 09:38:41 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:41.908 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:38:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-804deff8dacbfc312114476fef5e5066b58626df118d8072d88e0a05fadba7d2-merged.mount: Deactivated successfully.
Dec 02 09:38:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:42 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:38:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9829 DF PROTO=TCP SPT=45218 DPT=9882 SEQ=2512201313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D4AE40000000001030307) 
Dec 02 09:38:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:38:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 02 09:38:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 02 09:38:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:44 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:44.476 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:38:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 02 09:38:44 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:45 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:45 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64458 DF PROTO=TCP SPT=37874 DPT=9100 SEQ=3955781839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D56640000000001030307) 
Dec 02 09:38:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 02 09:38:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a7c14a8989e4d415fd166e88f713e89b4166ed5e691e2325b8968269ca1a9aa5-merged.mount: Deactivated successfully.
Dec 02 09:38:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:46 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:46.945 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:47 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:49 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:49.509 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20602 DF PROTO=TCP SPT=46078 DPT=9105 SEQ=1180032002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D63E40000000001030307) 
Dec 02 09:38:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a55b3f6a1664d44c5b07239ef1b7cbc40e1e222dc48149b5fd43766f0362bb85-merged.mount: Deactivated successfully.
Dec 02 09:38:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a55b3f6a1664d44c5b07239ef1b7cbc40e1e222dc48149b5fd43766f0362bb85-merged.mount: Deactivated successfully.
Dec 02 09:38:50 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:50 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:38:50 np0005541913.localdomain podman[240799]: time="2025-12-02T09:38:50Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/merged: device or resource busy"
Dec 02 09:38:50 np0005541913.localdomain podman[240799]: time="2025-12-02T09:38:50Z" level=error msg="Getting root fs size for \"6dbd36b1217b2142dd84dc17ea35f1612e9d8de121805c102bfb7395aec4018f\": getting diffsize of layer \"3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34\" and its parent \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\": creating overlay mount to /var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/7ZIUCCDB7QIVI2FKGMJG2V3343:/var/lib/containers/storage/overlay/l/SVA2C5DVPQAVJLYG7FZGJBLHH5:/var/lib/containers/storage/overlay/l/S22KQKMHBS35Z24VPC7PIA3U37:/var/lib/containers/storage/overlay/l/QKXJKO4MENC2JONUN57ZG7NBIN,upperdir=/var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/diff,workdir=/var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/work,nodev,metacopy=on\": no such file or directory"
Dec 02 09:38:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:38:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:38:51 np0005541913.localdomain systemd[1]: tmp-crun.fnLClD.mount: Deactivated successfully.
Dec 02 09:38:51 np0005541913.localdomain podman[243326]: 2025-12-02 09:38:51.490857185 +0000 UTC m=+0.125958472 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:38:51 np0005541913.localdomain podman[243325]: 2025-12-02 09:38:51.473520608 +0000 UTC m=+0.113164895 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:38:51 np0005541913.localdomain podman[243326]: 2025-12-02 09:38:51.530980843 +0000 UTC m=+0.166082040 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 09:38:51 np0005541913.localdomain podman[243325]: 2025-12-02 09:38:51.556066984 +0000 UTC m=+0.195711291 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:38:51 np0005541913.localdomain podman[243325]: unhealthy
Dec 02 09:38:51 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:51.948 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14923 DF PROTO=TCP SPT=59076 DPT=9101 SEQ=4131444081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D6E400000000001030307) 
Dec 02 09:38:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a55b3f6a1664d44c5b07239ef1b7cbc40e1e222dc48149b5fd43766f0362bb85-merged.mount: Deactivated successfully.
Dec 02 09:38:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a55b3f6a1664d44c5b07239ef1b7cbc40e1e222dc48149b5fd43766f0362bb85-merged.mount: Deactivated successfully.
Dec 02 09:38:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:38:53 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:38:53 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:38:53 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:38:54 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:54.513 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:38:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:54 np0005541913.localdomain systemd[1]: tmp-crun.r7OtKt.mount: Deactivated successfully.
Dec 02 09:38:54 np0005541913.localdomain podman[243370]: 2025-12-02 09:38:54.86330306 +0000 UTC m=+0.084793427 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 09:38:54 np0005541913.localdomain podman[243370]: 2025-12-02 09:38:54.894214405 +0000 UTC m=+0.115704782 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:38:54 np0005541913.localdomain podman[243370]: unhealthy
Dec 02 09:38:55 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:38:55 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:38:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14925 DF PROTO=TCP SPT=59076 DPT=9101 SEQ=4131444081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D7A640000000001030307) 
Dec 02 09:38:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:38:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:38:56 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:56.954 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:38:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:38:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:38:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:38:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14926 DF PROTO=TCP SPT=59076 DPT=9101 SEQ=4131444081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D8A240000000001030307) 
Dec 02 09:38:59 np0005541913.localdomain systemd[1]: tmp-crun.eZtyHe.mount: Deactivated successfully.
Dec 02 09:38:59 np0005541913.localdomain podman[243386]: 2025-12-02 09:38:59.471040639 +0000 UTC m=+0.107590577 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:38:59 np0005541913.localdomain podman[243386]: 2025-12-02 09:38:59.476831252 +0000 UTC m=+0.113381220 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:38:59 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:38:59.560 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:39:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996-merged.mount: Deactivated successfully.
Dec 02 09:39:00 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:39:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:01 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:01.987 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:39:03.022 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:39:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:39:03.023 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:39:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:39:03.024 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:39:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-083325a356d009687825873f5ef80d42d8ec3a9c9ef25c5a97dbce5b8f99fa32-merged.mount: Deactivated successfully.
Dec 02 09:39:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:39:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:39:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4196 DF PROTO=TCP SPT=58928 DPT=9102 SEQ=618966944 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D9BFE0000000001030307) 
Dec 02 09:39:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34191 DF PROTO=TCP SPT=45470 DPT=9105 SEQ=1061669575 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479D9C7F0000000001030307) 
Dec 02 09:39:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:39:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:39:04 np0005541913.localdomain podman[243404]: 2025-12-02 09:39:04.461485915 +0000 UTC m=+0.090686452 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Dec 02 09:39:04 np0005541913.localdomain podman[243404]: 2025-12-02 09:39:04.473884568 +0000 UTC m=+0.103085035 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Dec 02 09:39:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:39:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:04.598 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:39:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:39:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:39:05 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:39:05 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:05 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:05 np0005541913.localdomain podman[243405]: 2025-12-02 09:39:05.66365638 +0000 UTC m=+1.288633694 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:39:05 np0005541913.localdomain podman[243405]: 2025-12-02 09:39:05.700090421 +0000 UTC m=+1.325067695 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:39:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:39:06Z" level=error msg="Getting root fs size for \"79cf949874d5e3aaf0f7dbc88df9f119a232e5884c4318bbe279da712f335e16\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Dec 02 09:39:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:06 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:39:06 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:06 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4198 DF PROTO=TCP SPT=58928 DPT=9102 SEQ=618966944 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479DA8240000000001030307) 
Dec 02 09:39:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:07.043 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:39:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0052f13d91303294194500e25d2f8e0888afaf1ca7e6de5d98fbefe304631472-merged.mount: Deactivated successfully.
Dec 02 09:39:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:09.639 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16771 DF PROTO=TCP SPT=45756 DPT=9100 SEQ=2475425474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479DB3E50000000001030307) 
Dec 02 09:39:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-adc9ccf45b0c7149995a619e9f57f17685eac5ade5b4374b2581744148a02996-merged.mount: Deactivated successfully.
Dec 02 09:39:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:39:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:39:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:11.895 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:11.895 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:11.933 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:11.933 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:39:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:11.933 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.047 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.055 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.055 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.055 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.055 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.412 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.426 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.426 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.427 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.427 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.427 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.428 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.428 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.428 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.429 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.429 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.443 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.444 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.444 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.444 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.445 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:39:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:39:12 np0005541913.localdomain systemd[1]: tmp-crun.ydA3bb.mount: Deactivated successfully.
Dec 02 09:39:12 np0005541913.localdomain podman[243463]: 2025-12-02 09:39:12.687385107 +0000 UTC m=+0.084359072 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:39:12 np0005541913.localdomain podman[243463]: 2025-12-02 09:39:12.696940865 +0000 UTC m=+0.093914840 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:39:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:12.911 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.051 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.052 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:39:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51053 DF PROTO=TCP SPT=52238 DPT=9100 SEQ=4266852627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479DBFE40000000001030307) 
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.219 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.220 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12432MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.221 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.221 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.311 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.312 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.313 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.347 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:39:13 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:39:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.775 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.781 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.798 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.801 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:39:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:13.801 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:39:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-083325a356d009687825873f5ef80d42d8ec3a9c9ef25c5a97dbce5b8f99fa32-merged.mount: Deactivated successfully.
Dec 02 09:39:14 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-083325a356d009687825873f5ef80d42d8ec3a9c9ef25c5a97dbce5b8f99fa32-merged.mount: Deactivated successfully.
Dec 02 09:39:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:14.701 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:39:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:15 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:15 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:15 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:39:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:15 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:15 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16773 DF PROTO=TCP SPT=45756 DPT=9100 SEQ=2475425474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479DCBA50000000001030307) 
Dec 02 09:39:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:17.050 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:17 np0005541913.localdomain sshd[230979]: Received disconnect from 192.168.122.30 port 38172:11: disconnected by user
Dec 02 09:39:17 np0005541913.localdomain sshd[230979]: Disconnected from user zuul 192.168.122.30 port 38172
Dec 02 09:39:17 np0005541913.localdomain sshd[230976]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:39:17 np0005541913.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Dec 02 09:39:17 np0005541913.localdomain systemd[1]: session-56.scope: Consumed 57.745s CPU time.
Dec 02 09:39:17 np0005541913.localdomain systemd-logind[757]: Session 56 logged out. Waiting for processes to exit.
Dec 02 09:39:17 np0005541913.localdomain systemd-logind[757]: Removed session 56.
Dec 02 09:39:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:39:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-104925f4f3140d86c4d76991cbbe20b0ea2114e629deebdf08f0de90504ded5f-merged.mount: Deactivated successfully.
Dec 02 09:39:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-104925f4f3140d86c4d76991cbbe20b0ea2114e629deebdf08f0de90504ded5f-merged.mount: Deactivated successfully.
Dec 02 09:39:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:39:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34195 DF PROTO=TCP SPT=45470 DPT=9105 SEQ=1061669575 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479DD7E50000000001030307) 
Dec 02 09:39:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0052f13d91303294194500e25d2f8e0888afaf1ca7e6de5d98fbefe304631472-merged.mount: Deactivated successfully.
Dec 02 09:39:19 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:19.705 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:39:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:39:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:39:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:39:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:39:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:39:21 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:22 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:22.053 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:22 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60055 DF PROTO=TCP SPT=49338 DPT=9101 SEQ=2437125141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479DE3700000000001030307) 
Dec 02 09:39:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:39:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:23 np0005541913.localdomain sshd[243505]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:39:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:39:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:39:23 np0005541913.localdomain sshd[243505]: Accepted publickey for zuul from 192.168.122.30 port 47590 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:39:23 np0005541913.localdomain systemd-logind[757]: New session 57 of user zuul.
Dec 02 09:39:23 np0005541913.localdomain systemd[1]: Started Session 57 of User zuul.
Dec 02 09:39:23 np0005541913.localdomain sshd[243505]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:39:23 np0005541913.localdomain podman[243508]: 2025-12-02 09:39:23.479705924 +0000 UTC m=+0.111331318 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:39:23 np0005541913.localdomain podman[243508]: 2025-12-02 09:39:23.511912331 +0000 UTC m=+0.143537725 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:39:23 np0005541913.localdomain podman[243507]: 2025-12-02 09:39:23.462753988 +0000 UTC m=+0.099755467 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:39:23 np0005541913.localdomain podman[243507]: 2025-12-02 09:39:23.597071394 +0000 UTC m=+0.234072863 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:39:23 np0005541913.localdomain podman[243507]: unhealthy
Dec 02 09:39:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:23 np0005541913.localdomain sudo[243646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jikxnumimnydgibhrkjeubigyaoeiqqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668363.4947019-2568-224670861388072/AnsiballZ_podman_container_info.py
Dec 02 09:39:23 np0005541913.localdomain sudo[243646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:23 np0005541913.localdomain python3.9[243648]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 02 09:39:24 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:39:24 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:39:24 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:24 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:24 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:39:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:39:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:24 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:24.707 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60057 DF PROTO=TCP SPT=49338 DPT=9101 SEQ=2437125141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479DEF650000000001030307) 
Dec 02 09:39:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:39:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:25 np0005541913.localdomain podman[243661]: 2025-12-02 09:39:25.702074566 +0000 UTC m=+0.334061414 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 09:39:25 np0005541913.localdomain podman[243661]: 2025-12-02 09:39:25.735028473 +0000 UTC m=+0.367015301 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 09:39:25 np0005541913.localdomain podman[243661]: unhealthy
Dec 02 09:39:27 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:27.057 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:39:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-fddcd6dd4df186203ff55efce1dca7750680c9de7878dc7d77dfefe109af9b62-merged.mount: Deactivated successfully.
Dec 02 09:39:27 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:39:27 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:39:27 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:27 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:28 np0005541913.localdomain sudo[243646]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:28 np0005541913.localdomain sudo[243787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tycfisyxpfelmcsxgddsbigprnvhvsfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668368.4466963-2579-34415775125280/AnsiballZ_podman_container_exec.py
Dec 02 09:39:28 np0005541913.localdomain sudo[243787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:28 np0005541913.localdomain python3.9[243789]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:39:28 np0005541913.localdomain systemd[1]: Started libpod-conmon-cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.scope.
Dec 02 09:39:29 np0005541913.localdomain podman[243790]: 2025-12-02 09:39:29.010697723 +0000 UTC m=+0.119330943 container exec cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 09:39:29 np0005541913.localdomain podman[243790]: 2025-12-02 09:39:29.017952488 +0000 UTC m=+0.126585698 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 02 09:39:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60058 DF PROTO=TCP SPT=49338 DPT=9101 SEQ=2437125141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479DFF250000000001030307) 
Dec 02 09:39:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:29 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:29.709 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:39:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:39:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:39:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:39:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-104925f4f3140d86c4d76991cbbe20b0ea2114e629deebdf08f0de90504ded5f-merged.mount: Deactivated successfully.
Dec 02 09:39:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-104925f4f3140d86c4d76991cbbe20b0ea2114e629deebdf08f0de90504ded5f-merged.mount: Deactivated successfully.
Dec 02 09:39:31 np0005541913.localdomain sudo[243787]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:31 np0005541913.localdomain podman[243819]: 2025-12-02 09:39:31.399362732 +0000 UTC m=+0.401160681 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 09:39:31 np0005541913.localdomain podman[243819]: 2025-12-02 09:39:31.432027512 +0000 UTC m=+0.433825441 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 09:39:31 np0005541913.localdomain sudo[243943]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puciomqpomwswpduqlwkhjifqnhnbbig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668371.4856603-2587-232717015259090/AnsiballZ_podman_container_exec.py
Dec 02 09:39:31 np0005541913.localdomain sudo[243943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:31 np0005541913.localdomain python3.9[243945]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:39:32 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:32.066 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:39:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:39:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:39:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:33 np0005541913.localdomain systemd[1]: libpod-conmon-cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.scope: Deactivated successfully.
Dec 02 09:39:33 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:39:33 np0005541913.localdomain systemd[1]: Started libpod-conmon-cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.scope.
Dec 02 09:39:33 np0005541913.localdomain podman[243946]: 2025-12-02 09:39:33.668966635 +0000 UTC m=+1.729577995 container exec cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 09:39:33 np0005541913.localdomain sudo[243960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:39:33 np0005541913.localdomain sudo[243960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:39:33 np0005541913.localdomain sudo[243960]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:33 np0005541913.localdomain podman[243946]: 2025-12-02 09:39:33.705953421 +0000 UTC m=+1.766564731 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:39:33 np0005541913.localdomain sudo[243994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:39:33 np0005541913.localdomain sudo[243994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:39:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49345 DF PROTO=TCP SPT=39090 DPT=9102 SEQ=4183493143 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E112D0000000001030307) 
Dec 02 09:39:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32679 DF PROTO=TCP SPT=55994 DPT=9105 SEQ=795964152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E11AE0000000001030307) 
Dec 02 09:39:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:34 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:34.712 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:39:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:39:35 np0005541913.localdomain systemd[1]: libpod-conmon-cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.scope: Deactivated successfully.
Dec 02 09:39:35 np0005541913.localdomain sudo[243943]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:35 np0005541913.localdomain sudo[243994]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:35 np0005541913.localdomain sudo[244151]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncperkkynplopwwsgevrlsccnwebqtmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668375.4893985-2595-186438327918038/AnsiballZ_file.py
Dec 02 09:39:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:39:35 np0005541913.localdomain sudo[244151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:35 np0005541913.localdomain podman[244153]: 2025-12-02 09:39:35.790857323 +0000 UTC m=+0.076097881 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.expose-services=)
Dec 02 09:39:35 np0005541913.localdomain podman[244153]: 2025-12-02 09:39:35.827078357 +0000 UTC m=+0.112318925 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 09:39:35 np0005541913.localdomain python3.9[244154]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:39:35 np0005541913.localdomain sudo[244151]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:39:36Z" level=error msg="Getting root fs size for \"7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 02 09:39:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:36 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:39:36 np0005541913.localdomain sudo[244280]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhdbbnibqzqznpgmsdxmekpiaercllom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668376.096078-2604-163563543435428/AnsiballZ_podman_container_info.py
Dec 02 09:39:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:39:36 np0005541913.localdomain sudo[244280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49347 DF PROTO=TCP SPT=39090 DPT=9102 SEQ=4183493143 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E1D240000000001030307) 
Dec 02 09:39:37 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:37.128 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:37 np0005541913.localdomain podman[244282]: 2025-12-02 09:39:37.13872243 +0000 UTC m=+0.157977844 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:39:37 np0005541913.localdomain podman[244282]: 2025-12-02 09:39:37.176932969 +0000 UTC m=+0.196188393 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:39:37 np0005541913.localdomain python3.9[244283]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 02 09:39:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:37 np0005541913.localdomain sudo[244315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:39:37 np0005541913.localdomain sudo[244315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:39:37 np0005541913.localdomain sudo[244315]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:39:38 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:39:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Dec 02 09:39:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Dec 02 09:39:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0438ade5aeea533b00cd75095bec75fbc2b307bace4c89bb39b75d428637bcd8-merged.mount: Deactivated successfully.
Dec 02 09:39:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Dec 02 09:39:39 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:39.716 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53002 DF PROTO=TCP SPT=53058 DPT=9100 SEQ=752504019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E29240000000001030307) 
Dec 02 09:39:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:39:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-fddcd6dd4df186203ff55efce1dca7750680c9de7878dc7d77dfefe109af9b62-merged.mount: Deactivated successfully.
Dec 02 09:39:40 np0005541913.localdomain sudo[244280]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0438ade5aeea533b00cd75095bec75fbc2b307bace4c89bb39b75d428637bcd8-merged.mount: Deactivated successfully.
Dec 02 09:39:41 np0005541913.localdomain sudo[244442]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkmkkpqcpqqaouhlxbnnuwgzbrovjrua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668381.0790899-2612-20933023528845/AnsiballZ_podman_container_exec.py
Dec 02 09:39:41 np0005541913.localdomain sudo[244442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:41 np0005541913.localdomain python3.9[244444]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:39:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.scope.
Dec 02 09:39:41 np0005541913.localdomain podman[244445]: 2025-12-02 09:39:41.730256156 +0000 UTC m=+0.139596059 container exec 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 09:39:41 np0005541913.localdomain podman[244445]: 2025-12-02 09:39:41.763020848 +0000 UTC m=+0.172360741 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 09:39:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:42 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:42.130 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37163 DF PROTO=TCP SPT=52760 DPT=9882 SEQ=2578111422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E35250000000001030307) 
Dec 02 09:39:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:39:43 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:43 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:39:43 np0005541913.localdomain sudo[244442]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:44 np0005541913.localdomain sudo[244593]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlffcwszceybhdwwwdxymzpnivgvhihd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668383.7533906-2620-38531654015189/AnsiballZ_podman_container_exec.py
Dec 02 09:39:44 np0005541913.localdomain sudo[244593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:44 np0005541913.localdomain python3.9[244595]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:39:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:44 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:44.715 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:45 np0005541913.localdomain systemd[1]: libpod-conmon-34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.scope: Deactivated successfully.
Dec 02 09:39:45 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:45 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:46 np0005541913.localdomain podman[244475]: 2025-12-02 09:39:46.03127678 +0000 UTC m=+2.435872721 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd)
Dec 02 09:39:46 np0005541913.localdomain podman[244475]: 2025-12-02 09:39:46.07804779 +0000 UTC m=+2.482643701 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:39:46 np0005541913.localdomain systemd[1]: Started libpod-conmon-34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.scope.
Dec 02 09:39:46 np0005541913.localdomain podman[244596]: 2025-12-02 09:39:46.108280223 +0000 UTC m=+1.850345927 container exec 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:39:46 np0005541913.localdomain podman[244596]: 2025-12-02 09:39:46.139123484 +0000 UTC m=+1.881189198 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 02 09:39:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53004 DF PROTO=TCP SPT=53058 DPT=9100 SEQ=752504019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E40E40000000001030307) 
Dec 02 09:39:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:47 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:47.189 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:47 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:39:47 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:47 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:47 np0005541913.localdomain sudo[244593]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:47 np0005541913.localdomain sudo[244740]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbordqrphgsfqqambejmfyxvnvakmqkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668387.4450004-2628-255309249333439/AnsiballZ_file.py
Dec 02 09:39:47 np0005541913.localdomain sudo[244740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:47 np0005541913.localdomain python3.9[244742]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:39:47 np0005541913.localdomain sudo[244740]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:48 np0005541913.localdomain systemd[1]: libpod-conmon-34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.scope: Deactivated successfully.
Dec 02 09:39:48 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:48 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:48 np0005541913.localdomain sudo[244850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfezbnperhanphujyfptarooergaqfzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668388.1538393-2637-191653040104466/AnsiballZ_podman_container_info.py
Dec 02 09:39:48 np0005541913.localdomain sudo[244850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d1605e3642cbc6f4a340468563ba343adf6d0f8a3115728727d8e4543418cb20-merged.mount: Deactivated successfully.
Dec 02 09:39:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:48 np0005541913.localdomain python3.9[244852]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 02 09:39:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32683 DF PROTO=TCP SPT=55994 DPT=9105 SEQ=795964152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E4DE40000000001030307) 
Dec 02 09:39:49 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:49.718 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:39:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:39:52 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:52.193 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21315 DF PROTO=TCP SPT=40858 DPT=9101 SEQ=906033053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E58A00000000001030307) 
Dec 02 09:39:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:39:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:52 np0005541913.localdomain sudo[244850]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:53 np0005541913.localdomain sudo[244973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bepruqcgyttegcnqftcayzusmhxrwzaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668393.0619638-2645-240945672616085/AnsiballZ_podman_container_exec.py
Dec 02 09:39:53 np0005541913.localdomain sudo[244973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Dec 02 09:39:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Dec 02 09:39:53 np0005541913.localdomain python3.9[244975]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:39:53 np0005541913.localdomain systemd[1]: Started libpod-conmon-f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.scope.
Dec 02 09:39:53 np0005541913.localdomain podman[244976]: 2025-12-02 09:39:53.685799759 +0000 UTC m=+0.119297102 container exec f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:39:53 np0005541913.localdomain podman[244976]: 2025-12-02 09:39:53.719131037 +0000 UTC m=+0.152628340 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:39:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:39:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:54 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:54 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:54 np0005541913.localdomain systemd[1]: libpod-conmon-f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.scope: Deactivated successfully.
Dec 02 09:39:54 np0005541913.localdomain sudo[244973]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:39:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:39:54 np0005541913.localdomain podman[245061]: 2025-12-02 09:39:54.555226327 +0000 UTC m=+0.074853956 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:39:54 np0005541913.localdomain podman[245062]: 2025-12-02 09:39:54.612887019 +0000 UTC m=+0.129080056 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:39:54 np0005541913.localdomain podman[245061]: 2025-12-02 09:39:54.662015922 +0000 UTC m=+0.181643531 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:39:54 np0005541913.localdomain podman[245061]: unhealthy
Dec 02 09:39:54 np0005541913.localdomain podman[245062]: 2025-12-02 09:39:54.684556039 +0000 UTC m=+0.200749156 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 02 09:39:54 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:54.719 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:54 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:54 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:54 np0005541913.localdomain podman[240799]: time="2025-12-02T09:39:54Z" level=error msg="Getting root fs size for \"8a2f19c4db6822ef526d58aece58b61f7cc1170e0d395c0acd9eda9b1c2e9d60\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:39:55 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:55 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:39:55 np0005541913.localdomain sudo[245156]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmaqwjeurorrmnksfpqtncfrjgtnmtkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668394.3653688-2653-182243342567182/AnsiballZ_podman_container_exec.py
Dec 02 09:39:55 np0005541913.localdomain sudo[245156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21317 DF PROTO=TCP SPT=40858 DPT=9101 SEQ=906033053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E64A40000000001030307) 
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:39:55 np0005541913.localdomain python3.9[245158]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: Started libpod-conmon-f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.scope.
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: tmp-crun.4L7fj8.mount: Deactivated successfully.
Dec 02 09:39:55 np0005541913.localdomain podman[245159]: 2025-12-02 09:39:55.668571711 +0000 UTC m=+0.121119062 container exec f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd)
Dec 02 09:39:55 np0005541913.localdomain podman[245159]: 2025-12-02 09:39:55.701046346 +0000 UTC m=+0.153593647 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 02 09:39:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d1605e3642cbc6f4a340468563ba343adf6d0f8a3115728727d8e4543418cb20-merged.mount: Deactivated successfully.
Dec 02 09:39:57 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:57.233 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:39:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:39:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a423cc2ecc4b4a7a413eebe91da3e0f5986adaa9cffa0acabc604ad76a95339a-merged.mount: Deactivated successfully.
Dec 02 09:39:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a423cc2ecc4b4a7a413eebe91da3e0f5986adaa9cffa0acabc604ad76a95339a-merged.mount: Deactivated successfully.
Dec 02 09:39:57 np0005541913.localdomain sudo[245156]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:57 np0005541913.localdomain podman[245189]: 2025-12-02 09:39:57.736931207 +0000 UTC m=+0.376724744 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 09:39:57 np0005541913.localdomain podman[245189]: 2025-12-02 09:39:57.771157448 +0000 UTC m=+0.410951005 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Dec 02 09:39:57 np0005541913.localdomain podman[245189]: unhealthy
Dec 02 09:39:58 np0005541913.localdomain sudo[245313]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrqerizjhbiyveibqkruuzyrbocdpuvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668397.8243222-2661-52505736128726/AnsiballZ_file.py
Dec 02 09:39:58 np0005541913.localdomain sudo[245313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:58 np0005541913.localdomain python3.9[245315]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:39:58 np0005541913.localdomain sudo[245313]: pam_unix(sudo:session): session closed for user root
Dec 02 09:39:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:39:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:39:58 np0005541913.localdomain sudo[245423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbwwikjhbarprzhuuhvuxycqotetotjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668398.4706464-2670-137502692500547/AnsiballZ_podman_container_info.py
Dec 02 09:39:58 np0005541913.localdomain sudo[245423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:39:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:39:58 np0005541913.localdomain python3.9[245425]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 02 09:39:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21318 DF PROTO=TCP SPT=40858 DPT=9101 SEQ=906033053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E74640000000001030307) 
Dec 02 09:39:59 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:39:59.722 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:39:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:39:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:40:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:40:00 np0005541913.localdomain systemd[1]: libpod-conmon-f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.scope: Deactivated successfully.
Dec 02 09:40:00 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:40:00 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Failed with result 'exit-code'.
Dec 02 09:40:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:40:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:40:02 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:02.235 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:40:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:40:03.023 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:40:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:40:03.024 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:40:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:40:03.026 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:40:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:40:03 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:03 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:03 np0005541913.localdomain sudo[245423]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:40:03 np0005541913.localdomain podman[245440]: 2025-12-02 09:40:03.703221254 +0000 UTC m=+0.071520296 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 09:40:03 np0005541913.localdomain podman[245440]: 2025-12-02 09:40:03.718070665 +0000 UTC m=+0.086369627 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 09:40:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30214 DF PROTO=TCP SPT=47556 DPT=9102 SEQ=2105428043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E865D0000000001030307) 
Dec 02 09:40:04 np0005541913.localdomain sudo[245565]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtgqcpoxhvcxtklzkstuasknejmmxbii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668403.7745168-2678-170442377319839/AnsiballZ_podman_container_exec.py
Dec 02 09:40:04 np0005541913.localdomain sudo[245565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46022 DF PROTO=TCP SPT=32890 DPT=9105 SEQ=3657354175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E86DF0000000001030307) 
Dec 02 09:40:04 np0005541913.localdomain python3.9[245567]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:40:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.723 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.723 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.723 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.734 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.742 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.744 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.744 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 02 09:40:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:04.755 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:06 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:40:06 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:06 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.scope.
Dec 02 09:40:06 np0005541913.localdomain podman[245568]: 2025-12-02 09:40:06.41973652 +0000 UTC m=+2.191785449 container exec 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 02 09:40:06 np0005541913.localdomain podman[245568]: 2025-12-02 09:40:06.449926453 +0000 UTC m=+2.221975382 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 09:40:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:40:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30216 DF PROTO=TCP SPT=47556 DPT=9102 SEQ=2105428043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E92640000000001030307) 
Dec 02 09:40:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:07.268 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:08 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:08 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:08 np0005541913.localdomain sudo[245565]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:08 np0005541913.localdomain podman[245596]: 2025-12-02 09:40:08.750731602 +0000 UTC m=+1.884904389 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 09:40:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:08.762 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:08.763 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:40:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:08.763 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:40:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:40:08 np0005541913.localdomain podman[245596]: 2025-12-02 09:40:08.789172378 +0000 UTC m=+1.923345165 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public)
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.073 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.073 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.074 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.074 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:40:09 np0005541913.localdomain sudo[245733]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjqvjnifahpcuxebjazeeyoqjedvyuyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668408.8401701-2686-82850366419962/AnsiballZ_podman_container_exec.py
Dec 02 09:40:09 np0005541913.localdomain sudo[245733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:09 np0005541913.localdomain python3.9[245735]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.525 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.542 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.542 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.543 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.543 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.544 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.564 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.565 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.565 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.565 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.566 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:40:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:09.784 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:09 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11624 DF PROTO=TCP SPT=38682 DPT=9882 SEQ=901355716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479E9DE40000000001030307) 
Dec 02 09:40:10 np0005541913.localdomain systemd[1]: libpod-conmon-31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.scope: Deactivated successfully.
Dec 02 09:40:10 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.080 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:40:10 np0005541913.localdomain podman[245633]: 2025-12-02 09:40:10.112659249 +0000 UTC m=+1.322441725 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.146 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.146 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:40:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:10 np0005541913.localdomain systemd[1]: Started libpod-conmon-31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.scope.
Dec 02 09:40:10 np0005541913.localdomain podman[245633]: 2025-12-02 09:40:10.17723295 +0000 UTC m=+1.387015426 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:40:10 np0005541913.localdomain podman[245736]: 2025-12-02 09:40:10.180352133 +0000 UTC m=+0.837542578 container exec 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3)
Dec 02 09:40:10 np0005541913.localdomain podman[245736]: 2025-12-02 09:40:10.259901274 +0000 UTC m=+0.917091719 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute)
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.357 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.358 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12425MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.358 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.359 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.453 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.453 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.454 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.495 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.537 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.537 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.553 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.574 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_VOLUME_EXTEND,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 09:40:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:10.616 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:40:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.121 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.126 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.140 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.141 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.142 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.319 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.320 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.320 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:11.321 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:40:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:40:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:12.276 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a423cc2ecc4b4a7a413eebe91da3e0f5986adaa9cffa0acabc604ad76a95339a-merged.mount: Deactivated successfully.
Dec 02 09:40:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a423cc2ecc4b4a7a413eebe91da3e0f5986adaa9cffa0acabc604ad76a95339a-merged.mount: Deactivated successfully.
Dec 02 09:40:12 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:40:12 np0005541913.localdomain sudo[245733]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:12 np0005541913.localdomain sudo[245930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olbwgxppugmcbagnpswafzaxgpxwdgjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668412.7221508-2694-77693568215571/AnsiballZ_file.py
Dec 02 09:40:12 np0005541913.localdomain sudo[245930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16776 DF PROTO=TCP SPT=45756 DPT=9100 SEQ=2475425474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479EA9E40000000001030307)
Dec 02 09:40:13 np0005541913.localdomain python3.9[245932]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:40:13 np0005541913.localdomain sudo[245930]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:40:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:40:13 np0005541913.localdomain sudo[246040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzfvuttapdiuhfrkxyiyavccazofvror ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668413.3667855-2703-93869618292974/AnsiballZ_podman_container_info.py
Dec 02 09:40:13 np0005541913.localdomain sudo[246040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:40:13 np0005541913.localdomain systemd[1]: libpod-conmon-31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.scope: Deactivated successfully.
Dec 02 09:40:13 np0005541913.localdomain python3.9[246042]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 02 09:40:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:14.820 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.096 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.101 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97c64543-3850-4379-915d-0ce07b864a87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.097735', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e862c7e6-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': '6e6767886b6197409b9d684ce559c450bdfdc0232f292c6a248c22a1f72edea0'}]}, 'timestamp': '2025-12-02 09:40:16.101944', '_unique_id': 'af1f409faf57438ebe4b29235c9a5988'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.102 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.103 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9bf866d-eea7-4e46-9786-360dd8bb0344', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.103561', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e86312be-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': 'f5238250e52f4467a0788f1b9a0129b3ea7b28957619b31c250b1687cfc28ade'}]}, 'timestamp': '2025-12-02 09:40:16.103855', '_unique_id': '9d2780bf0e154531ac856c1866be4671'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.104 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd2b6316-a1bf-48ec-8425-15418fc4842f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.104942', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e863491e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': 'dd9ff593298bfaf9fb260dcb44c6e04fde1513fb567230f8062ca455e41ea4ae'}]}, 'timestamp': '2025-12-02 09:40:16.105221', '_unique_id': '33cbbee3ebaa4c899b1ef0360e463750'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.105 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.123 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.123 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96edc24c-7dc5-49e2-a93a-d4548f61c7cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.106388', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e86614b4-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.325430501, 'message_signature': '4bf92d36d03d09790195342ec91f4c66ab83cc547f9985814375a69fc17638fd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.106388', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8661e3c-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.325430501, 'message_signature': 'df5e31e062f77c696137a9f88a124b7a68531021a5548999064f981e8775ad32'}]}, 'timestamp': '2025-12-02 09:40:16.123756', '_unique_id': '8a992a48267341a185a05ae06fcd23cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a44e00d7-3f95-4850-b370-9c3dcfe3c4dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.125180', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e8665e7e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': '514d3ac9d64f13c2eb8787f80da66c2ca873aec3fa40785fa19ee57d3fb6477d'}]}, 'timestamp': '2025-12-02 09:40:16.125408', '_unique_id': 'd1efe562e85545f0822f3434a71c696b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.125 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20738 DF PROTO=TCP SPT=56810 DPT=9100 SEQ=2923773322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479EB6240000000001030307) 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.161 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.162 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd88d2406-d56d-44d8-9052-66149022dedf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.126529', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e86bfa50-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': '602300d5d8681d6d31f8c61beb38462cc0a07416c47d19d445e05ff19e59c029'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.126529', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e86c02f2-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': '2293ded20a1c3dd0db794b65bd2f43b7a0f869db04b46ab11a45f260ca669b16'}]}, 'timestamp': '2025-12-02 09:40:16.162379', '_unique_id': '998518aef0fb4216a024fe9da7173117'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.163 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d9a4572-049b-4627-9702-d685a64b3d74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.163745', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e86c4122-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': '371ddef3c54d131b3a0008c2cce3257b987af788c9f9779783e8c6181a78a2e4'}]}, 'timestamp': '2025-12-02 09:40:16.163981', '_unique_id': 'b00d14fdd55941ce820f0f476526a3ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1433516318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 164656012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cc66cf6-521f-480f-97ab-38bb930c7e0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1433516318, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.165015', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e86c725a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': '6521e58f056e1b3bd75fe992be77329ef3a894f4dd5d7e866dcee091d59da69b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164656012, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.165015', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e86c7b74-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': '30f1b604d3abfc29dddafa571ce1e31aa67d49dc7703c42cd15325929c71d565'}]}, 'timestamp': '2025-12-02 09:40:16.165460', '_unique_id': '6f81350ad38042bf8ebf2e2bd30b0673'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.166 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 54090000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c90aaf9-2255-477b-9409-fdfafa844f20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54090000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:40:16.166575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e87013ba-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.407323754, 'message_signature': '0c492dc7e2049c5f820ba446053fdafae2abdec62d9a413abef364886fda8077'}]}, 'timestamp': '2025-12-02 09:40:16.189243', '_unique_id': 'c93871717e77439b804abc8cb3f10aa5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.192 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.192 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5417526-1649-4ab3-ad4d-2e4afb1440a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.192229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8709e7a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.325430501, 'message_signature': 'c5a80a4ddf6235799d358b54d62bdfaedd1e60e53345d793138beb70527ac338'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.192229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e870b05e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.325430501, 'message_signature': 'e2f6180803951ed38b6045cb910bfb2c68982832711255de8733bf89dde8765c'}]}, 'timestamp': '2025-12-02 09:40:16.193139', '_unique_id': '999b854dd8c14fe6af7bc232c6e0c5e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.195 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.195 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b40b9df7-27b3-4de7-9875-18f0dafdfea4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.195579', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e87122e6-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': 'b60f7dfebdb765a7186ad8482f73b8edf736e86326607369e3c4dabed9c07ff6'}]}, 'timestamp': '2025-12-02 09:40:16.196100', '_unique_id': '6a3ecd202f66474ab8ccef1a15a1d8ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.198 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.198 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91b1cd57-3e1a-4d45-8b26-7cf8aea2a214', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.198292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8718ad8-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': 'ac4afd732c1023db29e1cc25e78fe65e872865d36a23fd795024cbfc5bae4788'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.198292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8719cbc-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': 'c9312e2ee9f4c758c296b8e65a53e84aeb5b470a8f59cb8684e43ddfdc211533'}]}, 'timestamp': '2025-12-02 09:40:16.199196', '_unique_id': 'd57560aff1644fbb8d83c0a690d9deac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.201 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.201 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-422d62b3b9907c649268e279099615c7aa0520fd45eabb2e450a911bab63aaa2-merged.mount: Deactivated successfully.
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '935de191-e837-428a-8a3f-9b54c0df9542', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.201393', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e872059e-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.325430501, 'message_signature': 'e270bfdf1bfa8835cbfcd97e8b1d9c375e094c5c161ba088fd4ee7dc2331615e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.201393', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8721b1a-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.325430501, 'message_signature': '13694a2857bdb39ab2d0cc8b237177ff957cf502b22297f9885e5b588422ed76'}]}, 'timestamp': '2025-12-02 09:40:16.202440', '_unique_id': 'c2de5fca42c44d1a85168c614dc6db78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cccd8e78-c8a5-4f27-bb77-83b688894c98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.209338', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8733acc-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': '5b56b20fd1ba8957abbd7016e559640262b842190ebe82f384ba5694f930b921'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.209338', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8734ff8-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': '25d1f75d965a7840029b84efe8e671c453d49ec5e22de36dd0f8793ddf5d4d5f'}]}, 'timestamp': '2025-12-02 09:40:16.210337', '_unique_id': '25ae27317c4c4ff4822fc4d91c97a5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4451328f-6220-41a4-b7a2-645e95b8a305', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.212717', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e873bfec-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': '180e5f5d6052f102839e75bc926080df52b20ebe529f2ca19f9008d9040169d2'}]}, 'timestamp': '2025-12-02 09:40:16.213236', '_unique_id': '11361df2551f4f8199828dee51b4bbf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0c9e76b-7969-4b80-821f-076254932a25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.215552', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e8742ec8-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': '5c5bc9f84c9328c095c4c915c6ff973c773486af2e80e2ab0c2b812b724e46ae'}]}, 'timestamp': '2025-12-02 09:40:16.216066', '_unique_id': '85bf79e3e44b4b299dc5172ccd58f0f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.218 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.219 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38d62981-e226-4eac-a5c3-a85fe93ab5a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.218507', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e874a218-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': '2a92b43fbf81abbef03e2e3c4d773fa41842e0855ae053dffd9c68cd836927a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.218507', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e874b2c6-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': 'd7587ecb6709fddea19a0b6412a5b4a417bbdbb5d539e2d39fd71b758c50f2cb'}]}, 'timestamp': '2025-12-02 09:40:16.219425', '_unique_id': 'e8bd4caa24094924ac5b03d4b0fd50e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.221 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 8783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35c7203b-8980-4edf-9d2d-62ae9c80abc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8783, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.221857', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e87523dc-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': '0790fdf3ba6e9a07fb2575ceffe206eb9d6580f06e6505dd667b58bd76d4bcd5'}]}, 'timestamp': '2025-12-02 09:40:16.222266', '_unique_id': '5cd2717b0dc54de7a046f2bde12fcd61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.224 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '088689ee-63e5-47e9-b4e9-adae6e3eefeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:40:16.224098', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'e87577e2-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.3167812, 'message_signature': 'b3744e0bb1d85d9b397f80e4cef231aec95217fc59b2ea99ad1108d7347d36d5'}]}, 'timestamp': '2025-12-02 09:40:16.224402', '_unique_id': '7b4da405cf3a4ebf9000abe9c9d60750'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da70e0d5-324e-4f2a-9aea-d482d10f1ac9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:40:16.225812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e875bbee-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.407323754, 'message_signature': 'a0809458aac3945213ed2ca9ab34d8f58f2baec3036d8ad335ef00c9007aa652'}]}, 'timestamp': '2025-12-02 09:40:16.226135', '_unique_id': '7069eb17ab6a41a381864f29b9ca2a8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.227 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 286697561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 39228582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef12aabc-4312-427e-81b0-e52d147fe759', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 286697561, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:40:16.227602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8760176-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': 'a390de5a6798fa36eb5fddf8bf7a1db7d99f59942af4e7a2b1c5a939bac0ecb9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39228582, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:40:16.227602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8760cf2-cf62-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10578.345583588, 'message_signature': '2956d344e7bc964ed4915addca0aba413c44d9f1777bf62c7e680a717d0a9f59'}]}, 'timestamp': '2025-12-02 09:40:16.228203', '_unique_id': 'fe0c8aac0e9b4c22a8702479b5b6d014'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:40:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:40:16.228 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:40:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:17.301 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:40:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:40:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:40:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:40:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:40:18 np0005541913.localdomain sudo[246040]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:18 np0005541913.localdomain podman[246054]: 2025-12-02 09:40:18.930912387 +0000 UTC m=+1.566856310 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 09:40:18 np0005541913.localdomain podman[246054]: 2025-12-02 09:40:18.951787584 +0000 UTC m=+1.587731547 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 02 09:40:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30218 DF PROTO=TCP SPT=47556 DPT=9102 SEQ=2105428043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479EC1E50000000001030307) 
Dec 02 09:40:19 np0005541913.localdomain sudo[246178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkuxodnnchznmfsdjvzjshnleahxrbyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668419.5298257-2711-203739092591151/AnsiballZ_podman_container_exec.py
Dec 02 09:40:19 np0005541913.localdomain sudo[246178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:19 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:19.825 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:20 np0005541913.localdomain python3.9[246180]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:40:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:40:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:40:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:40:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:21 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:40:21 np0005541913.localdomain systemd[1]: Started libpod-conmon-89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.scope.
Dec 02 09:40:21 np0005541913.localdomain podman[246181]: 2025-12-02 09:40:21.871105127 +0000 UTC m=+1.843232978 container exec 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:40:21 np0005541913.localdomain podman[246181]: 2025-12-02 09:40:21.900576392 +0000 UTC m=+1.872704203 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:40:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17422 DF PROTO=TCP SPT=52866 DPT=9101 SEQ=2814881998 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479ECDCF0000000001030307) 
Dec 02 09:40:22 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:22.305 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:40:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:40:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:40:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:23 np0005541913.localdomain sudo[246178]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:24 np0005541913.localdomain sudo[246316]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcxndowohhqakbpwzlzytdpexlpttucy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668424.074332-2719-252713234074427/AnsiballZ_podman_container_exec.py
Dec 02 09:40:24 np0005541913.localdomain sudo[246316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:40:24 np0005541913.localdomain python3.9[246318]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:40:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:40:24 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:24.859 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: libpod-conmon-89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.scope: Deactivated successfully.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: Started libpod-conmon-89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.scope.
Dec 02 09:40:25 np0005541913.localdomain podman[246319]: 2025-12-02 09:40:25.171697934 +0000 UTC m=+0.629658726 container exec 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:40:25 np0005541913.localdomain podman[246319]: 2025-12-02 09:40:25.204131998 +0000 UTC m=+0.662092740 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:40:25 np0005541913.localdomain podman[246332]: 2025-12-02 09:40:25.23268892 +0000 UTC m=+0.098516048 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 02 09:40:25 np0005541913.localdomain podman[246332]: 2025-12-02 09:40:25.26043418 +0000 UTC m=+0.126261308 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:40:25 np0005541913.localdomain podman[246331]: 2025-12-02 09:40:25.270778665 +0000 UTC m=+0.135957485 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:40:25 np0005541913.localdomain podman[246331]: 2025-12-02 09:40:25.280119685 +0000 UTC m=+0.145298495 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:40:25 np0005541913.localdomain podman[246331]: unhealthy
Dec 02 09:40:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17424 DF PROTO=TCP SPT=52866 DPT=9101 SEQ=2814881998 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479ED9E40000000001030307) 
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:40:25 np0005541913.localdomain systemd[1]: libpod-conmon-89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.scope: Deactivated successfully.
Dec 02 09:40:25 np0005541913.localdomain sudo[246316]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:26 np0005541913.localdomain sudo[246502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffphpfdeqtbtlrtiymnanfpgtcawwjaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668426.1042316-2727-136055866705763/AnsiballZ_file.py
Dec 02 09:40:26 np0005541913.localdomain sudo[246502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:26 np0005541913.localdomain python3.9[246504]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:40:26 np0005541913.localdomain sudo[246502]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:26 np0005541913.localdomain sudo[246612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhimfwomdanbqlyqhlvcxtlzxcvodcox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668426.6965072-2736-187173754047759/AnsiballZ_podman_container_info.py
Dec 02 09:40:26 np0005541913.localdomain sudo[246612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:27 np0005541913.localdomain python3.9[246614]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 02 09:40:27 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:27.308 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:40:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-422d62b3b9907c649268e279099615c7aa0520fd45eabb2e450a911bab63aaa2-merged.mount: Deactivated successfully.
Dec 02 09:40:28 np0005541913.localdomain sudo[246612]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:29 np0005541913.localdomain sudo[246736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxnpkgyjwdmgprqkdjvhynkpdehopwkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668429.0031796-2744-163321429680647/AnsiballZ_podman_container_exec.py
Dec 02 09:40:29 np0005541913.localdomain sudo[246736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17425 DF PROTO=TCP SPT=52866 DPT=9101 SEQ=2814881998 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479EE9A40000000001030307) 
Dec 02 09:40:29 np0005541913.localdomain python3.9[246738]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:40:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:40:29 np0005541913.localdomain systemd[1]: Started libpod-conmon-53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.scope.
Dec 02 09:40:29 np0005541913.localdomain podman[246739]: 2025-12-02 09:40:29.662964783 +0000 UTC m=+0.144114632 container exec 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:40:29 np0005541913.localdomain podman[246739]: 2025-12-02 09:40:29.694057102 +0000 UTC m=+0.175206971 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:40:29 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:29.906 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:40:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:40:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:40:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:40:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:40:31 np0005541913.localdomain sudo[246736]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:31 np0005541913.localdomain podman[246766]: 2025-12-02 09:40:31.458835808 +0000 UTC m=+1.171038089 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:40:31 np0005541913.localdomain podman[246766]: 2025-12-02 09:40:31.467904499 +0000 UTC m=+1.180106800 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:40:31 np0005541913.localdomain sudo[246892]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orkoaaihbqfuhqoatdebevjdeshfxqdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668431.507509-2752-274696897618699/AnsiballZ_podman_container_exec.py
Dec 02 09:40:31 np0005541913.localdomain sudo[246892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:31 np0005541913.localdomain python3.9[246894]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:40:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:32 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:32.311 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:40:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:40:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:40:33 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:40:33 np0005541913.localdomain systemd[1]: libpod-conmon-53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.scope: Deactivated successfully.
Dec 02 09:40:33 np0005541913.localdomain systemd[1]: Started libpod-conmon-53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.scope.
Dec 02 09:40:33 np0005541913.localdomain podman[246895]: 2025-12-02 09:40:33.234223856 +0000 UTC m=+1.310706172 container exec 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:40:33 np0005541913.localdomain podman[246895]: 2025-12-02 09:40:33.289783168 +0000 UTC m=+1.366265504 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:40:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15401 DF PROTO=TCP SPT=37240 DPT=9102 SEQ=3585207379 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479EFB8E0000000001030307) 
Dec 02 09:40:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28093 DF PROTO=TCP SPT=35492 DPT=9105 SEQ=134310109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479EFC0F0000000001030307) 
Dec 02 09:40:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:40:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:40:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336-merged.mount: Deactivated successfully.
Dec 02 09:40:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:40:34 np0005541913.localdomain sudo[246892]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:34 np0005541913.localdomain sudo[247032]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlxpxlcnnidvvtqtnvefjksnrigechoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668434.4127018-2760-151334901834897/AnsiballZ_file.py
Dec 02 09:40:34 np0005541913.localdomain sudo[247032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:40:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:40:34 np0005541913.localdomain python3.9[247034]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:40:34 np0005541913.localdomain sudo[247032]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:34 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:34.955 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:35 np0005541913.localdomain sudo[247142]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yngfhmgoefyzzisnbbotwocdsvesygbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668435.0618389-2769-99183551198332/AnsiballZ_podman_container_info.py
Dec 02 09:40:35 np0005541913.localdomain sudo[247142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:35 np0005541913.localdomain python3.9[247144]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 02 09:40:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:40:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:37 np0005541913.localdomain systemd[1]: libpod-conmon-53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.scope: Deactivated successfully.
Dec 02 09:40:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15403 DF PROTO=TCP SPT=37240 DPT=9102 SEQ=3585207379 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F07A40000000001030307) 
Dec 02 09:40:37 np0005541913.localdomain podman[247156]: 2025-12-02 09:40:37.093389184 +0000 UTC m=+0.433850206 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 09:40:37 np0005541913.localdomain podman[247156]: 2025-12-02 09:40:37.122646404 +0000 UTC m=+0.463107446 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 02 09:40:37 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:37.341 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:37 np0005541913.localdomain sudo[247174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:40:37 np0005541913.localdomain sudo[247174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:40:37 np0005541913.localdomain sudo[247174]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:40:37 np0005541913.localdomain sudo[247192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 09:40:37 np0005541913.localdomain sudo[247192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:40:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:39 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:40:39 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:39.958 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47223 DF PROTO=TCP SPT=53748 DPT=9100 SEQ=4293338437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F13A40000000001030307) 
Dec 02 09:40:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:40:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:40:41 np0005541913.localdomain sudo[247142]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:41 np0005541913.localdomain podman[247226]: 2025-12-02 09:40:41.192805057 +0000 UTC m=+0.817985547 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Dec 02 09:40:41 np0005541913.localdomain podman[247226]: 2025-12-02 09:40:41.216586501 +0000 UTC m=+0.841766941 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7)
Dec 02 09:40:41 np0005541913.localdomain sudo[247192]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:41 np0005541913.localdomain sudo[247269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:40:41 np0005541913.localdomain sudo[247269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:40:41 np0005541913.localdomain sudo[247269]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:41 np0005541913.localdomain sudo[247307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:40:41 np0005541913.localdomain sudo[247307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:40:41 np0005541913.localdomain sudo[247395]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvquxywhxyyprlqeowyrfmkzxuvrjhcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668441.3407598-2777-198787991439159/AnsiballZ_podman_container_exec.py
Dec 02 09:40:41 np0005541913.localdomain sudo[247395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:41 np0005541913.localdomain python3.9[247397]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:40:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:40:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:42 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:42.357 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:42 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:40:42 np0005541913.localdomain systemd[1]: Started libpod-conmon-6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.scope.
Dec 02 09:40:42 np0005541913.localdomain podman[247409]: 2025-12-02 09:40:42.449273742 +0000 UTC m=+0.654055296 container exec 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 09:40:42 np0005541913.localdomain podman[247409]: 2025-12-02 09:40:42.480859305 +0000 UTC m=+0.685640819 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Dec 02 09:40:42 np0005541913.localdomain sudo[247307]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:40:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:43 np0005541913.localdomain sudo[247466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:40:43 np0005541913.localdomain sudo[247466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:40:43 np0005541913.localdomain sudo[247466]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42711 DF PROTO=TCP SPT=42896 DPT=9882 SEQ=2305601042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F1FA50000000001030307) 
Dec 02 09:40:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:43 np0005541913.localdomain sudo[247395]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:43 np0005541913.localdomain podman[247457]: 2025-12-02 09:40:43.291063983 +0000 UTC m=+0.431426582 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:40:43 np0005541913.localdomain podman[247457]: 2025-12-02 09:40:43.304041859 +0000 UTC m=+0.444404508 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:40:43 np0005541913.localdomain sudo[247601]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhzatpizefrsfoicytkezgctfltzjzhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668443.3801658-2785-268002008041212/AnsiballZ_podman_container_exec.py
Dec 02 09:40:43 np0005541913.localdomain systemd[1]: libpod-conmon-6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.scope: Deactivated successfully.
Dec 02 09:40:43 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:40:43 np0005541913.localdomain sudo[247601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:43 np0005541913.localdomain python3.9[247603]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 09:40:43 np0005541913.localdomain systemd[1]: Started libpod-conmon-6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.scope.
Dec 02 09:40:43 np0005541913.localdomain podman[247606]: 2025-12-02 09:40:43.977834571 +0000 UTC m=+0.118341115 container exec 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 
'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 09:40:44 np0005541913.localdomain podman[247606]: 2025-12-02 09:40:44.007539363 +0000 UTC m=+0.148045937 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:40:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:40:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ffb2f622809786e8c5fa4ed1e24ed2fc8ffb08acf1d089c8dce2a99369c6a336-merged.mount: Deactivated successfully.
Dec 02 09:40:44 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:44.995 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-977574eb305bf7cb5c1e2fdd973144cbfe7638acd584e0968bf15c31ee49846c-merged.mount: Deactivated successfully.
Dec 02 09:40:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47225 DF PROTO=TCP SPT=53748 DPT=9100 SEQ=4293338437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F2B640000000001030307) 
Dec 02 09:40:46 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:46 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:46 np0005541913.localdomain podman[240799]: time="2025-12-02T09:40:46Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28/merged: invalid argument"
Dec 02 09:40:46 np0005541913.localdomain podman[240799]: time="2025-12-02T09:40:46Z" level=error msg="Getting root fs size for \"bab4f14ce85ea50a857b87bac282c7122719c952a0f3e60419c5d8fdba42705d\": getting diffsize of layer \"e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28\" and its parent \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\": creating overlay mount to /var/lib/containers/storage/overlay/e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/7ZIUCCDB7QIVI2FKGMJG2V3343:/var/lib/containers/storage/overlay/l/SVA2C5DVPQAVJLYG7FZGJBLHH5:/var/lib/containers/storage/overlay/l/S22KQKMHBS35Z24VPC7PIA3U37:/var/lib/containers/storage/overlay/l/QKXJKO4MENC2JONUN57ZG7NBIN,upperdir=/var/lib/containers/storage/overlay/e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28/diff,workdir=/var/lib/containers/storage/overlay/e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28/work,nodev,metacopy=on\": no such file or directory"
Dec 02 09:40:46 np0005541913.localdomain sudo[247601]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:46 np0005541913.localdomain sudo[247746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwpkwigkkrukmgfsdvwwqudlirenlljv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668446.4949732-2793-145402868198828/AnsiballZ_file.py
Dec 02 09:40:46 np0005541913.localdomain sudo[247746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:40:46 np0005541913.localdomain python3.9[247748]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:40:46 np0005541913.localdomain sudo[247746]: pam_unix(sudo:session): session closed for user root
Dec 02 09:40:47 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:47.361 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8c52eb2917af814f67bf9757f04611b4867e02cd94735e31ef932542a90a8de8-merged.mount: Deactivated successfully.
Dec 02 09:40:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8c52eb2917af814f67bf9757f04611b4867e02cd94735e31ef932542a90a8de8-merged.mount: Deactivated successfully.
Dec 02 09:40:49 np0005541913.localdomain systemd[1]: libpod-conmon-6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.scope: Deactivated successfully.
Dec 02 09:40:49 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28097 DF PROTO=TCP SPT=35492 DPT=9105 SEQ=134310109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F37E40000000001030307) 
Dec 02 09:40:50 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:50.032 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:40:51 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:40:51 np0005541913.localdomain podman[247766]: 2025-12-02 09:40:51.99222299 +0000 UTC m=+0.060241318 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:40:51 np0005541913.localdomain podman[247766]: 2025-12-02 09:40:51.998868967 +0000 UTC m=+0.066887295 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 02 09:40:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46761 DF PROTO=TCP SPT=58480 DPT=9101 SEQ=1329028917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F43000000000001030307) 
Dec 02 09:40:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:52 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:52.406 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:52 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:52 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:40:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:53 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:53 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:55 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:55.070 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46763 DF PROTO=TCP SPT=58480 DPT=9101 SEQ=1329028917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F4F250000000001030307) 
Dec 02 09:40:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:40:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:40:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:40:56 np0005541913.localdomain podman[247785]: 2025-12-02 09:40:56.046050747 +0000 UTC m=+0.077548068 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:40:56 np0005541913.localdomain podman[247785]: 2025-12-02 09:40:56.101792003 +0000 UTC m=+0.133289324 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 09:40:56 np0005541913.localdomain podman[247784]: 2025-12-02 09:40:56.106448427 +0000 UTC m=+0.140888167 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:40:56 np0005541913.localdomain podman[247784]: 2025-12-02 09:40:56.189314086 +0000 UTC m=+0.223753826 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:40:56 np0005541913.localdomain podman[247784]: unhealthy
Dec 02 09:40:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-977574eb305bf7cb5c1e2fdd973144cbfe7638acd584e0968bf15c31ee49846c-merged.mount: Deactivated successfully.
Dec 02 09:40:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:40:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:40:56 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:56 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:40:56 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:56 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:40:56 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:40:56 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:40:57 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:40:57.443 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:40:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:40:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:40:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46764 DF PROTO=TCP SPT=58480 DPT=9101 SEQ=1329028917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F5EE40000000001030307) 
Dec 02 09:40:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:40:59 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:00 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:00.104 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ae00fa9a5dde399a6e7e6528b55d78146560e04771ac7245c19ea55518318121-merged.mount: Deactivated successfully.
Dec 02 09:41:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:41:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:41:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:02 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:02.448 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:02 np0005541913.localdomain podman[240799]: time="2025-12-02T09:41:02Z" level=error msg="Getting root fs size for \"c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 02 09:41:02 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:02 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:41:03.024 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:41:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:41:03.025 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:41:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:41:03.025 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:41:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:41:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:03 np0005541913.localdomain podman[247827]: 2025-12-02 09:41:03.481545783 +0000 UTC m=+0.150494553 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:41:03 np0005541913.localdomain podman[247827]: 2025-12-02 09:41:03.490243664 +0000 UTC m=+0.159192444 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 02 09:41:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12764 DF PROTO=TCP SPT=40932 DPT=9102 SEQ=631520543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F70BE0000000001030307) 
Dec 02 09:41:03 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:41:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5093 DF PROTO=TCP SPT=49690 DPT=9105 SEQ=2103289598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F713E0000000001030307) 
Dec 02 09:41:04 np0005541913.localdomain systemd[1]: tmp-crun.iTCRaW.mount: Deactivated successfully.
Dec 02 09:41:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:41:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eee6dae47ff617871c47add2aa57f33c2f7e68905855055afb3a7b04648ecacd-merged.mount: Deactivated successfully.
Dec 02 09:41:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 02 09:41:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 02 09:41:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 02 09:41:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:05.158 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:41:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 02 09:41:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 02 09:41:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 02 09:41:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:41:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8c52eb2917af814f67bf9757f04611b4867e02cd94735e31ef932542a90a8de8-merged.mount: Deactivated successfully.
Dec 02 09:41:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:06.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:06.722 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:41:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12766 DF PROTO=TCP SPT=40932 DPT=9102 SEQ=631520543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F7CE50000000001030307) 
Dec 02 09:41:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 02 09:41:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:07.486 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:08.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:08.739 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:41:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:08.740 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:41:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:08.740 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:41:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:08.741 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:41:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:08.741 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.155 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:41:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:41:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.233 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.234 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:41:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:41:09 np0005541913.localdomain podman[247867]: 2025-12-02 09:41:09.312828694 +0000 UTC m=+0.093108954 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 09:41:09 np0005541913.localdomain podman[247867]: 2025-12-02 09:41:09.316892262 +0000 UTC m=+0.097172452 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:41:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.430 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.431 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12174MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.431 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.431 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.492 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.492 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.493 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.537 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:41:09 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.945 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.950 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.965 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.967 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:41:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:09.968 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:41:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35670 DF PROTO=TCP SPT=37638 DPT=9100 SEQ=103877685 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F88A40000000001030307) 
Dec 02 09:41:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:10.195 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:10.969 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:10.970 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:41:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:10.970 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:41:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:41:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:41:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:12.079 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:41:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:12.079 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:41:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:12.080 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:41:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:12.080 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:41:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:41:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:41:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:41:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:41:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:41:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:12.489 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:12 np0005541913.localdomain podman[247907]: 2025-12-02 09:41:12.52222168 +0000 UTC m=+0.099870393 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Dec 02 09:41:12 np0005541913.localdomain podman[247907]: 2025-12-02 09:41:12.564016444 +0000 UTC m=+0.141665187 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, container_name=openstack_network_exporter, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64)
Dec 02 09:41:13 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:41:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55416 DF PROTO=TCP SPT=36386 DPT=9882 SEQ=555176884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479F94E40000000001030307) 
Dec 02 09:41:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:41:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:41:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:41:13 np0005541913.localdomain podman[247927]: 2025-12-02 09:41:13.843273867 +0000 UTC m=+0.085254433 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:41:13 np0005541913.localdomain podman[247927]: 2025-12-02 09:41:13.879447572 +0000 UTC m=+0.121428098 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:41:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.050 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.068 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.068 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.069 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.069 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.070 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.070 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.070 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.239 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:41:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7-merged.mount: Deactivated successfully.
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.819 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:15.819 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:41:15 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:41:15 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:16 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35672 DF PROTO=TCP SPT=37638 DPT=9100 SEQ=103877685 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FA0640000000001030307) 
Dec 02 09:41:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:17.526 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ae00fa9a5dde399a6e7e6528b55d78146560e04771ac7245c19ea55518318121-merged.mount: Deactivated successfully.
Dec 02 09:41:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:18 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5097 DF PROTO=TCP SPT=49690 DPT=9105 SEQ=2103289598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FADE40000000001030307) 
Dec 02 09:41:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:41:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:41:20 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:20.274 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:21 np0005541913.localdomain podman[240799]: time="2025-12-02T09:41:21Z" level=error msg="Getting root fs size for \"c88e11ade7cb42412d755bbc83b537b85df2401b0966bb2fdc0b1069a90bbe52\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 02 09:41:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:22 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50484 DF PROTO=TCP SPT=47528 DPT=9101 SEQ=3420111108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FB8300000000001030307) 
Dec 02 09:41:22 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:22.559 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:22 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 02 09:41:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eee6dae47ff617871c47add2aa57f33c2f7e68905855055afb3a7b04648ecacd-merged.mount: Deactivated successfully.
Dec 02 09:41:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:41:22 np0005541913.localdomain podman[247949]: 2025-12-02 09:41:22.788824899 +0000 UTC m=+0.089013115 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:41:22 np0005541913.localdomain podman[247949]: 2025-12-02 09:41:22.831897597 +0000 UTC m=+0.132085863 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 09:41:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:23 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:41:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 02 09:41:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 02 09:41:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:41:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:41:25 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50486 DF PROTO=TCP SPT=47528 DPT=9101 SEQ=3420111108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FC4240000000001030307) 
Dec 02 09:41:25 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:25.314 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:41:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:41:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:41:26 np0005541913.localdomain systemd[1]: tmp-crun.mliDXl.mount: Deactivated successfully.
Dec 02 09:41:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a-merged.mount: Deactivated successfully.
Dec 02 09:41:26 np0005541913.localdomain podman[247969]: 2025-12-02 09:41:26.940895995 +0000 UTC m=+0.099720559 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:41:27 np0005541913.localdomain podman[247968]: 2025-12-02 09:41:27.048312479 +0000 UTC m=+0.212068265 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:41:27 np0005541913.localdomain podman[247968]: 2025-12-02 09:41:27.055715236 +0000 UTC m=+0.219471062 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:41:27 np0005541913.localdomain podman[247968]: unhealthy
Dec 02 09:41:27 np0005541913.localdomain podman[247969]: 2025-12-02 09:41:27.065297282 +0000 UTC m=+0.224121876 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 09:41:27 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:27.596 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:41:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:41:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:41:27 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:41:27 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:41:27 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:41:29 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50487 DF PROTO=TCP SPT=47528 DPT=9101 SEQ=3420111108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FD3E40000000001030307) 
Dec 02 09:41:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:41:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:41:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:41:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 02 09:41:30 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:30.363 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 02 09:41:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:41:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:41:30 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 02 09:41:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:41:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:41:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:41:32 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:32.645 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55632 DF PROTO=TCP SPT=41432 DPT=9102 SEQ=281541048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FE5EE0000000001030307) 
Dec 02 09:41:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45438 DF PROTO=TCP SPT=46792 DPT=9105 SEQ=1240696681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FE66E0000000001030307) 
Dec 02 09:41:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:41:34 np0005541913.localdomain podman[248016]: 2025-12-02 09:41:34.471187928 +0000 UTC m=+0.111265226 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:41:34 np0005541913.localdomain podman[248016]: 2025-12-02 09:41:34.509072989 +0000 UTC m=+0.149150257 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec 02 09:41:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 02 09:41:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7-merged.mount: Deactivated successfully.
Dec 02 09:41:35 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:35.402 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 02 09:41:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e7aea19432089756ed62f0f30cfa5a3f11dba2345bf487cdfbd5c2a4914be89-merged.mount: Deactivated successfully.
Dec 02 09:41:35 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e7aea19432089756ed62f0f30cfa5a3f11dba2345bf487cdfbd5c2a4914be89-merged.mount: Deactivated successfully.
Dec 02 09:41:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55634 DF PROTO=TCP SPT=41432 DPT=9102 SEQ=281541048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FF1E40000000001030307) 
Dec 02 09:41:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:37 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:41:37 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:37.689 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:41:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:41:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:39 np0005541913.localdomain podman[248035]: 2025-12-02 09:41:39.876936266 +0000 UTC m=+0.082091659 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:41:39 np0005541913.localdomain podman[248035]: 2025-12-02 09:41:39.88796895 +0000 UTC m=+0.093124333 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:41:40 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:40 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42714 DF PROTO=TCP SPT=42896 DPT=9882 SEQ=2305601042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A479FFDE40000000001030307) 
Dec 02 09:41:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:41:40 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:40.433 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:41 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:41 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:41 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:41:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:42 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:42.730 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:41:43 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52065 DF PROTO=TCP SPT=51744 DPT=9882 SEQ=488510271 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A009E50000000001030307) 
Dec 02 09:41:43 np0005541913.localdomain sudo[248053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:41:43 np0005541913.localdomain sudo[248053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:41:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:41:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:41:43 np0005541913.localdomain sudo[248053]: pam_unix(sudo:session): session closed for user root
Dec 02 09:41:43 np0005541913.localdomain sudo[248072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:41:43 np0005541913.localdomain sudo[248072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:41:43 np0005541913.localdomain systemd[1]: tmp-crun.RYBYhx.mount: Deactivated successfully.
Dec 02 09:41:43 np0005541913.localdomain podman[248070]: 2025-12-02 09:41:43.379794305 +0000 UTC m=+0.078991256 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64)
Dec 02 09:41:43 np0005541913.localdomain podman[248070]: 2025-12-02 09:41:43.428156985 +0000 UTC m=+0.127353956 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:41:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:44 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:41:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf7a5c1891d67f42ad2a4e32b105c4405edd8c48c080a09b863da0e9425a915a-merged.mount: Deactivated successfully.
Dec 02 09:41:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:45 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:45.479 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:45 np0005541913.localdomain sudo[248072]: pam_unix(sudo:session): session closed for user root
Dec 02 09:41:46 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2576 DF PROTO=TCP SPT=50592 DPT=9100 SEQ=123306017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A015A40000000001030307) 
Dec 02 09:41:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:41:46 np0005541913.localdomain systemd[1]: tmp-crun.AUOUMw.mount: Deactivated successfully.
Dec 02 09:41:46 np0005541913.localdomain podman[248140]: 2025-12-02 09:41:46.446251902 +0000 UTC m=+0.086944769 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:41:46 np0005541913.localdomain podman[248140]: 2025-12-02 09:41:46.45143851 +0000 UTC m=+0.092131397 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:41:46 np0005541913.localdomain sudo[248162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:41:46 np0005541913.localdomain sudo[248162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:41:46 np0005541913.localdomain sudo[248162]: pam_unix(sudo:session): session closed for user root
Dec 02 09:41:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:41:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 02 09:41:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 02 09:41:47 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:47.774 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:41:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-368ab18e82291a20f6c3548bd942730bbcde8a3da90171ac2014bcabec91a7fe-merged.mount: Deactivated successfully.
Dec 02 09:41:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-368ab18e82291a20f6c3548bd942730bbcde8a3da90171ac2014bcabec91a7fe-merged.mount: Deactivated successfully.
Dec 02 09:41:48 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:41:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45442 DF PROTO=TCP SPT=46792 DPT=9105 SEQ=1240696681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A021E40000000001030307) 
Dec 02 09:41:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:41:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:41:50 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:50.527 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:51 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:51 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:52 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44665 DF PROTO=TCP SPT=38862 DPT=9101 SEQ=1946886759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A02D600000000001030307) 
Dec 02 09:41:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:52 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:52 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:52 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:52.804 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:41:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:41:53 np0005541913.localdomain podman[248180]: 2025-12-02 09:41:53.96094717 +0000 UTC m=+0.101049095 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 09:41:53 np0005541913.localdomain podman[248180]: 2025-12-02 09:41:53.992244474 +0000 UTC m=+0.132346389 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 09:41:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 02 09:41:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e7aea19432089756ed62f0f30cfa5a3f11dba2345bf487cdfbd5c2a4914be89-merged.mount: Deactivated successfully.
Dec 02 09:41:54 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:54 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:41:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e7aea19432089756ed62f0f30cfa5a3f11dba2345bf487cdfbd5c2a4914be89-merged.mount: Deactivated successfully.
Dec 02 09:41:54 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:41:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:41:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:55 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44667 DF PROTO=TCP SPT=38862 DPT=9101 SEQ=1946886759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A039640000000001030307) 
Dec 02 09:41:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:41:55 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:55.568 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:55 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:55 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:56 np0005541913.localdomain sudo[248289]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqhcongfrqytijnpthmeffizblqruihu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668516.4436855-3039-280604679376593/AnsiballZ_file.py
Dec 02 09:41:56 np0005541913.localdomain sudo[248289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:41:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:41:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:41:56 np0005541913.localdomain python3.9[248291]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:41:56 np0005541913.localdomain sudo[248289]: pam_unix(sudo:session): session closed for user root
Dec 02 09:41:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:41:57 np0005541913.localdomain sudo[248399]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzcekidnwzyegrextlzqvjfdllzcfnhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668517.2440658-3066-74882703854168/AnsiballZ_stat.py
Dec 02 09:41:57 np0005541913.localdomain sudo[248399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:41:57 np0005541913.localdomain python3.9[248401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:41:57 np0005541913.localdomain sudo[248399]: pam_unix(sudo:session): session closed for user root
Dec 02 09:41:57 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:41:57.835 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:41:57 np0005541913.localdomain sudo[248487]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-faizxdxakvssfkpcabqqdbslrpvnvwer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668517.2440658-3066-74882703854168/AnsiballZ_copy.py
Dec 02 09:41:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:41:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:41:58 np0005541913.localdomain sudo[248487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:41:58 np0005541913.localdomain podman[248489]: 2025-12-02 09:41:58.07937093 +0000 UTC m=+0.070884172 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:41:58 np0005541913.localdomain systemd[1]: tmp-crun.3kp974.mount: Deactivated successfully.
Dec 02 09:41:58 np0005541913.localdomain podman[248489]: 2025-12-02 09:41:58.114373542 +0000 UTC m=+0.105886844 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:41:58 np0005541913.localdomain podman[248489]: unhealthy
Dec 02 09:41:58 np0005541913.localdomain podman[248490]: 2025-12-02 09:41:58.119665853 +0000 UTC m=+0.100703085 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:41:58 np0005541913.localdomain python3.9[248497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668517.2440658-3066-74882703854168/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:41:58 np0005541913.localdomain podman[248490]: 2025-12-02 09:41:58.200119678 +0000 UTC m=+0.181156930 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:41:58 np0005541913.localdomain sudo[248487]: pam_unix(sudo:session): session closed for user root
Dec 02 09:41:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:41:58 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:58 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:58 np0005541913.localdomain sudo[248641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtbvmxhyerlactmkpmmcgmsiqpbsqiae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668518.5386915-3114-14664410649471/AnsiballZ_file.py
Dec 02 09:41:58 np0005541913.localdomain sudo[248641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:41:58 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:41:58 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 09:41:58 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Failed with result 'exit-code'.
Dec 02 09:41:58 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:58 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:41:58 np0005541913.localdomain python3.9[248643]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:41:59 np0005541913.localdomain sudo[248641]: pam_unix(sudo:session): session closed for user root
Dec 02 09:41:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 02 09:41:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f-merged.mount: Deactivated successfully.
Dec 02 09:41:59 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44668 DF PROTO=TCP SPT=38862 DPT=9101 SEQ=1946886759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A049250000000001030307) 
Dec 02 09:41:59 np0005541913.localdomain sudo[248751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rthcgshiteislwkunkxbqgaapnnrcgkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668519.177874-3138-134684129774799/AnsiballZ_stat.py
Dec 02 09:41:59 np0005541913.localdomain sudo[248751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:41:59 np0005541913.localdomain python3.9[248753]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:41:59 np0005541913.localdomain sudo[248751]: pam_unix(sudo:session): session closed for user root
Dec 02 09:41:59 np0005541913.localdomain sudo[248808]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etahzzwshdsqydahzdrxixdatudtkfqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668519.177874-3138-134684129774799/AnsiballZ_file.py
Dec 02 09:41:59 np0005541913.localdomain sudo[248808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:00 np0005541913.localdomain python3.9[248810]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:00 np0005541913.localdomain sudo[248808]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:42:00 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:00.574 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:00 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:00 np0005541913.localdomain sudo[248918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kztagnuugjwnunklvlswocngrgcsusge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668520.3864377-3174-141768058734514/AnsiballZ_stat.py
Dec 02 09:42:00 np0005541913.localdomain sudo[248918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:00 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:00 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:00 np0005541913.localdomain python3.9[248920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:00 np0005541913.localdomain sudo[248918]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:42:01 np0005541913.localdomain sudo[248975]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tonxpgqriqgutjireudbtgztexlykhcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668520.3864377-3174-141768058734514/AnsiballZ_file.py
Dec 02 09:42:01 np0005541913.localdomain sudo[248975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:01 np0005541913.localdomain python3.9[248977]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.m_ocy26o recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:01 np0005541913.localdomain sudo[248975]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:01 np0005541913.localdomain sudo[249085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxmzxyyibjzggkhzoerlitytonqtyigp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668521.6055622-3210-77857768088233/AnsiballZ_stat.py
Dec 02 09:42:01 np0005541913.localdomain sudo[249085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:01 np0005541913.localdomain python3.9[249087]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:02 np0005541913.localdomain sudo[249085]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:42:02 np0005541913.localdomain sudo[249142]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-burotgnpqeqqoepcfdpnyxrsidsgichm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668521.6055622-3210-77857768088233/AnsiballZ_file.py
Dec 02 09:42:02 np0005541913.localdomain sudo[249142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:02 np0005541913.localdomain python3.9[249144]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:02 np0005541913.localdomain sudo[249142]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:42:02 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:02.875 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:03 np0005541913.localdomain sudo[249252]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oooxddattxaxtzufoaqworqvpdyevjmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668522.7383313-3249-80902607377547/AnsiballZ_command.py
Dec 02 09:42:03 np0005541913.localdomain sudo[249252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:42:03.025 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:42:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:42:03.025 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:42:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:42:03.026 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:42:03 np0005541913.localdomain python3.9[249254]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:42:03 np0005541913.localdomain sudo[249252]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 02 09:42:03 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:03 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:03 np0005541913.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:03 np0005541913.localdomain sudo[249363]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oytyhwxcekjigfegyulrrnmeqcltojag ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668523.4579866-3273-156083429078550/AnsiballZ_edpm_nftables_from_files.py
Dec 02 09:42:03 np0005541913.localdomain sudo[249363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10449 DF PROTO=TCP SPT=53162 DPT=9102 SEQ=2311503365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A05B1D0000000001030307) 
Dec 02 09:42:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21872 DF PROTO=TCP SPT=34742 DPT=9105 SEQ=3849536290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A05B9E0000000001030307) 
Dec 02 09:42:04 np0005541913.localdomain python3[249365]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 09:42:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-368ab18e82291a20f6c3548bd942730bbcde8a3da90171ac2014bcabec91a7fe-merged.mount: Deactivated successfully.
Dec 02 09:42:04 np0005541913.localdomain sudo[249363]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:04 np0005541913.localdomain sudo[249473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtkfmdxowhegvrmetwgnolsoysmptyue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668524.363224-3297-85291337046473/AnsiballZ_stat.py
Dec 02 09:42:04 np0005541913.localdomain sudo[249473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:04 np0005541913.localdomain python3.9[249475]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:04 np0005541913.localdomain sudo[249473]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:05 np0005541913.localdomain sudo[249530]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orsmqmhdswxjbabenklrjxuiqgxlrdmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668524.363224-3297-85291337046473/AnsiballZ_file.py
Dec 02 09:42:05 np0005541913.localdomain sudo[249530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:05 np0005541913.localdomain python3.9[249532]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:05 np0005541913.localdomain sudo[249530]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:05.611 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:05 np0005541913.localdomain sudo[249640]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aholsmvhagmbupeiqhxomqbhnbmxrart ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668525.474293-3333-228883873859064/AnsiballZ_stat.py
Dec 02 09:42:05 np0005541913.localdomain sudo[249640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:06 np0005541913.localdomain python3.9[249642]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:06 np0005541913.localdomain sudo[249640]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:06 np0005541913.localdomain sudo[249697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbblqnfaueumpuvtwgvpyamjltyjnapr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668525.474293-3333-228883873859064/AnsiballZ_file.py
Dec 02 09:42:06 np0005541913.localdomain sudo[249697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:42:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:42:06 np0005541913.localdomain python3.9[249699]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:06 np0005541913.localdomain sudo[249697]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-46603caa88f65e015c74097f596e48b006fc6fd2b23d7cf444ca3fcae1abca86-merged.mount: Deactivated successfully.
Dec 02 09:42:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:42:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10451 DF PROTO=TCP SPT=53162 DPT=9102 SEQ=2311503365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A067240000000001030307) 
Dec 02 09:42:07 np0005541913.localdomain sudo[249807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibueebphysbjffkalvhlkbxyigfosyyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668526.889752-3369-87233363654793/AnsiballZ_stat.py
Dec 02 09:42:07 np0005541913.localdomain sudo[249807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:07 np0005541913.localdomain python3.9[249809]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:07 np0005541913.localdomain sudo[249807]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:07 np0005541913.localdomain sudo[249864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpyiigzyqdllwyzkhosrnszaqliypzzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668526.889752-3369-87233363654793/AnsiballZ_file.py
Dec 02 09:42:07 np0005541913.localdomain sudo[249864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:42:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:42:07 np0005541913.localdomain podman[249867]: 2025-12-02 09:42:07.627235448 +0000 UTC m=+0.109520071 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 09:42:07 np0005541913.localdomain podman[249867]: 2025-12-02 09:42:07.659321322 +0000 UTC m=+0.141605945 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 09:42:07 np0005541913.localdomain python3.9[249866]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:07 np0005541913.localdomain sudo[249864]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:07.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:07.723 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:42:07 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:07.879 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:08 np0005541913.localdomain sudo[249993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmzfjuislqvczvjwsgezxwrrbyffllqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668528.0423605-3405-256086048785674/AnsiballZ_stat.py
Dec 02 09:42:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:42:08 np0005541913.localdomain sudo[249993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:08 np0005541913.localdomain python3.9[249995]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:08 np0005541913.localdomain sudo[249993]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:42:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:08.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:08.741 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:42:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:08.742 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:42:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:08.742 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:42:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:08.743 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:42:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:08.743 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:42:08 np0005541913.localdomain sudo[250050]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqixifjrogoehykdrxbqrqsocqckmnvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668528.0423605-3405-256086048785674/AnsiballZ_file.py
Dec 02 09:42:08 np0005541913.localdomain sudo[250050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 02 09:42:08 np0005541913.localdomain python3.9[250053]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:08 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:42:08 np0005541913.localdomain sudo[250050]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.182 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.249 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.249 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.378 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.379 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12391MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.380 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.380 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.446 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.446 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.446 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:42:09 np0005541913.localdomain sudo[250182]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebulefeyuohngjebanclcoqqcbjsnbvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668529.191199-3441-105084003476070/AnsiballZ_stat.py
Dec 02 09:42:09 np0005541913.localdomain sudo[250182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.480 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:42:09 np0005541913.localdomain python3.9[250184]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:09 np0005541913.localdomain sudo[250182]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:09 np0005541913.localdomain sudo[250292]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqgavcttnnuwjzmiaxzskiuktyfgugib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668529.191199-3441-105084003476070/AnsiballZ_copy.py
Dec 02 09:42:09 np0005541913.localdomain sudo[250292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.940 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.950 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:42:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.971 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.974 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:42:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:09.974 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:42:10 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5611 DF PROTO=TCP SPT=58138 DPT=9100 SEQ=2600033628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A073250000000001030307) 
Dec 02 09:42:10 np0005541913.localdomain python3.9[250296]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764668529.191199-3441-105084003476070/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:10 np0005541913.localdomain sudo[250292]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:42:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:10.686 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:42:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 02 09:42:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:10 np0005541913.localdomain sudo[250404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzvcdqlhklhwxmknpoodfoiwstyioiep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668530.6055076-3486-110081388469975/AnsiballZ_file.py
Dec 02 09:42:10 np0005541913.localdomain sudo[250404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:11 np0005541913.localdomain python3.9[250406]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:42:11 np0005541913.localdomain sudo[250404]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:11 np0005541913.localdomain podman[250407]: 2025-12-02 09:42:11.205802365 +0000 UTC m=+0.085467209 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 09:42:11 np0005541913.localdomain podman[250407]: 2025-12-02 09:42:11.236293898 +0000 UTC m=+0.115958722 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:42:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:42:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:11 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:11 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:11 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:42:11 np0005541913.localdomain sudo[250531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzokuewwqsjrlphudgvcwoktrtzgxeab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668531.3078353-3510-8105422032282/AnsiballZ_command.py
Dec 02 09:42:11 np0005541913.localdomain sudo[250531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:11 np0005541913.localdomain python3.9[250533]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:42:11 np0005541913.localdomain sudo[250531]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:11.974 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:11.974 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:42:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:11.974 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:42:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:42:12 np0005541913.localdomain sudo[250644]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oevglfsnwznfuobvdzdqefbqtubtdpow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668532.0796387-3534-206410370656627/AnsiballZ_blockinfile.py
Dec 02 09:42:12 np0005541913.localdomain sudo[250644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:12 np0005541913.localdomain python3.9[250646]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:12 np0005541913.localdomain sudo[250644]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:12.925 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:42:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:13.079 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:42:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:13.079 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:42:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:13.080 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:42:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:13.080 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:42:13 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9438 DF PROTO=TCP SPT=51182 DPT=9882 SEQ=4228104018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A07F240000000001030307) 
Dec 02 09:42:13 np0005541913.localdomain sudo[250754]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otjcrctpbynprxdvyrsjrvcizwfvnmdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668533.076659-3561-43925402554603/AnsiballZ_command.py
Dec 02 09:42:13 np0005541913.localdomain sudo[250754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:13 np0005541913.localdomain python3.9[250756]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:42:13 np0005541913.localdomain sudo[250754]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:13 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:13 np0005541913.localdomain sudo[250865]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssjejgjutfwxegdnwvzfskasuqsnclbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668533.739314-3585-247462725141845/AnsiballZ_stat.py
Dec 02 09:42:13 np0005541913.localdomain sudo[250865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:14 np0005541913.localdomain python3.9[250867]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:42:14 np0005541913.localdomain sudo[250865]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:14.309 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:42:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 02 09:42:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-52524ff35057981b78caabbdad0990997b49d052172da58f45f8887febb8205f-merged.mount: Deactivated successfully.
Dec 02 09:42:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:14.332 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:42:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:14.333 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:42:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:14.333 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:14.333 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:14.334 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:14.334 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:14.334 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:14 np0005541913.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:14 np0005541913.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 02 09:42:14 np0005541913.localdomain sudo[250977]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdveslvfywogkixnbwripnukmplpesdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668534.4047058-3609-238122318277917/AnsiballZ_command.py
Dec 02 09:42:14 np0005541913.localdomain sudo[250977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:14 np0005541913.localdomain python3.9[250979]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:42:14 np0005541913.localdomain sudo[250977]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:14 np0005541913.localdomain podman[240799]: time="2025-12-02T09:42:14Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\""
Dec 02 09:42:14 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:36:37 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1"
Dec 02 09:42:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:15.076 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:42:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:42:15 np0005541913.localdomain podman[251024]: 2025-12-02 09:42:15.169389476 +0000 UTC m=+0.061600754 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 09:42:15 np0005541913.localdomain podman[251024]: 2025-12-02 09:42:15.177188877 +0000 UTC m=+0.069400175 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:42:15 np0005541913.localdomain sudo[251107]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzuztzwxfdhlnvguyhmkmrkwjbydvost ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668535.0756853-3633-43107748206183/AnsiballZ_file.py
Dec 02 09:42:15 np0005541913.localdomain sudo[251107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:42:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5908dabcdc4beecd14375872c1a5b4a4e28c3db557b9e42f64a01ed422f93ce2-merged.mount: Deactivated successfully.
Dec 02 09:42:15 np0005541913.localdomain python3.9[251109]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:15 np0005541913.localdomain sudo[251107]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:15.692 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:42:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:42:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:42:16 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:42:16 np0005541913.localdomain sshd[243505]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:42:16 np0005541913.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Dec 02 09:42:16 np0005541913.localdomain systemd[1]: session-57.scope: Consumed 28.180s CPU time.
Dec 02 09:42:16 np0005541913.localdomain systemd-logind[757]: Session 57 logged out. Waiting for processes to exit.
Dec 02 09:42:16 np0005541913.localdomain systemd-logind[757]: Removed session 57.
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.097 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.100 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '662126a7-5d9d-4a2d-a282-f26e9fa618ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.097712', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2fe9229a-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': '47cd4ea4cf56188cd1d783bdac21b9463dbaa9e70f269b6bf51020cf0af587d4'}]}, 'timestamp': '2025-12-02 09:42:16.100932', '_unique_id': 'a3225624e9924604a2f3b12b79dd9164'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.101 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.102 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0473c1cf-a747-4c5c-bd6b-70f7b5ee9eed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.102772', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2fe97f2e-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': '914224f000476b24a5622e4a837b3d9e551f33f0a3d0c5e7eb9776960997a981'}]}, 'timestamp': '2025-12-02 09:42:16.103004', '_unique_id': '6c605bdb1ba748f0b3700e3c888f71ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.103 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.104 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.119 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4339ffc1-0fc6-45f6-951e-d4a75be71998', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:42:16.104302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2fec06fe-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.338142136, 'message_signature': '2cd697ad71075f60723e4337c821ab00e8130ee4ad160f4227d3bb886716f02c'}]}, 'timestamp': '2025-12-02 09:42:16.119661', '_unique_id': 'de9e40c3361f4d6380c36638aac72722'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87c61dc6-e2a1-4fa2-813b-6bc18772ef5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.121128', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2fec4c2c-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': 'c723f8ebaa7608dee24e22931a26f455f681e29ea93c9c29f5f62f8ae89712f9'}]}, 'timestamp': '2025-12-02 09:42:16.121355', '_unique_id': '73748ce8f1844688a7c22a360d26214c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.122 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e27844a0-ba99-4399-a8ac-ef2e81cc460e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.122420', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2fec7e68-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': '8eeab8ba0e5ebe3c384f7e49aed4580a53f4e6cabdca29ad0a59eb3767a60d22'}]}, 'timestamp': '2025-12-02 09:42:16.122661', '_unique_id': 'c26cde8e223e4c2daefa1de8fabe61c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8eef70ed-b0f0-4c33-8255-ca96fe6d4d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.123716', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2fee4608-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.342764131, 'message_signature': 'e2ab2f2a1d99ebd474d43cfc33de795b85711cf21f943004d94237c8ad8029d5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.123716', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2fee4dc4-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.342764131, 'message_signature': '43c1cf0a3222a90888034beba55435259764ca310e77e421d3ad3f046745a41e'}]}, 'timestamp': '2025-12-02 09:42:16.134489', '_unique_id': '23abf80e831c43af9a68067d7b4df691'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.134 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.135 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.135 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19adb394-3cc3-4a0e-adef-0fb652e81763', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.135537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2fee7f42-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.342764131, 'message_signature': 'b82e939c136ea4f907f181850a6ec7781662da8e9f6fc3d228ef456d1e5b4f12'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.135537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2fee86fe-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.342764131, 'message_signature': '50e284ce6e7833fb22032802aeeadc5ed7ea9d0400731f4747f81bb398a51773'}]}, 'timestamp': '2025-12-02 09:42:16.135954', '_unique_id': 'be50e2aa170447cab2c4ff05668b037c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.136 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '945dad8d-629f-4083-b187-fd4226649f4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.136978', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2feeb70a-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': '5137efc2b1cba442d59a79f56d38d7514eef11a093f9980c22c2955972c7c4e5'}]}, 'timestamp': '2025-12-02 09:42:16.137197', '_unique_id': 'd708f31f8bf44044817a64e6f723ef22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.137 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b91f2736-bfe3-45af-a25e-5f950a528ed3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.138212', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2feee7a2-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': '3c4363dc1bad8093ac9920a12f5e62448a0185979741fb98eb512ca01f517e95'}]}, 'timestamp': '2025-12-02 09:42:16.138488', '_unique_id': 'ff5eb72d04014d6891a99ad0f44d8f8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.165 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.165 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aedcdb06-5898-4a23-ae50-b38fc5bd215d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.139482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2ff31610-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '8553379cfccfb174ca159ad07aceef7fe91da95a2290d2c4237954043dae67bc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.139482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2ff32182-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': 'b3c79a0a2bb8cf584ed275d91f92cc970c8180e690c5558c5025493a52d09c2d'}]}, 'timestamp': '2025-12-02 09:42:16.166164', '_unique_id': '5bcf42aa81b240d783bbc83a952bb753'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.166 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.167 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.167 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1433516318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 164656012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8ec9643-7a43-456b-a758-0aa788e8779f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1433516318, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.167740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2ff36a7a-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '8dd59b4b49beb885adef88a71ae97df055619c4fdb92d83ff6c7bf6f7b4d896a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164656012, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.167740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2ff374f2-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '6567f7e54e27a820224c482e7edda0e9a1b9054e841ccf3eb0f619fde92967e9'}]}, 'timestamp': '2025-12-02 09:42:16.168297', '_unique_id': '41553269850a41b4a083f55bf43130c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.169 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.169 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 286697561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 39228582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2c88578-8b90-4255-804e-fdb3a75c60c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 286697561, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.169709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2ff3b732-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '7e09637f1796e04a5422cd65640a36e57441136bcbd0ead9edfbf03664061c3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39228582, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.169709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2ff3c3ee-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '93edea1b2c0820b9f77946dc81dc2210eb2ca72f70141103421eff8f1b5ace60'}]}, 'timestamp': '2025-12-02 09:42:16.170345', '_unique_id': '13f644761f8e41c8a1a018b6b4448d9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.171 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.171 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b9d5927-297e-4f9c-be89-5780e53efd3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.171772', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2ff40836-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': '3cd3fc3b8e18f06aeee83c96704b359cefebc70be5b9cb49cf0dd5d3158a7f9c'}]}, 'timestamp': '2025-12-02 09:42:16.172086', '_unique_id': '24106592774a4e439163670feb82a535'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.173 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.173 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.173 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '380a9d94-0115-44c4-ac7c-15555f8b8238', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.173531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2ff44da0-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.342764131, 'message_signature': '34c4308384d012afacd627b3aeca47974ba5fb49c0823dfd4ed119db87b9bb22'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.173531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2ff45818-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.342764131, 'message_signature': '1bf2509332cf388121def70629dd885f456a5ff54ef4bf577917dbdd03ae8085'}]}, 'timestamp': '2025-12-02 09:42:16.174110', '_unique_id': '6ed241197e3c45b78cbd5ee16e585f26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.177 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f931611-7615-4978-9029-94037a7b61a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.177692', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2ff4f00c-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': 'b238ee74e23695e488624a9d08b581d0ebb144f585ac107cdca8f86bc93b19f8'}]}, 'timestamp': '2025-12-02 09:42:16.178017', '_unique_id': '763069284ba340eaafb8e1c15ba8b629'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.179 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 8783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddd877c9-62b3-4a25-a8e5-a7ee446de404', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8783, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.179556', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2ff545c0-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': 'd9d940808d65e776de670bcfaad4a44bce7f5e572acbcea9b2275dd5f8f9b6cb'}]}, 'timestamp': '2025-12-02 09:42:16.180218', '_unique_id': '147b5346a53b498f984a8c0dd152e0b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.181 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.182 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32c1d443-2033-421a-8da7-b471b1eb8457', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.181783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2ff58ec2-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': 'cadad9d582fd318b9e4de6634140ba85820f39d3af613b5db34d040ed2ecb68b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.181783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2ff59930-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '0883cebe2b0e29ec3a51dc46f3955befa01c650edfe48357c6a7e0085e886817'}]}, 'timestamp': '2025-12-02 09:42:16.182354', '_unique_id': 'cfa8c7ea554d49eb941d9dfbe035178a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.183 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.184 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5118dfb4-f2d6-40e2-ac5b-1ddbc4633de1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.183836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2ff5df26-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '83e48d4ccdf920ff8f21387d300e8f097c8e14d1e6443bc144b2589e72e1c98b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.183836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2ff5e99e-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '5a7f5874471839ac0e3c136fb6a369f753e1a2066c0781361ac115315827b497'}]}, 'timestamp': '2025-12-02 09:42:16.184397', '_unique_id': 'ce05ea6b80d9495d86626a5c2f8294d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.185 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.186 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 55300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b93558d1-3adc-4b1a-aac6-a196c5b8112a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55300000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:42:16.186068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2ff63782-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.338142136, 'message_signature': '98118a103c089d69955c99dc337e96c3a28de323a7241f0efd8790284563e7b6'}]}, 'timestamp': '2025-12-02 09:42:16.186404', '_unique_id': 'c16e97adafe04b89b07d73c83bb5c05a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.187 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97dc583c-4ce8-459c-a6c1-d4350fa5f3db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:42:16.187816', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2ff67a30-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': '4720c1bbb556d77cb54a3db21a314bc6eef8017751dd2f68ccb4e7925be96e7d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:42:16.187816', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2ff68462-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.358523337, 'message_signature': 'b8465e75a75c1d8417c3b2b06c8227e2836433c1cfc116a163f93f8ce2d99012'}]}, 'timestamp': '2025-12-02 09:42:16.188367', '_unique_id': 'cdb43719f5cf4df78146cb8729ebfb74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.189 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73a79a0e-6929-4333-a092-2f182fd977f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:42:16.189870', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2ff6ccc4-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10698.316756569, 'message_signature': '7b929a099eef5b579f601c025339a5660f29b84b44004688d6ebefdffdf0c4bf'}]}, 'timestamp': '2025-12-02 09:42:16.190222', '_unique_id': '5d5f94b4216349af9875e2930536ece4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:42:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:42:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:42:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:42:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 02 09:42:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:17.929 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:42:18 np0005541913.localdomain podman[251127]: 2025-12-02 09:42:18.678379422 +0000 UTC m=+0.066387884 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:42:18 np0005541913.localdomain podman[251127]: 2025-12-02 09:42:18.716089349 +0000 UTC m=+0.104097801 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:42:19 np0005541913.localdomain systemd[1]: tmp-crun.obRAAT.mount: Deactivated successfully.
Dec 02 09:42:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 02 09:42:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:19 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:42:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10453 DF PROTO=TCP SPT=53162 DPT=9102 SEQ=2311503365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A097E50000000001030307) 
Dec 02 09:42:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:20 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:20.694 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 02 09:42:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-46603caa88f65e015c74097f596e48b006fc6fd2b23d7cf444ca3fcae1abca86-merged.mount: Deactivated successfully.
Dec 02 09:42:22 np0005541913.localdomain sshd[251149]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:42:22 np0005541913.localdomain sshd[251149]: Accepted publickey for zuul from 192.168.122.30 port 35816 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:42:22 np0005541913.localdomain systemd-logind[757]: New session 58 of user zuul.
Dec 02 09:42:22 np0005541913.localdomain systemd[1]: Started Session 58 of User zuul.
Dec 02 09:42:22 np0005541913.localdomain sshd[251149]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:42:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:42:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:42:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:42:22 np0005541913.localdomain sudo[251260]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zitiejijzgibpfvvhaopmepldwesonma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668542.2711194-27-4664893037082/AnsiballZ_file.py
Dec 02 09:42:22 np0005541913.localdomain sudo[251260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:42:22 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:22.933 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:22 np0005541913.localdomain python3.9[251262]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:23 np0005541913.localdomain sudo[251260]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 02 09:42:23 np0005541913.localdomain sudo[251370]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtfzxyojobhcemeebvtgcnlqfbjgdizi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668543.1142666-27-204351413634848/AnsiballZ_file.py
Dec 02 09:42:23 np0005541913.localdomain sudo[251370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:23 np0005541913.localdomain python3.9[251372]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:23 np0005541913.localdomain sudo[251370]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:24 np0005541913.localdomain sudo[251480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlvixvmfkxcbmmntwquicqyvijxlcdyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668543.7350845-27-6558568459720/AnsiballZ_file.py
Dec 02 09:42:24 np0005541913.localdomain sudo[251480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 02 09:42:24 np0005541913.localdomain python3.9[251482]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:24 np0005541913.localdomain sudo[251480]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:42:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 02 09:42:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 02 09:42:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:42:24 np0005541913.localdomain python3.9[251602]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 02 09:42:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 02 09:42:25 np0005541913.localdomain python3.9[251688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668544.4092004-105-89479007587118/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:25 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:25.696 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 02 09:42:25 np0005541913.localdomain podman[251547]: 2025-12-02 09:42:25.829456305 +0000 UTC m=+1.157139901 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:42:25 np0005541913.localdomain podman[251547]: 2025-12-02 09:42:25.841776008 +0000 UTC m=+1.169459614 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 09:42:26 np0005541913.localdomain python3.9[251802]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:42:26 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:42:26 np0005541913.localdomain python3.9[251888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668545.8289428-150-280103273693308/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:27 np0005541913.localdomain python3.9[251996]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 02 09:42:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 02 09:42:27 np0005541913.localdomain python3.9[252082]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668546.9516094-150-59311323058681/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:27 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:27.973 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:28 np0005541913.localdomain python3.9[252190]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:28 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:36:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 142938 "" "Go-http-client/1.1"
Dec 02 09:42:28 np0005541913.localdomain podman_exporter[241003]: ts=2025-12-02T09:42:28.512Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 02 09:42:28 np0005541913.localdomain podman_exporter[241003]: ts=2025-12-02T09:42:28.513Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 02 09:42:28 np0005541913.localdomain podman_exporter[241003]: ts=2025-12-02T09:42:28.513Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Dec 02 09:42:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 02 09:42:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5908dabcdc4beecd14375872c1a5b4a4e28c3db557b9e42f64a01ed422f93ce2-merged.mount: Deactivated successfully.
Dec 02 09:42:28 np0005541913.localdomain python3.9[252276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668547.9985857-150-179697789803673/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=3dc6b2706ee653838c49e5c5e70381c0a171cc0a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:42:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:42:29 np0005541913.localdomain podman[252294]: 2025-12-02 09:42:29.454595526 +0000 UTC m=+0.093760073 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:42:29 np0005541913.localdomain podman[252294]: 2025-12-02 09:42:29.466055795 +0000 UTC m=+0.105220332 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:42:29 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:42:29 np0005541913.localdomain podman[252295]: 2025-12-02 09:42:29.559439086 +0000 UTC m=+0.196581898 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:42:29 np0005541913.localdomain podman[252295]: 2025-12-02 09:42:29.629053136 +0000 UTC m=+0.266195938 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 02 09:42:29 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:42:30 np0005541913.localdomain python3.9[252432]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:30 np0005541913.localdomain python3.9[252518]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668549.6909568-324-97965707517879/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=d6e803f833d8b5f768d3a3c0112defa742aeec55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:30 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:30.730 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:31 np0005541913.localdomain python3.9[252626]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:42:31 np0005541913.localdomain sudo[252736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycipfsegpahykfzjuvhcovmdefazqreo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668551.4424734-396-149678107873537/AnsiballZ_file.py
Dec 02 09:42:31 np0005541913.localdomain sudo[252736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:31 np0005541913.localdomain python3.9[252738]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:31 np0005541913.localdomain sudo[252736]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:32 np0005541913.localdomain sudo[252846]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zovbqhzadhmxptbdhuveyhlhaqsfriuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668552.0627894-420-164223320761644/AnsiballZ_stat.py
Dec 02 09:42:32 np0005541913.localdomain sudo[252846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:32 np0005541913.localdomain python3.9[252848]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:32 np0005541913.localdomain sudo[252846]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:32 np0005541913.localdomain sudo[252903]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmxhwwkikrjuhzqbseyaukbrjjycdrmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668552.0627894-420-164223320761644/AnsiballZ_file.py
Dec 02 09:42:32 np0005541913.localdomain sudo[252903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:32 np0005541913.localdomain python3.9[252905]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:32 np0005541913.localdomain sudo[252903]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:33 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:33.022 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:33 np0005541913.localdomain sudo[253013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfenwqdvrbodhqxivvgukfhdbmcqpwmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668553.1526868-420-219200672503100/AnsiballZ_stat.py
Dec 02 09:42:33 np0005541913.localdomain sudo[253013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:33 np0005541913.localdomain python3.9[253015]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:33 np0005541913.localdomain sudo[253013]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:33 np0005541913.localdomain sudo[253070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjggbexwivugumcelcefmloymebkzkky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668553.1526868-420-219200672503100/AnsiballZ_file.py
Dec 02 09:42:33 np0005541913.localdomain sudo[253070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59699 DF PROTO=TCP SPT=43938 DPT=9102 SEQ=553238560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A0D04D0000000001030307) 
Dec 02 09:42:34 np0005541913.localdomain python3.9[253072]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:34 np0005541913.localdomain sudo[253070]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:42:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:42:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:42:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:42:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:42:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:42:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:42:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:42:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:42:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:42:34 np0005541913.localdomain sudo[253184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbgaanbjdpspijktnnaezyvbnhwfvfxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668554.2359722-489-63035833860254/AnsiballZ_file.py
Dec 02 09:42:34 np0005541913.localdomain sudo[253184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:34 np0005541913.localdomain python3.9[253186]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:34 np0005541913.localdomain sudo[253184]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59700 DF PROTO=TCP SPT=43938 DPT=9102 SEQ=553238560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A0D4640000000001030307) 
Dec 02 09:42:35 np0005541913.localdomain sudo[253294]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbjhristcuuwpwslspcknrtszcqazxcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668554.8756235-513-101346148051619/AnsiballZ_stat.py
Dec 02 09:42:35 np0005541913.localdomain sudo[253294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:35 np0005541913.localdomain python3.9[253296]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:35 np0005541913.localdomain sudo[253294]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:35 np0005541913.localdomain sudo[253351]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzhbzjuzxqvroknlbcwytsbctcvhmuxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668554.8756235-513-101346148051619/AnsiballZ_file.py
Dec 02 09:42:35 np0005541913.localdomain sudo[253351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:35 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:35.753 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:35 np0005541913.localdomain python3.9[253353]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:35 np0005541913.localdomain sudo[253351]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10454 DF PROTO=TCP SPT=53162 DPT=9102 SEQ=2311503365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A0D7E50000000001030307) 
Dec 02 09:42:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:42:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:42:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:42:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144562 "" "Go-http-client/1.1"
Dec 02 09:42:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:42:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16340 "" "Go-http-client/1.1"
Dec 02 09:42:36 np0005541913.localdomain sudo[253462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgrakfaslyxhfryadjczjhrkjlxkgznr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668555.9788065-549-188683645242290/AnsiballZ_stat.py
Dec 02 09:42:36 np0005541913.localdomain sudo[253462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:36 np0005541913.localdomain python3.9[253464]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:36 np0005541913.localdomain sudo[253462]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:36 np0005541913.localdomain sudo[253519]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwfgkkejdihjategpykowvdgvdwsnoao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668555.9788065-549-188683645242290/AnsiballZ_file.py
Dec 02 09:42:36 np0005541913.localdomain sudo[253519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:36 np0005541913.localdomain python3.9[253521]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:36 np0005541913.localdomain sudo[253519]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59701 DF PROTO=TCP SPT=43938 DPT=9102 SEQ=553238560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A0DC650000000001030307) 
Dec 02 09:42:37 np0005541913.localdomain sudo[253629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlzoqsdfjjxvogifjwlvoyrdvcfbmlxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668557.1115916-585-66656676991627/AnsiballZ_systemd.py
Dec 02 09:42:37 np0005541913.localdomain sudo[253629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55638 DF PROTO=TCP SPT=41432 DPT=9102 SEQ=281541048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A0DFE40000000001030307) 
Dec 02 09:42:37 np0005541913.localdomain python3.9[253631]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:42:37 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:42:38 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:38.065 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:38 np0005541913.localdomain systemd-rc-local-generator[253658]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:42:38 np0005541913.localdomain systemd-sysv-generator[253662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:38 np0005541913.localdomain sudo[253629]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:38 np0005541913.localdomain auditd[710]: Audit daemon rotating log files
Dec 02 09:42:38 np0005541913.localdomain sudo[253777]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqqdkvpcwtftoxnidkdqjwslbubotlzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668558.583511-609-192428274837279/AnsiballZ_stat.py
Dec 02 09:42:38 np0005541913.localdomain sudo[253777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:42:39 np0005541913.localdomain python3.9[253779]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:39 np0005541913.localdomain sudo[253777]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:39 np0005541913.localdomain podman[253780]: 2025-12-02 09:42:39.191268494 +0000 UTC m=+0.097366899 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:42:39 np0005541913.localdomain podman[253780]: 2025-12-02 09:42:39.202171078 +0000 UTC m=+0.108269503 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute)
Dec 02 09:42:39 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:42:39 np0005541913.localdomain sudo[253853]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzsykvxlumyhqzgotrqfxzsuysonixgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668558.583511-609-192428274837279/AnsiballZ_file.py
Dec 02 09:42:39 np0005541913.localdomain sudo[253853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:39 np0005541913.localdomain python3.9[253855]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:39 np0005541913.localdomain sudo[253853]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:39 np0005541913.localdomain sudo[253963]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obzrokqycffukfwxnaclvtulbqldbjkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668559.702088-645-73783847052602/AnsiballZ_stat.py
Dec 02 09:42:39 np0005541913.localdomain sudo[253963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:40 np0005541913.localdomain python3.9[253965]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:40 np0005541913.localdomain sudo[253963]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:40 np0005541913.localdomain sudo[254020]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkjogxopwqambybbhfxxnyibbllblugb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668559.702088-645-73783847052602/AnsiballZ_file.py
Dec 02 09:42:40 np0005541913.localdomain sudo[254020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:40 np0005541913.localdomain python3.9[254022]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:40 np0005541913.localdomain sudo[254020]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:40 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:40.756 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59702 DF PROTO=TCP SPT=43938 DPT=9102 SEQ=553238560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A0EC240000000001030307) 
Dec 02 09:42:41 np0005541913.localdomain sudo[254130]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfbmoqcyoixjtbdycqitoitlsxoxsouq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668560.8687537-681-215721431748273/AnsiballZ_systemd.py
Dec 02 09:42:41 np0005541913.localdomain sudo[254130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:41 np0005541913.localdomain python3.9[254132]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:42:41 np0005541913.localdomain systemd-sysv-generator[254158]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:42:41 np0005541913.localdomain systemd-rc-local-generator[254154]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 09:42:41 np0005541913.localdomain podman[254170]: 2025-12-02 09:42:41.940817266 +0000 UTC m=+0.097482933 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:42:41 np0005541913.localdomain sudo[254130]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:41 np0005541913.localdomain podman[254170]: 2025-12-02 09:42:41.97505268 +0000 UTC m=+0.131718367 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 02 09:42:41 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:42:42 np0005541913.localdomain sudo[254298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swfliuvtcoonzqoknyttgwdjvwgowzyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668562.3200388-711-277345688306552/AnsiballZ_file.py
Dec 02 09:42:42 np0005541913.localdomain sudo[254298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:42 np0005541913.localdomain python3.9[254300]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:42:42 np0005541913.localdomain sudo[254298]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:43 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:43.097 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:43 np0005541913.localdomain sudo[254408]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oinmfigtiywnrkhvisrjxwgvaohgdhfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668563.0380995-735-158280516950574/AnsiballZ_stat.py
Dec 02 09:42:43 np0005541913.localdomain sudo[254408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:43 np0005541913.localdomain python3.9[254410]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:42:43 np0005541913.localdomain sudo[254408]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:43 np0005541913.localdomain sudo[254496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixtcxegfdrwjrxluerajsbfewxwwtdta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668563.0380995-735-158280516950574/AnsiballZ_copy.py
Dec 02 09:42:43 np0005541913.localdomain sudo[254496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:44 np0005541913.localdomain python3.9[254498]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668563.0380995-735-158280516950574/.source.json _original_basename=.60ynxj4z follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:44 np0005541913.localdomain sudo[254496]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:44 np0005541913.localdomain sudo[254606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlzfgvdcofoqrjbxmksvovimumncdkex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668564.2289755-780-125541626024100/AnsiballZ_file.py
Dec 02 09:42:44 np0005541913.localdomain sudo[254606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:44 np0005541913.localdomain python3.9[254608]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:44 np0005541913.localdomain sudo[254606]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:45 np0005541913.localdomain sudo[254716]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jftvitmqzeannyihmwhatgasfdxikzqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668564.9169545-804-42809388411833/AnsiballZ_stat.py
Dec 02 09:42:45 np0005541913.localdomain sudo[254716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:45 np0005541913.localdomain sudo[254716]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:45 np0005541913.localdomain sudo[254804]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krwbqksfjwhnhiaovdratroylnzkahrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668564.9169545-804-42809388411833/AnsiballZ_copy.py
Dec 02 09:42:45 np0005541913.localdomain sudo[254804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:45 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:45.759 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:45 np0005541913.localdomain sudo[254804]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:42:46 np0005541913.localdomain podman[254862]: 2025-12-02 09:42:46.453070767 +0000 UTC m=+0.086595349 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc.)
Dec 02 09:42:46 np0005541913.localdomain podman[254862]: 2025-12-02 09:42:46.467864816 +0000 UTC m=+0.101389388 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Dec 02 09:42:46 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:42:46 np0005541913.localdomain sudo[254934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcojyrtjvfmtrpqiykrdldcplyejyvsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668566.2205207-855-140573874227428/AnsiballZ_container_config_data.py
Dec 02 09:42:46 np0005541913.localdomain sudo[254934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:46 np0005541913.localdomain python3.9[254936]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Dec 02 09:42:46 np0005541913.localdomain sudo[254934]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:46 np0005541913.localdomain sudo[254937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:42:46 np0005541913.localdomain sudo[254937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:42:46 np0005541913.localdomain sudo[254937]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:46 np0005541913.localdomain sudo[254972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:42:46 np0005541913.localdomain sudo[254972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:42:47 np0005541913.localdomain sudo[255119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxlubdtlfruqpwcwjokgxaisrgamsbsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668567.0813308-882-246791298541646/AnsiballZ_container_config_hash.py
Dec 02 09:42:47 np0005541913.localdomain sudo[255119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:47 np0005541913.localdomain python3.9[255123]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:42:47 np0005541913.localdomain sudo[255119]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:47 np0005541913.localdomain podman[255156]: 2025-12-02 09:42:47.79628364 +0000 UTC m=+0.095248882 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, release=1763362218, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:42:47 np0005541913.localdomain podman[255156]: 2025-12-02 09:42:47.91329161 +0000 UTC m=+0.212256852 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.buildah.version=1.41.4, version=7, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, release=1763362218, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph)
Dec 02 09:42:48 np0005541913.localdomain sudo[254972]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:48 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:48.129 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:48 np0005541913.localdomain sudo[255278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:42:48 np0005541913.localdomain sudo[255278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:42:48 np0005541913.localdomain sudo[255278]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:48 np0005541913.localdomain sudo[255312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:42:48 np0005541913.localdomain sudo[255312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:42:48 np0005541913.localdomain sudo[255366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbatitugcixnmmzupkqkchljqanbcsmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668568.0042512-909-136688834048129/AnsiballZ_podman_container_info.py
Dec 02 09:42:48 np0005541913.localdomain sudo[255366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:48 np0005541913.localdomain python3.9[255368]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 09:42:48 np0005541913.localdomain sudo[255366]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:48 np0005541913.localdomain sudo[255312]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59703 DF PROTO=TCP SPT=43938 DPT=9102 SEQ=553238560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A10BE40000000001030307) 
Dec 02 09:42:49 np0005541913.localdomain sudo[255445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:42:49 np0005541913.localdomain sudo[255445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:42:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:42:49 np0005541913.localdomain sudo[255445]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:49 np0005541913.localdomain podman[255463]: 2025-12-02 09:42:49.7167356 +0000 UTC m=+0.122313174 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:42:49 np0005541913.localdomain podman[255463]: 2025-12-02 09:42:49.730157141 +0000 UTC m=+0.135734705 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:42:49 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:42:50 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:50.762 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:52 np0005541913.localdomain sudo[255578]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgnqfkwzxfibbnfiwkwrhzcapygrwhkr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668572.1895082-948-160641392221777/AnsiballZ_edpm_container_manage.py
Dec 02 09:42:52 np0005541913.localdomain sudo[255578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:52 np0005541913.localdomain python3[255580]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:42:53 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:53.161 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:53 np0005541913.localdomain podman[255619]: 
Dec 02 09:42:53 np0005541913.localdomain podman[255619]: 2025-12-02 09:42:53.21785406 +0000 UTC m=+0.099092026 container create a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 09:42:53 np0005541913.localdomain podman[255619]: 2025-12-02 09:42:53.174531991 +0000 UTC m=+0.055769997 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 02 09:42:53 np0005541913.localdomain python3[255580]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 02 09:42:53 np0005541913.localdomain sudo[255578]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:53 np0005541913.localdomain sudo[255762]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvfdqpgyrzejgasuttklarrlrvhpotpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668573.573642-972-73563366713223/AnsiballZ_stat.py
Dec 02 09:42:53 np0005541913.localdomain sudo[255762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:54 np0005541913.localdomain python3.9[255764]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:42:54 np0005541913.localdomain sudo[255762]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:54 np0005541913.localdomain sudo[255874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lewkdudvfnvezfivxgcosafbmnihxenq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668574.3329287-999-103504100307873/AnsiballZ_file.py
Dec 02 09:42:54 np0005541913.localdomain sudo[255874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:54 np0005541913.localdomain python3.9[255876]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:54 np0005541913.localdomain sudo[255874]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:55 np0005541913.localdomain sudo[255929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyuacccukzfzyhpuzbvcdcvwlaopjdtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668574.3329287-999-103504100307873/AnsiballZ_stat.py
Dec 02 09:42:55 np0005541913.localdomain sudo[255929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:55 np0005541913.localdomain python3.9[255931]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:42:55 np0005541913.localdomain sudo[255929]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:55 np0005541913.localdomain sudo[256038]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzgpofyqzaxsiomofjbcthhczdrffcnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668575.308752-999-133693698447539/AnsiballZ_copy.py
Dec 02 09:42:55 np0005541913.localdomain sudo[256038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:55 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:55.764 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:55 np0005541913.localdomain python3.9[256040]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668575.308752-999-133693698447539/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:42:55 np0005541913.localdomain sudo[256038]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:56 np0005541913.localdomain sudo[256093]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxrlvadycvtcjepuhqeyruaggiamknxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668575.308752-999-133693698447539/AnsiballZ_systemd.py
Dec 02 09:42:56 np0005541913.localdomain sudo[256093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:56 np0005541913.localdomain python3.9[256095]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:42:56 np0005541913.localdomain systemd-sysv-generator[256120]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:42:56 np0005541913.localdomain systemd-rc-local-generator[256116]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:42:56 np0005541913.localdomain sudo[256093]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:56 np0005541913.localdomain podman[256132]: 2025-12-02 09:42:56.866921108 +0000 UTC m=+0.071754629 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 09:42:56 np0005541913.localdomain podman[256132]: 2025-12-02 09:42:56.907081052 +0000 UTC m=+0.111914653 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 09:42:56 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:42:57 np0005541913.localdomain sudo[256203]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaaxvmksawuidywanilpgrvekyqtozuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668575.308752-999-133693698447539/AnsiballZ_systemd.py
Dec 02 09:42:57 np0005541913.localdomain sudo[256203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:57 np0005541913.localdomain python3.9[256205]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:42:57 np0005541913.localdomain systemd-sysv-generator[256236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:42:57 np0005541913.localdomain systemd-rc-local-generator[256230]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: Starting neutron_sriov_agent container...
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: tmp-crun.IOqn2V.mount: Deactivated successfully.
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:42:57 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a95121ccaf3925a63af3516bd9487ef1392fcae7d1282cfa94fd6d6ef1726a7/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 02 09:42:57 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a95121ccaf3925a63af3516bd9487ef1392fcae7d1282cfa94fd6d6ef1726a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 09:42:57 np0005541913.localdomain podman[256247]: 2025-12-02 09:42:57.957807499 +0000 UTC m=+0.137628376 container init a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_sriov_agent, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 09:42:57 np0005541913.localdomain podman[256247]: 2025-12-02 09:42:57.968598841 +0000 UTC m=+0.148419708 container start a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 09:42:57 np0005541913.localdomain podman[256247]: neutron_sriov_agent
Dec 02 09:42:57 np0005541913.localdomain neutron_sriov_agent[256261]: + sudo -E kolla_set_configs
Dec 02 09:42:57 np0005541913.localdomain systemd[1]: Started neutron_sriov_agent container.
Dec 02 09:42:58 np0005541913.localdomain sudo[256203]: pam_unix(sudo:session): session closed for user root
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Validating config file
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Copying service configuration files
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Writing out command to execute
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/595e1c9b-709c-41d2-9212-0b18b13291a8.pid.haproxy
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/595e1c9b-709c-41d2-9212-0b18b13291a8.conf
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: ++ cat /run_command
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: + CMD=/usr/bin/neutron-sriov-nic-agent
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: + ARGS=
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: + sudo kolla_copy_cacerts
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: + [[ ! -n '' ]]
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: + . kolla_extend_start
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: + umask 0022
Dec 02 09:42:58 np0005541913.localdomain neutron_sriov_agent[256261]: + exec /usr/bin/neutron-sriov-nic-agent
Dec 02 09:42:58 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:42:58.185 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:42:59 np0005541913.localdomain sudo[256383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tukbdcslzwcidpkdlqoxlvuvxtqywprp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668579.0960987-1083-28942992262641/AnsiballZ_systemd.py
Dec 02 09:42:59 np0005541913.localdomain sudo[256383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:42:59 np0005541913.localdomain python3.9[256385]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:42:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:42:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.741 2 INFO neutron.common.config [-] Logging enabled!
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.741 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.742 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.742 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.742 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.742 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.742 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005541913.localdomain'}
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.743 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-cd411476-013c-42ff-b2fb-c5de8758bbea - - - - - -] RPC agent_id: nic-switch-agent.np0005541913.localdomain
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.749 2 INFO neutron.agent.agent_extensions_manager [None req-cd411476-013c-42ff-b2fb-c5de8758bbea - - - - - -] Loaded agent extensions: ['qos']
Dec 02 09:42:59 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:42:59.749 2 INFO neutron.agent.agent_extensions_manager [None req-cd411476-013c-42ff-b2fb-c5de8758bbea - - - - - -] Initializing agent extension 'qos'
Dec 02 09:42:59 np0005541913.localdomain podman[256387]: 2025-12-02 09:42:59.807681912 +0000 UTC m=+0.085821988 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:42:59 np0005541913.localdomain podman[256387]: 2025-12-02 09:42:59.817036104 +0000 UTC m=+0.095176281 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:42:59 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:42:59 np0005541913.localdomain podman[256388]: 2025-12-02 09:42:59.897135267 +0000 UTC m=+0.177841292 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 09:42:59 np0005541913.localdomain podman[256388]: 2025-12-02 09:42:59.963066457 +0000 UTC m=+0.243772432 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller)
Dec 02 09:42:59 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:43:00 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:43:00.040 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-cd411476-013c-42ff-b2fb-c5de8758bbea - - - - - -] Agent initialized successfully, now running... 
Dec 02 09:43:00 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:43:00.041 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-cd411476-013c-42ff-b2fb-c5de8758bbea - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Dec 02 09:43:00 np0005541913.localdomain neutron_sriov_agent[256261]: 2025-12-02 09:43:00.041 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-cd411476-013c-42ff-b2fb-c5de8758bbea - - - - - -] Agent out of sync with plugin!
Dec 02 09:43:00 np0005541913.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Dec 02 09:43:00 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:00.770 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:00 np0005541913.localdomain systemd[1]: libpod-a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5.scope: Deactivated successfully.
Dec 02 09:43:00 np0005541913.localdomain systemd[1]: libpod-a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5.scope: Consumed 1.863s CPU time.
Dec 02 09:43:00 np0005541913.localdomain podman[256439]: 2025-12-02 09:43:00.817713711 +0000 UTC m=+0.062148379 container died a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:43:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5-userdata-shm.mount: Deactivated successfully.
Dec 02 09:43:00 np0005541913.localdomain podman[256439]: 2025-12-02 09:43:00.870792784 +0000 UTC m=+0.115227412 container cleanup a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, container_name=neutron_sriov_agent, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 09:43:00 np0005541913.localdomain podman[256439]: neutron_sriov_agent
Dec 02 09:43:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0a95121ccaf3925a63af3516bd9487ef1392fcae7d1282cfa94fd6d6ef1726a7-merged.mount: Deactivated successfully.
Dec 02 09:43:00 np0005541913.localdomain podman[256465]: 2025-12-02 09:43:00.956315143 +0000 UTC m=+0.057369600 container cleanup a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 09:43:00 np0005541913.localdomain podman[256465]: neutron_sriov_agent
Dec 02 09:43:00 np0005541913.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Dec 02 09:43:00 np0005541913.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Dec 02 09:43:00 np0005541913.localdomain systemd[1]: Starting neutron_sriov_agent container...
Dec 02 09:43:01 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:43:01 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a95121ccaf3925a63af3516bd9487ef1392fcae7d1282cfa94fd6d6ef1726a7/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 02 09:43:01 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a95121ccaf3925a63af3516bd9487ef1392fcae7d1282cfa94fd6d6ef1726a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 09:43:01 np0005541913.localdomain podman[256479]: 2025-12-02 09:43:01.097598947 +0000 UTC m=+0.109908358 container init a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=neutron_sriov_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 09:43:01 np0005541913.localdomain podman[256479]: 2025-12-02 09:43:01.106345893 +0000 UTC m=+0.118655264 container start a7e373e25adab2e86802f7aa2ef48beacbb2b6f7ee2f40ba0c9f787b70bdf3e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'ffa28ff7313f2a8e23ca9e2d89c0c3c7b9d3df09b3fadbb856f3e15857e6d8de'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:43:01 np0005541913.localdomain podman[256479]: neutron_sriov_agent
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + sudo -E kolla_set_configs
Dec 02 09:43:01 np0005541913.localdomain systemd[1]: Started neutron_sriov_agent container.
Dec 02 09:43:01 np0005541913.localdomain sudo[256383]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Validating config file
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Copying service configuration files
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Writing out command to execute
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/595e1c9b-709c-41d2-9212-0b18b13291a8.pid.haproxy
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/595e1c9b-709c-41d2-9212-0b18b13291a8.conf
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: ++ cat /run_command
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + CMD=/usr/bin/neutron-sriov-nic-agent
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + ARGS=
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + sudo kolla_copy_cacerts
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + [[ ! -n '' ]]
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + . kolla_extend_start
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + umask 0022
Dec 02 09:43:01 np0005541913.localdomain neutron_sriov_agent[256494]: + exec /usr/bin/neutron-sriov-nic-agent
Dec 02 09:43:01 np0005541913.localdomain sshd[251149]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:43:01 np0005541913.localdomain systemd-logind[757]: Session 58 logged out. Waiting for processes to exit.
Dec 02 09:43:01 np0005541913.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Dec 02 09:43:01 np0005541913.localdomain systemd[1]: session-58.scope: Consumed 23.256s CPU time.
Dec 02 09:43:01 np0005541913.localdomain systemd-logind[757]: Removed session 58.
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.763 2 INFO neutron.common.config [-] Logging enabled!
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.763 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.764 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.764 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.764 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.764 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.764 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005541913.localdomain'}
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.765 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eb369f77-4d91-450c-9cf1-08efd00c7291 - - - - - -] RPC agent_id: nic-switch-agent.np0005541913.localdomain
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.769 2 INFO neutron.agent.agent_extensions_manager [None req-eb369f77-4d91-450c-9cf1-08efd00c7291 - - - - - -] Loaded agent extensions: ['qos']
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.769 2 INFO neutron.agent.agent_extensions_manager [None req-eb369f77-4d91-450c-9cf1-08efd00c7291 - - - - - -] Initializing agent extension 'qos'
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.918 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eb369f77-4d91-450c-9cf1-08efd00c7291 - - - - - -] Agent initialized successfully, now running... 
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.919 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eb369f77-4d91-450c-9cf1-08efd00c7291 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Dec 02 09:43:02 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 09:43:02.919 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eb369f77-4d91-450c-9cf1-08efd00c7291 - - - - - -] Agent out of sync with plugin!
Dec 02 09:43:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:43:03.026 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:43:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:43:03.026 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:43:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:43:03.027 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:43:03 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:03.213 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39174 DF PROTO=TCP SPT=37708 DPT=9102 SEQ=1820381988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A145800000000001030307) 
Dec 02 09:43:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:43:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:43:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:43:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:43:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:43:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:43:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:43:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39175 DF PROTO=TCP SPT=37708 DPT=9102 SEQ=1820381988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A149A40000000001030307) 
Dec 02 09:43:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59704 DF PROTO=TCP SPT=43938 DPT=9102 SEQ=553238560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A14BE50000000001030307) 
Dec 02 09:43:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:05.770 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:43:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:43:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:43:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146520 "" "Go-http-client/1.1"
Dec 02 09:43:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:43:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16781 "" "Go-http-client/1.1"
Dec 02 09:43:06 np0005541913.localdomain sshd[256527]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:43:06 np0005541913.localdomain sshd[256527]: Accepted publickey for zuul from 192.168.122.30 port 48796 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:43:06 np0005541913.localdomain systemd-logind[757]: New session 59 of user zuul.
Dec 02 09:43:06 np0005541913.localdomain systemd[1]: Started Session 59 of User zuul.
Dec 02 09:43:06 np0005541913.localdomain sshd[256527]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:43:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39176 DF PROTO=TCP SPT=37708 DPT=9102 SEQ=1820381988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A151A40000000001030307) 
Dec 02 09:43:08 np0005541913.localdomain python3.9[256638]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:43:08 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10455 DF PROTO=TCP SPT=53162 DPT=9102 SEQ=2311503365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A155E50000000001030307) 
Dec 02 09:43:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:08.273 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:08 np0005541913.localdomain sudo[256750]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjkaxakyjeznnhfithdsbdbjhzrjxtgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668588.7113898-66-276731374063936/AnsiballZ_setup.py
Dec 02 09:43:08 np0005541913.localdomain sudo[256750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:09 np0005541913.localdomain python3.9[256752]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:43:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:43:09 np0005541913.localdomain systemd[1]: tmp-crun.LnnmmC.mount: Deactivated successfully.
Dec 02 09:43:09 np0005541913.localdomain podman[256759]: 2025-12-02 09:43:09.462728466 +0000 UTC m=+0.100625508 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 09:43:09 np0005541913.localdomain podman[256759]: 2025-12-02 09:43:09.476053526 +0000 UTC m=+0.113950558 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 09:43:09 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:43:09 np0005541913.localdomain sudo[256750]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:09.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:09.722 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:43:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:09.723 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:09.749 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:43:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:09.750 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:43:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:09.750 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:43:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:09.750 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:43:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:09.751 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:43:09 np0005541913.localdomain sudo[256831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fivzohsrtvvsjniuzvocsnuvqfblobie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668588.7113898-66-276731374063936/AnsiballZ_dnf.py
Dec 02 09:43:09 np0005541913.localdomain sudo[256831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:10 np0005541913.localdomain python3.9[256852]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.177 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.241 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.242 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.450 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.452 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12227MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.453 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.453 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.520 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.520 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.521 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.566 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:43:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:10.774 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:11.027 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:43:11 np0005541913.localdomain rsyslogd[754]: imjournal: 6666 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 02 09:43:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:11.034 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:43:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:11.052 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:43:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:11.055 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:43:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:11.055 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:43:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39177 DF PROTO=TCP SPT=37708 DPT=9102 SEQ=1820381988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A161650000000001030307) 
Dec 02 09:43:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:43:12 np0005541913.localdomain systemd[1]: tmp-crun.T5NsoB.mount: Deactivated successfully.
Dec 02 09:43:12 np0005541913.localdomain podman[256879]: 2025-12-02 09:43:12.458426984 +0000 UTC m=+0.096250970 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:43:12 np0005541913.localdomain podman[256879]: 2025-12-02 09:43:12.466004238 +0000 UTC m=+0.103828244 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 09:43:12 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.057 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.058 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.058 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.058 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.119 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.119 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.120 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.120 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.275 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.460 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.658 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.659 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.660 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.660 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.661 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:13 np0005541913.localdomain sudo[256831]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:13.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:14 np0005541913.localdomain sudo[257004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shakhsfyratrkwwocusqysiditoazyjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668593.827317-102-236186159424827/AnsiballZ_systemd.py
Dec 02 09:43:14 np0005541913.localdomain sudo[257004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:14 np0005541913.localdomain python3.9[257006]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 09:43:14 np0005541913.localdomain sudo[257004]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:15.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:15.776 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:16 np0005541913.localdomain sudo[257117]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsucmesauukozoijznjciyzhrrixkmsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668595.9785125-129-261210175828783/AnsiballZ_file.py
Dec 02 09:43:16 np0005541913.localdomain sudo[257117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:16 np0005541913.localdomain python3.9[257119]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:16 np0005541913.localdomain sudo[257117]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:16.718 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:43:17 np0005541913.localdomain sudo[257227]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbblbxeprzkqrffsyrlytijnhpynffvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668596.7519925-129-107885795488930/AnsiballZ_file.py
Dec 02 09:43:17 np0005541913.localdomain sudo[257227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:43:17 np0005541913.localdomain podman[257230]: 2025-12-02 09:43:17.148537096 +0000 UTC m=+0.089369264 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, config_id=edpm, vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 09:43:17 np0005541913.localdomain podman[257230]: 2025-12-02 09:43:17.167207239 +0000 UTC m=+0.108039437 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 09:43:17 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:43:17 np0005541913.localdomain python3.9[257229]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:17 np0005541913.localdomain sudo[257227]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:17 np0005541913.localdomain sudo[257357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nelkyejmebsmzujvsuwjxmaylslyrolh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668597.3919184-129-12489039588836/AnsiballZ_file.py
Dec 02 09:43:17 np0005541913.localdomain sudo[257357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:17 np0005541913.localdomain python3.9[257359]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:17 np0005541913.localdomain sudo[257357]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:18 np0005541913.localdomain sudo[257467]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldtwtcgcfidwtnnqgooetgwmuwbahkmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668597.9656296-129-11871585882563/AnsiballZ_file.py
Dec 02 09:43:18 np0005541913.localdomain sudo[257467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:18.317 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:18 np0005541913.localdomain python3.9[257469]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:18 np0005541913.localdomain sudo[257467]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:18 np0005541913.localdomain sudo[257577]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxefmhmvtezdfjfrraxbecrmgdbgypzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668598.5975718-129-66054016185074/AnsiballZ_file.py
Dec 02 09:43:18 np0005541913.localdomain sudo[257577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:19 np0005541913.localdomain python3.9[257579]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:19 np0005541913.localdomain sudo[257577]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39178 DF PROTO=TCP SPT=37708 DPT=9102 SEQ=1820381988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A181E40000000001030307) 
Dec 02 09:43:19 np0005541913.localdomain sudo[257687]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfptjvuqvozdkwxcthcohohsgeckvyop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668599.1957111-129-86047816203378/AnsiballZ_file.py
Dec 02 09:43:19 np0005541913.localdomain sudo[257687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:19 np0005541913.localdomain python3.9[257689]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:19 np0005541913.localdomain sudo[257687]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:20 np0005541913.localdomain sudo[257797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spmxcrivsyhonzeiudkvmlxwyhactmwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668599.7812676-129-183325675083951/AnsiballZ_file.py
Dec 02 09:43:20 np0005541913.localdomain sudo[257797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:43:20 np0005541913.localdomain systemd[1]: tmp-crun.2RDqvo.mount: Deactivated successfully.
Dec 02 09:43:20 np0005541913.localdomain podman[257799]: 2025-12-02 09:43:20.177875671 +0000 UTC m=+0.097299678 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:43:20 np0005541913.localdomain podman[257799]: 2025-12-02 09:43:20.194069548 +0000 UTC m=+0.113493615 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:43:20 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:43:20 np0005541913.localdomain python3.9[257800]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:20 np0005541913.localdomain sudo[257797]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:20 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:20.779 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:20 np0005541913.localdomain sudo[257928]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcnsxzjhomyiotprcfmmrgkedppaaodv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668600.5928473-279-193908501822782/AnsiballZ_stat.py
Dec 02 09:43:20 np0005541913.localdomain sudo[257928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:21 np0005541913.localdomain python3.9[257930]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:21 np0005541913.localdomain sudo[257928]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:21 np0005541913.localdomain sudo[258016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aipvfknflzfcvrgkgxdvhxggkzwjerne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668600.5928473-279-193908501822782/AnsiballZ_copy.py
Dec 02 09:43:21 np0005541913.localdomain sudo[258016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:21 np0005541913.localdomain python3.9[258018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668600.5928473-279-193908501822782/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:21 np0005541913.localdomain sudo[258016]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:22 np0005541913.localdomain python3.9[258126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:22 np0005541913.localdomain python3.9[258212]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668602.0470908-324-45132693326891/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:23 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:23.321 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:23 np0005541913.localdomain python3.9[258320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:24 np0005541913.localdomain python3.9[258406]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668603.1381304-324-133461508350307/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:24 np0005541913.localdomain python3.9[258514]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:25 np0005541913.localdomain python3.9[258600]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668604.2490718-324-213312502282350/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=694f6cc59ea78cd881696e3f3cdb6845e6a84456 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:25 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:25.782 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:26 np0005541913.localdomain python3.9[258708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:26 np0005541913.localdomain python3.9[258794]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668605.9838195-498-258976508411/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=d6e803f833d8b5f768d3a3c0112defa742aeec55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:43:27 np0005541913.localdomain podman[258901]: 2025-12-02 09:43:27.441934685 +0000 UTC m=+0.077612367 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:43:27 np0005541913.localdomain podman[258901]: 2025-12-02 09:43:27.45584664 +0000 UTC m=+0.091524322 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:43:27 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:43:27 np0005541913.localdomain python3.9[258903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:28 np0005541913.localdomain python3.9[259009]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668607.1376557-543-82289181862901/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:28 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:28.368 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:28 np0005541913.localdomain python3.9[259117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:29 np0005541913.localdomain python3.9[259203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668608.1425648-543-6165996118472/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:29 np0005541913.localdomain python3.9[259311]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:43:29 np0005541913.localdomain systemd[1]: tmp-crun.IsRqRE.mount: Deactivated successfully.
Dec 02 09:43:29 np0005541913.localdomain podman[259312]: 2025-12-02 09:43:29.968742462 +0000 UTC m=+0.093413463 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:43:29 np0005541913.localdomain podman[259312]: 2025-12-02 09:43:29.980930472 +0000 UTC m=+0.105601513 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:43:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:43:29 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:43:30 np0005541913.localdomain podman[259353]: 2025-12-02 09:43:30.085850614 +0000 UTC m=+0.079430495 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Dec 02 09:43:30 np0005541913.localdomain podman[259353]: 2025-12-02 09:43:30.122090522 +0000 UTC m=+0.115670403 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:43:30 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:43:30 np0005541913.localdomain python3.9[259405]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:30 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:30.784 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:30 np0005541913.localdomain python3.9[259522]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:31 np0005541913.localdomain python3.9[259608]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668610.4084315-630-163921474061312/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:31 np0005541913.localdomain python3.9[259716]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:43:32 np0005541913.localdomain sudo[259826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hboyoostascpuwmttogkcitbeffddrtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668612.2281728-735-3783184904107/AnsiballZ_file.py
Dec 02 09:43:32 np0005541913.localdomain sudo[259826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:32 np0005541913.localdomain python3.9[259828]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:32 np0005541913.localdomain sudo[259826]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:33 np0005541913.localdomain sudo[259936]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlsuucwsnwkkzeaqzspyraorgocngued ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668612.8574739-759-274096747373759/AnsiballZ_stat.py
Dec 02 09:43:33 np0005541913.localdomain sudo[259936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:33 np0005541913.localdomain python3.9[259938]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:33 np0005541913.localdomain sudo[259936]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:33 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:33.371 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:33 np0005541913.localdomain sudo[259993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qejyuevanvfasxycpiwtqhndlpeaodzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668612.8574739-759-274096747373759/AnsiballZ_file.py
Dec 02 09:43:33 np0005541913.localdomain sudo[259993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:33 np0005541913.localdomain python3.9[259995]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:33 np0005541913.localdomain sudo[259993]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41767 DF PROTO=TCP SPT=34180 DPT=9102 SEQ=2874539780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A1BAAD0000000001030307) 
Dec 02 09:43:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:43:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:43:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:43:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:43:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:43:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:43:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:43:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:43:34 np0005541913.localdomain sudo[260104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apkmymhurdwigpvqkrgvcbntezpcihwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668613.9080992-759-49389756785003/AnsiballZ_stat.py
Dec 02 09:43:34 np0005541913.localdomain sudo[260104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:34 np0005541913.localdomain python3.9[260106]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:34 np0005541913.localdomain sudo[260104]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:34 np0005541913.localdomain sudo[260161]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sophewifvpjkffyiwiymeleqzhgclqro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668613.9080992-759-49389756785003/AnsiballZ_file.py
Dec 02 09:43:34 np0005541913.localdomain sudo[260161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:34 np0005541913.localdomain python3.9[260163]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:34 np0005541913.localdomain sudo[260161]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41768 DF PROTO=TCP SPT=34180 DPT=9102 SEQ=2874539780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A1BEA40000000001030307) 
Dec 02 09:43:35 np0005541913.localdomain sudo[260271]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxxyupvytdszpzdarkgafvmfonhauxib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668614.9337666-828-180195715781451/AnsiballZ_file.py
Dec 02 09:43:35 np0005541913.localdomain sudo[260271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:35 np0005541913.localdomain python3.9[260273]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:35 np0005541913.localdomain sudo[260271]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39179 DF PROTO=TCP SPT=37708 DPT=9102 SEQ=1820381988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A1C1E40000000001030307) 
Dec 02 09:43:35 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:35.809 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:35 np0005541913.localdomain sudo[260381]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzkjkfwbvtstsflkbmzwqakgdgqtfxta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668615.5890355-852-182365057749040/AnsiballZ_stat.py
Dec 02 09:43:35 np0005541913.localdomain sudo[260381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:43:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:43:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:43:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146520 "" "Go-http-client/1.1"
Dec 02 09:43:36 np0005541913.localdomain python3.9[260383]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:43:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16781 "" "Go-http-client/1.1"
Dec 02 09:43:36 np0005541913.localdomain sudo[260381]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:36 np0005541913.localdomain sudo[260438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcdcerhddpifjpgwibjolazhgfjnpxgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668615.5890355-852-182365057749040/AnsiballZ_file.py
Dec 02 09:43:36 np0005541913.localdomain sudo[260438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:36 np0005541913.localdomain python3.9[260440]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:36 np0005541913.localdomain sudo[260438]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:36 np0005541913.localdomain sudo[260548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwvlsgelqagpdcvgbvzokivbcyogfpmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668616.6999598-888-58815910376120/AnsiballZ_stat.py
Dec 02 09:43:36 np0005541913.localdomain sudo[260548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41769 DF PROTO=TCP SPT=34180 DPT=9102 SEQ=2874539780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A1C6A40000000001030307) 
Dec 02 09:43:37 np0005541913.localdomain python3.9[260550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:37 np0005541913.localdomain sudo[260548]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:37 np0005541913.localdomain sudo[260605]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quwczesegzxfflqsfcepjpgdmodwecog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668616.6999598-888-58815910376120/AnsiballZ_file.py
Dec 02 09:43:37 np0005541913.localdomain sudo[260605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:37 np0005541913.localdomain python3.9[260607]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:37 np0005541913.localdomain sudo[260605]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59705 DF PROTO=TCP SPT=43938 DPT=9102 SEQ=553238560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A1C9E40000000001030307) 
Dec 02 09:43:38 np0005541913.localdomain sudo[260715]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsrtvmjlbnycndacaaouabxaxhojzpda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668617.7654681-924-201515973343975/AnsiballZ_systemd.py
Dec 02 09:43:38 np0005541913.localdomain sudo[260715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:38 np0005541913.localdomain python3.9[260717]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:43:38 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:38.399 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:38 np0005541913.localdomain systemd-rc-local-generator[260743]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:43:38 np0005541913.localdomain systemd-sysv-generator[260747]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:38 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:38 np0005541913.localdomain sudo[260715]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:39 np0005541913.localdomain sudo[260863]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfbadxycrrbtdcgzsnhwagdydtzaxcyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668618.9674246-948-110583836109484/AnsiballZ_stat.py
Dec 02 09:43:39 np0005541913.localdomain sudo[260863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:39 np0005541913.localdomain python3.9[260865]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:39 np0005541913.localdomain sudo[260863]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:39 np0005541913.localdomain sudo[260920]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndccmrpbkhbtvqmljmjdzkzawgbfhsir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668618.9674246-948-110583836109484/AnsiballZ_file.py
Dec 02 09:43:39 np0005541913.localdomain sudo[260920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:43:39 np0005541913.localdomain systemd[1]: tmp-crun.YdzhaG.mount: Deactivated successfully.
Dec 02 09:43:39 np0005541913.localdomain podman[260923]: 2025-12-02 09:43:39.84649816 +0000 UTC m=+0.109835076 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:43:39 np0005541913.localdomain podman[260923]: 2025-12-02 09:43:39.883836629 +0000 UTC m=+0.147173565 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 09:43:39 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:43:39 np0005541913.localdomain python3.9[260922]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:39 np0005541913.localdomain sudo[260920]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:40 np0005541913.localdomain sudo[261047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdmdpixuajluonksrolqrwzzfmkogmpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668620.1003258-984-134986321108034/AnsiballZ_stat.py
Dec 02 09:43:40 np0005541913.localdomain sudo[261047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:40 np0005541913.localdomain python3.9[261049]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:40 np0005541913.localdomain sudo[261047]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:40 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:40.810 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:40 np0005541913.localdomain sudo[261104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdxneiupogpktqjccrtmtnbkbtqcekik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668620.1003258-984-134986321108034/AnsiballZ_file.py
Dec 02 09:43:40 np0005541913.localdomain sudo[261104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41770 DF PROTO=TCP SPT=34180 DPT=9102 SEQ=2874539780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A1D6640000000001030307) 
Dec 02 09:43:41 np0005541913.localdomain python3.9[261106]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:41 np0005541913.localdomain sudo[261104]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:41 np0005541913.localdomain sudo[261214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdfzdmqvfpeoyaapzzdlvgcqexmgrmgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668621.2192264-1020-127853421198917/AnsiballZ_systemd.py
Dec 02 09:43:41 np0005541913.localdomain sudo[261214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:41 np0005541913.localdomain python3.9[261216]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:43:41 np0005541913.localdomain systemd-sysv-generator[261242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:43:41 np0005541913.localdomain systemd-rc-local-generator[261239]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:41 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:42 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:42 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 09:43:42 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 09:43:42 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 09:43:42 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 09:43:42 np0005541913.localdomain sudo[261214]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:42 np0005541913.localdomain sudo[261365]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dykbbmttwtwomxgxwvvpeebtuntqewln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668622.65923-1050-234502521472331/AnsiballZ_file.py
Dec 02 09:43:42 np0005541913.localdomain sudo[261365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:43:43 np0005541913.localdomain podman[261367]: 2025-12-02 09:43:43.013107231 +0000 UTC m=+0.076546598 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 02 09:43:43 np0005541913.localdomain podman[261367]: 2025-12-02 09:43:43.047091749 +0000 UTC m=+0.110531086 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:43:43 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:43:43 np0005541913.localdomain python3.9[261368]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:43:43 np0005541913.localdomain sudo[261365]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:43 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:43.402 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:43 np0005541913.localdomain sudo[261493]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzullovfbysigozvmgexuvksvzegkvzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668623.354793-1074-100794677055274/AnsiballZ_stat.py
Dec 02 09:43:43 np0005541913.localdomain sudo[261493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:43 np0005541913.localdomain python3.9[261495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:43:43 np0005541913.localdomain sudo[261493]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:44 np0005541913.localdomain sudo[261581]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldhpkrhklzsqmzstxasemylssjzjniks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668623.354793-1074-100794677055274/AnsiballZ_copy.py
Dec 02 09:43:44 np0005541913.localdomain sudo[261581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:44 np0005541913.localdomain python3.9[261583]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764668623.354793-1074-100794677055274/.source.json _original_basename=.ad3i5_vm follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:44 np0005541913.localdomain sudo[261581]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:44 np0005541913.localdomain sudo[261691]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znqmawavbcqllotmksyjoknurqwlrncy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668624.7056694-1119-13893799039871/AnsiballZ_file.py
Dec 02 09:43:44 np0005541913.localdomain sudo[261691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:45 np0005541913.localdomain python3.9[261693]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:45 np0005541913.localdomain sudo[261691]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:45 np0005541913.localdomain sudo[261801]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klulujbxlorqtfwnhbdxxjdlsqnxbqmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668625.3833342-1143-59836749088317/AnsiballZ_stat.py
Dec 02 09:43:45 np0005541913.localdomain sudo[261801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:45 np0005541913.localdomain sudo[261801]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:45 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:45.852 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:46 np0005541913.localdomain sudo[261889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaldtnnsuwbulsqppybrfzhxanjfrlxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668625.3833342-1143-59836749088317/AnsiballZ_copy.py
Dec 02 09:43:46 np0005541913.localdomain sudo[261889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:46 np0005541913.localdomain sudo[261889]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:47 np0005541913.localdomain sudo[261999]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcczuidhvzsbkeegnjxeubultehylicq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668626.7655377-1194-118651062464629/AnsiballZ_container_config_data.py
Dec 02 09:43:47 np0005541913.localdomain sudo[261999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:43:47 np0005541913.localdomain podman[262002]: 2025-12-02 09:43:47.311007365 +0000 UTC m=+0.092556820 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Dec 02 09:43:47 np0005541913.localdomain podman[262002]: 2025-12-02 09:43:47.350788899 +0000 UTC m=+0.132338274 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, distribution-scope=public)
Dec 02 09:43:47 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:43:47 np0005541913.localdomain python3.9[262001]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Dec 02 09:43:47 np0005541913.localdomain sudo[261999]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:48 np0005541913.localdomain sudo[262129]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqppgxgkfnmodlwhnbmyortrifvfgsbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668627.6604626-1221-190832458761223/AnsiballZ_container_config_hash.py
Dec 02 09:43:48 np0005541913.localdomain sudo[262129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:48 np0005541913.localdomain python3.9[262131]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:43:48 np0005541913.localdomain sudo[262129]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:48 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:48.441 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:48 np0005541913.localdomain sudo[262239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iypvouwbmbpdipkcndjlhijxrnjroirg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668628.5456867-1248-224693934700160/AnsiballZ_podman_container_info.py
Dec 02 09:43:48 np0005541913.localdomain sudo[262239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41771 DF PROTO=TCP SPT=34180 DPT=9102 SEQ=2874539780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A1F5E40000000001030307) 
Dec 02 09:43:49 np0005541913.localdomain python3.9[262241]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 09:43:49 np0005541913.localdomain sudo[262239]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:49 np0005541913.localdomain sudo[262286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:43:49 np0005541913.localdomain sudo[262286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:43:49 np0005541913.localdomain sudo[262286]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:49 np0005541913.localdomain sudo[262304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:43:49 np0005541913.localdomain sudo[262304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:43:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:43:50 np0005541913.localdomain systemd[1]: tmp-crun.RR2Xpp.mount: Deactivated successfully.
Dec 02 09:43:50 np0005541913.localdomain podman[262340]: 2025-12-02 09:43:50.471220853 +0000 UTC m=+0.099887928 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:43:50 np0005541913.localdomain podman[262340]: 2025-12-02 09:43:50.478367916 +0000 UTC m=+0.107035001 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:43:50 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:43:50 np0005541913.localdomain sudo[262304]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:50 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:50.854 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:51 np0005541913.localdomain sudo[262377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:43:51 np0005541913.localdomain sudo[262377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:43:51 np0005541913.localdomain sudo[262377]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:53 np0005541913.localdomain sudo[262485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvjpanseaqwqygefnbwggeqsnoycwogy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668632.974954-1287-95169929799166/AnsiballZ_edpm_container_manage.py
Dec 02 09:43:53 np0005541913.localdomain sudo[262485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:53 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:53.476 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:53 np0005541913.localdomain python3[262487]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:43:54 np0005541913.localdomain podman[262527]: 
Dec 02 09:43:54 np0005541913.localdomain podman[262527]: 2025-12-02 09:43:54.027543367 +0000 UTC m=+0.084088411 container create 12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, config_id=neutron_dhcp, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 09:43:54 np0005541913.localdomain podman[262527]: 2025-12-02 09:43:53.982137012 +0000 UTC m=+0.038682076 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 09:43:54 np0005541913.localdomain python3[262487]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume 
/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 09:43:54 np0005541913.localdomain sudo[262485]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:54 np0005541913.localdomain sudo[262671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlcavrlqzibnabbyzrvvgbxqzvlnizcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668634.3933506-1311-72719010614933/AnsiballZ_stat.py
Dec 02 09:43:54 np0005541913.localdomain sudo[262671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:54 np0005541913.localdomain python3.9[262673]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:43:54 np0005541913.localdomain sudo[262671]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:55 np0005541913.localdomain sudo[262783]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwacusmgrovuosfzbkqzswlczfkipvjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668635.122471-1338-15997993813318/AnsiballZ_file.py
Dec 02 09:43:55 np0005541913.localdomain sudo[262783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:55 np0005541913.localdomain python3.9[262785]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:55 np0005541913.localdomain sudo[262783]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:55 np0005541913.localdomain sudo[262838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hoogismeivuueqguurxgepshooezcvcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668635.122471-1338-15997993813318/AnsiballZ_stat.py
Dec 02 09:43:55 np0005541913.localdomain sudo[262838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:55 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:55.898 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:56 np0005541913.localdomain python3.9[262840]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:43:56 np0005541913.localdomain sudo[262838]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:56 np0005541913.localdomain sudo[262947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-woovogbyrhmdoyyxqywstilpvlcfohuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668636.0603197-1338-56051706117139/AnsiballZ_copy.py
Dec 02 09:43:56 np0005541913.localdomain sudo[262947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:56 np0005541913.localdomain python3.9[262949]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668636.0603197-1338-56051706117139/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:43:56 np0005541913.localdomain sudo[262947]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:56 np0005541913.localdomain sudo[263002]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krwewkilsehufsaudlutmfoxayznxkct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668636.0603197-1338-56051706117139/AnsiballZ_systemd.py
Dec 02 09:43:56 np0005541913.localdomain sudo[263002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:57 np0005541913.localdomain python3.9[263004]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:43:57 np0005541913.localdomain systemd-rc-local-generator[263030]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:43:57 np0005541913.localdomain systemd-sysv-generator[263033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:43:57 np0005541913.localdomain sudo[263002]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:57 np0005541913.localdomain podman[263041]: 2025-12-02 09:43:57.678027512 +0000 UTC m=+0.084610395 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:43:57 np0005541913.localdomain podman[263041]: 2025-12-02 09:43:57.696066639 +0000 UTC m=+0.102649522 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 02 09:43:57 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:43:57 np0005541913.localdomain sudo[263112]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxoprvniloxainaehswwxrieduwymxme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668636.0603197-1338-56051706117139/AnsiballZ_systemd.py
Dec 02 09:43:57 np0005541913.localdomain sudo[263112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:43:58 np0005541913.localdomain python3.9[263114]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:43:58 np0005541913.localdomain systemd-rc-local-generator[263143]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:43:58 np0005541913.localdomain systemd-sysv-generator[263147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:43:58 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:43:58.506 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:43:58 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/684a301db60ff2842f4bca90ed755fa7238e90727cf60535fb02a8b75505de20/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 02 09:43:58 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/684a301db60ff2842f4bca90ed755fa7238e90727cf60535fb02a8b75505de20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 09:43:58 np0005541913.localdomain podman[263155]: 2025-12-02 09:43:58.776421597 +0000 UTC m=+0.149982321 container init 12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:43:58 np0005541913.localdomain podman[263155]: 2025-12-02 09:43:58.788086632 +0000 UTC m=+0.161647356 container start 12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=neutron_dhcp, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 09:43:58 np0005541913.localdomain podman[263155]: neutron_dhcp_agent
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + sudo -E kolla_set_configs
Dec 02 09:43:58 np0005541913.localdomain systemd[1]: Started neutron_dhcp_agent container.
Dec 02 09:43:58 np0005541913.localdomain sudo[263112]: pam_unix(sudo:session): session closed for user root
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Validating config file
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Copying service configuration files
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Writing out command to execute
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/595e1c9b-709c-41d2-9212-0b18b13291a8.pid.haproxy
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/595e1c9b-709c-41d2-9212-0b18b13291a8.conf
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: ++ cat /run_command
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + CMD=/usr/bin/neutron-dhcp-agent
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + ARGS=
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + sudo kolla_copy_cacerts
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + [[ ! -n '' ]]
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + . kolla_extend_start
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: Running command: '/usr/bin/neutron-dhcp-agent'
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + umask 0022
Dec 02 09:43:58 np0005541913.localdomain neutron_dhcp_agent[263169]: + exec /usr/bin/neutron-dhcp-agent
Dec 02 09:44:00 np0005541913.localdomain neutron_dhcp_agent[263169]: 2025-12-02 09:44:00.194 263173 INFO neutron.common.config [-] Logging enabled!
Dec 02 09:44:00 np0005541913.localdomain neutron_dhcp_agent[263169]: 2025-12-02 09:44:00.194 263173 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Dec 02 09:44:00 np0005541913.localdomain sudo[263291]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olfvosowvjuabmfmipnzprpwdetykxnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668640.0803485-1422-7005170316448/AnsiballZ_systemd.py
Dec 02 09:44:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:44:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:44:00 np0005541913.localdomain sudo[263291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:44:00 np0005541913.localdomain podman[263293]: 2025-12-02 09:44:00.439501256 +0000 UTC m=+0.079648221 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:44:00 np0005541913.localdomain podman[263293]: 2025-12-02 09:44:00.451279214 +0000 UTC m=+0.091426219 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:44:00 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:44:00 np0005541913.localdomain podman[263294]: 2025-12-02 09:44:00.510365239 +0000 UTC m=+0.148652594 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:44:00 np0005541913.localdomain podman[263294]: 2025-12-02 09:44:00.57708454 +0000 UTC m=+0.215371885 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 09:44:00 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:44:00 np0005541913.localdomain neutron_dhcp_agent[263169]: 2025-12-02 09:44:00.624 263173 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Dec 02 09:44:00 np0005541913.localdomain python3.9[263295]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:44:00 np0005541913.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Dec 02 09:44:00 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:00.904 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: libpod-12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa.scope: Deactivated successfully.
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: libpod-12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa.scope: Consumed 2.381s CPU time.
Dec 02 09:44:01 np0005541913.localdomain podman[263346]: 2025-12-02 09:44:01.366426471 +0000 UTC m=+0.607948214 container died 12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-684a301db60ff2842f4bca90ed755fa7238e90727cf60535fb02a8b75505de20-merged.mount: Deactivated successfully.
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa-userdata-shm.mount: Deactivated successfully.
Dec 02 09:44:01 np0005541913.localdomain podman[263346]: 2025-12-02 09:44:01.467593732 +0000 UTC m=+0.709115445 container cleanup 12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_dhcp, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 09:44:01 np0005541913.localdomain podman[263346]: neutron_dhcp_agent
Dec 02 09:44:01 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:01.526 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:44:01.522 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 09:44:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:44:01.523 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 09:44:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:44:01.525 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:44:01 np0005541913.localdomain podman[263384]: error opening file `/run/crun/12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa/status`: No such file or directory
Dec 02 09:44:01 np0005541913.localdomain podman[263373]: 2025-12-02 09:44:01.581447786 +0000 UTC m=+0.074150263 container cleanup 12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:44:01 np0005541913.localdomain podman[263373]: neutron_dhcp_agent
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:44:01 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/684a301db60ff2842f4bca90ed755fa7238e90727cf60535fb02a8b75505de20/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 02 09:44:01 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/684a301db60ff2842f4bca90ed755fa7238e90727cf60535fb02a8b75505de20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 09:44:01 np0005541913.localdomain podman[263388]: 2025-12-02 09:44:01.733996444 +0000 UTC m=+0.116686460 container init 12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=neutron_dhcp_agent, config_id=neutron_dhcp, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 09:44:01 np0005541913.localdomain podman[263388]: 2025-12-02 09:44:01.742221407 +0000 UTC m=+0.124911423 container start 12fb0e38ef14daf14b83aea79b0f1cf6caa8dbcc2691abc83accd331da1c7afa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b920ccd57f1789bca419f0fa8ccb82ff7492e8e96fd044e96be9ed18906b094d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=neutron_dhcp_agent, config_id=neutron_dhcp)
Dec 02 09:44:01 np0005541913.localdomain podman[263388]: neutron_dhcp_agent
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + sudo -E kolla_set_configs
Dec 02 09:44:01 np0005541913.localdomain systemd[1]: Started neutron_dhcp_agent container.
Dec 02 09:44:01 np0005541913.localdomain sudo[263291]: pam_unix(sudo:session): session closed for user root
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Validating config file
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Copying service configuration files
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Writing out command to execute
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/595e1c9b-709c-41d2-9212-0b18b13291a8.pid.haproxy
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/595e1c9b-709c-41d2-9212-0b18b13291a8.conf
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: ++ cat /run_command
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + CMD=/usr/bin/neutron-dhcp-agent
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + ARGS=
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + sudo kolla_copy_cacerts
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + [[ ! -n '' ]]
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + . kolla_extend_start
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: Running command: '/usr/bin/neutron-dhcp-agent'
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + umask 0022
Dec 02 09:44:01 np0005541913.localdomain neutron_dhcp_agent[263402]: + exec /usr/bin/neutron-dhcp-agent
Dec 02 09:44:02 np0005541913.localdomain sshd[256527]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:44:02 np0005541913.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Dec 02 09:44:02 np0005541913.localdomain systemd[1]: session-59.scope: Consumed 35.841s CPU time.
Dec 02 09:44:02 np0005541913.localdomain systemd-logind[757]: Session 59 logged out. Waiting for processes to exit.
Dec 02 09:44:02 np0005541913.localdomain systemd-logind[757]: Removed session 59.
Dec 02 09:44:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:44:03.026 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:44:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:44:03.027 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:44:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:44:03.028 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:44:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 09:44:03.083 263406 INFO neutron.common.config [-] Logging enabled!
Dec 02 09:44:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 09:44:03.084 263406 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Dec 02 09:44:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 09:44:03.494 263406 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Dec 02 09:44:03 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:03.549 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 09:44:03.609 263406 INFO neutron.agent.dhcp.agent [None req-0b0597ea-3ae0-4335-aeba-5e8799b0c87f - - - - - -] All active networks have been fetched through RPC.
Dec 02 09:44:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 09:44:03.610 263406 INFO neutron.agent.dhcp.agent [None req-0b0597ea-3ae0-4335-aeba-5e8799b0c87f - - - - - -] Synchronizing state complete
Dec 02 09:44:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 09:44:03.670 263406 INFO neutron.agent.dhcp.agent [None req-0b0597ea-3ae0-4335-aeba-5e8799b0c87f - - - - - -] DHCP agent started
Dec 02 09:44:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19940 DF PROTO=TCP SPT=51420 DPT=9102 SEQ=20700297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A22FDD0000000001030307) 
Dec 02 09:44:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:44:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:44:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:44:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:44:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:44:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19941 DF PROTO=TCP SPT=51420 DPT=9102 SEQ=20700297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A233E40000000001030307) 
Dec 02 09:44:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41772 DF PROTO=TCP SPT=34180 DPT=9102 SEQ=2874539780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A235E40000000001030307) 
Dec 02 09:44:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:05.904 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:44:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:44:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:44:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:44:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:44:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17212 "" "Go-http-client/1.1"
Dec 02 09:44:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19942 DF PROTO=TCP SPT=51420 DPT=9102 SEQ=20700297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A23BE40000000001030307) 
Dec 02 09:44:08 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39180 DF PROTO=TCP SPT=37708 DPT=9102 SEQ=1820381988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A23FE40000000001030307) 
Dec 02 09:44:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:08.554 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:09.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:09.740 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:44:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:09.741 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:44:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:09.741 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:44:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:09.741 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:44:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:09.741 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.178 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.243 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.245 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:44:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.447 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.448 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12137MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.448 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.449 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:44:10 np0005541913.localdomain podman[263457]: 2025-12-02 09:44:10.462769724 +0000 UTC m=+0.102064566 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:44:10 np0005541913.localdomain podman[263457]: 2025-12-02 09:44:10.468847368 +0000 UTC m=+0.108142200 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Dec 02 09:44:10 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.511 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.511 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.511 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.552 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.908 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.967 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:44:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:10.973 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:44:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:11.001 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:44:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:11.003 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:44:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:11.003 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:44:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19943 DF PROTO=TCP SPT=51420 DPT=9102 SEQ=20700297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A24BA50000000001030307) 
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.003 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.004 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.004 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.093 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.094 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.094 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.094 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:44:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:44:13 np0005541913.localdomain systemd[1]: tmp-crun.J2qoNh.mount: Deactivated successfully.
Dec 02 09:44:13 np0005541913.localdomain podman[263498]: 2025-12-02 09:44:13.452920983 +0000 UTC m=+0.093032644 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 02 09:44:13 np0005541913.localdomain podman[263498]: 2025-12-02 09:44:13.463086816 +0000 UTC m=+0.103198527 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 09:44:13 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.556 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.658 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.674 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.675 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.675 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.676 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.676 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.676 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.723 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:13.723 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:15.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:44:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:15.963 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.098 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.115 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.116 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2432c842-5932-4146-a31e-05b07f1101aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.099740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '777205e6-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.31881643, 'message_signature': '1b9e8271716cf249e2872f5ad75188c67b5782f4ca968a6f3923d281ee8e0249'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.099740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77721978-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.31881643, 'message_signature': '95ab67a0b65efec6df12b2cfab69e6592a38c9841bda3bd6d86afa903da97c0c'}]}, 'timestamp': '2025-12-02 09:44:16.116587', '_unique_id': '136de8464c364a21930f46e57a74cc64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.122 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8483e625-5372-4f0b-a905-c6ae24fb4171', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.119534', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '77731ce2-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': '41f13b4db67739bff79024b73ff7fa9beb36710a367a3bd374af36cff81877ad'}]}, 'timestamp': '2025-12-02 09:44:16.123252', '_unique_id': 'f9deacf16c6f4e03818a0356957295a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.125 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12e6a044-9164-40e2-8171-cd35479972f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.125555', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '77738bd2-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': '7d8bc1293356b0abcedf54f7f4a1fa95fd73a9710cfbebfcc666742d7ea7a064'}]}, 'timestamp': '2025-12-02 09:44:16.126086', '_unique_id': '24e8b85da7574942978059f4ff3487d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.128 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.144 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d007820-5df9-4b35-92a5-83a789a278d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:44:16.128358', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '777672fc-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.363536848, 'message_signature': '5cedcd6a686cd6aed04d1b58d9f0cab48718f09add36beb9bd7f0aa6459156c0'}]}, 'timestamp': '2025-12-02 09:44:16.145100', '_unique_id': '5691d983385048b4842914a313d1d934'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.146 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.147 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.148 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b61d48d2-c097-4769-a703-f623a1a2c7df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.147527', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7776e516-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.31881643, 'message_signature': '5693a51062c0e5e2933c3e755718760a54dc49aac7c9cf5ba468bf2fa225fc58'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.147527', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7776f556-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.31881643, 'message_signature': '81d60ea59dc2894884ef5bdd684d29410d469c0cfdc8efc72117c8eefbae8a96'}]}, 'timestamp': '2025-12-02 09:44:16.148407', '_unique_id': '7ac0fd400b8148b0ac59d525abb443ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.184 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.184 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d5c9d89-c900-4cb1-a9e2-fe356ce4f5d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.150790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '777c7cc4-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': 'b1f2070c610e5c88959bcaac72825118b352a89f9524e9e5360ed8b7ef15b501'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.150790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '777c9510-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '298b2b2c26ba56079ea31430e227eef9f033118f4dd3849655c0aec30bab2522'}]}, 'timestamp': '2025-12-02 09:44:16.185467', '_unique_id': '9f2da6c9af96435f97fa84285d5efe38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9065735d-d82f-44b0-9f54-836a4a04b366', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.187986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '777d109e-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '37f679005b7ffce19c2d92c7b3e7318565cf1a1205e58ea4099fceadb3623eb0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.187986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '777d2214-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '2c07d1411c3aecfca502b5eb69dadbf13c9bba6c73833f33f54c2e21df8605ca'}]}, 'timestamp': '2025-12-02 09:44:16.188882', '_unique_id': '73bb51af0fe7459fbbecbcfbf47e692b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.191 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '391ca724-89e8-4e44-ba53-bc836fa55ee7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.191078', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '777d8970-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': '00dbe912c159072f74650ee890a4697c84e35f4dd7c7ea8876a54d9e4607f93d'}]}, 'timestamp': '2025-12-02 09:44:16.191554', '_unique_id': '8ac9b5a5068f4f79b904a167956a7b9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.193 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '090b1dc7-c394-4342-affc-95c421cc267e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.193710', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '777df130-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': 'eb3ba91751cbfa8fec148fd16ffd202768c3268c80a8a0863d3caf7635ff8528'}]}, 'timestamp': '2025-12-02 09:44:16.194212', '_unique_id': '036e09cfd14f4b45984a11bc033e9be8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec6d958f-5711-4d93-977b-11a493839ddf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.196331', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '777e567a-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': '9470bcb0d6da37bee3951c2fb467fca90644016f96321d9fa8a70f1bb7cbc579'}]}, 'timestamp': '2025-12-02 09:44:16.196836', '_unique_id': 'bf5afa907e424a77840000c8b69e5c1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1294fad9-024a-4c67-bc85-37aad4216b0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.198963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '777ebd36-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '533ef7164702552a14ae3b4584c4cb0a14e520d3c58374661683380ad2e09684'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.198963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '777ecec0-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '7c2da2921eadd5ba986f19ab40f372290755770e847f2bfb3a09f27a1439c7a0'}]}, 'timestamp': '2025-12-02 09:44:16.199885', '_unique_id': 'c062fb1b57b94347bc2d554c0368d425'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.201 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.202 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 56560000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '756313cf-2dfc-44ac-b61d-d47a46ecdde6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56560000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:44:16.202103', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '777f37f2-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.363536848, 'message_signature': '11d3101e4706702e2b67bae10ccca8661f0e2cbb53ad40017b2934b1d7838bf5'}]}, 'timestamp': '2025-12-02 09:44:16.202561', '_unique_id': '2fbcdb89b4324faf94a4854e18c730bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.204 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.204 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5aa65c34-b701-433a-a0cf-0042b56577bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.204845', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '777fa304-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': '25821a683b7a2d6ea3e4987a392fcb968576313bf578f50d71daf16650a0cbce'}]}, 'timestamp': '2025-12-02 09:44:16.205317', '_unique_id': '5036fe0fd7434d189ab2830604007ef9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.207 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1433516318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.207 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 164656012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82c8a8db-a895-4fd0-bab0-d61eb18ed9dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1433516318, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.207434', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '778008b2-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '8b76e1af8c41a9a020e5806a1e51f4aa3a901d4d6f1a52ac4cfc652dbde5a0f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164656012, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.207434', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77801974-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '663d6430b311b0a8c9a6b89054cff3b555f356d9d85535951a56b0014fa1575e'}]}, 'timestamp': '2025-12-02 09:44:16.208323', '_unique_id': '30536de629a2486fb971ce5e897d19f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.210 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65e46cf7-34a8-4389-85f2-582ce73c2b4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.210529', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '77808256-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': '537ba4267a67e1105bef74dffff1b83ccf2277c063b585633ab3c7c54e40ef9c'}]}, 'timestamp': '2025-12-02 09:44:16.211032', '_unique_id': 'fdf1d771e9dd418ead4eeb660aa09e9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.213 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.213 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b04d2f65-a035-4bdd-baa4-2c3baa01a2b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.213282', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7780ec32-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.31881643, 'message_signature': '1c0112244d50a24af64cd32857673612fad515461e4e943cfb2e51be644b9f1d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.213282', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7780fee8-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.31881643, 'message_signature': '6752b1337bb796109f71e6906a4664ba0743471779d98135d5fc4ab067ab8960'}]}, 'timestamp': '2025-12-02 09:44:16.214195', '_unique_id': '5e4877e765cc45cf984774c88cf0a3d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.216 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.216 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 286697561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 39228582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5893091-afaa-40f8-ac77-e6829a440ec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 286697561, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.216554', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77816dc4-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '61c82db9099a949a3f91414b66420d4c071de222c5688181b85169363c93f435'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39228582, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.216554', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77817e18-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': 'b54c5f1023e63078a6a743119b634df92130450d1013b6346453e791ef5d29ed'}]}, 'timestamp': '2025-12-02 09:44:16.217490', '_unique_id': '715ba7c4f4fc4ae1b220499d98587431'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.219 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcbff0cd-b5f3-4cca-8c41-2a00fd8f318c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.219829', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7781ebf0-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': 'fb60635d1936f34389bf3c19ad4fc7deb7e720a5df8ce612ea3743fb02a3b92b'}]}, 'timestamp': '2025-12-02 09:44:16.220287', '_unique_id': '797a3eb0181c4f9c9c8b84aa4a4afaca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.222 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 9229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08701f33-a737-4317-b647-c73aff6a751b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9229, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.222697', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '77825c8e-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': '792e743f809909c17bacc015a0627c86a3f412c22d1739451d0a432d1ed0a229'}]}, 'timestamp': '2025-12-02 09:44:16.223172', '_unique_id': 'a12984e5c25a462a9b766df921b3154c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f615a03-7df6-4048-a875-48d5abae1c54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:44:16.225261', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7782bff8-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.338659026, 'message_signature': '8ff69b9532e7f25ec78a6ecfaf15e53c823b5e6e963c66e303a7f803bfe385ed'}]}, 'timestamp': '2025-12-02 09:44:16.225798', '_unique_id': '1483f89066014164b63024248f782f14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fd34916-13ad-4d2a-a5e4-b077aac11362', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:44:16.227576', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '778317fa-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': 'dc5f1a0bf6f2878fb088956b109bbfb17deb7e367b22cc476ded38d25476bf2b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:44:16.227576', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77832204-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10818.369868799, 'message_signature': '5109be235f1cfb44a8da73c49da1725bbe9a4641348d4c8e8ba2bf4f2973682e'}]}, 'timestamp': '2025-12-02 09:44:16.228121', '_unique_id': 'dfe209c04b2c4e4abc739ad66f2b8508'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.228 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:44:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:44:16.229 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:44:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:44:18 np0005541913.localdomain podman[263518]: 2025-12-02 09:44:18.440896134 +0000 UTC m=+0.084417650 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.)
Dec 02 09:44:18 np0005541913.localdomain podman[263518]: 2025-12-02 09:44:18.455022686 +0000 UTC m=+0.098544182 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:44:18 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:44:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:18.561 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19944 DF PROTO=TCP SPT=51420 DPT=9102 SEQ=20700297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A26BE40000000001030307) 
Dec 02 09:44:20 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:20.966 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:44:21 np0005541913.localdomain podman[263538]: 2025-12-02 09:44:21.438630295 +0000 UTC m=+0.077376819 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:44:21 np0005541913.localdomain podman[263538]: 2025-12-02 09:44:21.473282211 +0000 UTC m=+0.112028755 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:44:21 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:44:23 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:23.565 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:25 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:25.969 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:44:28 np0005541913.localdomain podman[263561]: 2025-12-02 09:44:28.435487608 +0000 UTC m=+0.072752725 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:44:28 np0005541913.localdomain podman[263561]: 2025-12-02 09:44:28.453992604 +0000 UTC m=+0.091257721 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 09:44:28 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:44:28 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:28.570 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:30 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:30.972 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:44:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:44:31 np0005541913.localdomain podman[263578]: 2025-12-02 09:44:31.454730882 +0000 UTC m=+0.094903140 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:44:31 np0005541913.localdomain podman[263578]: 2025-12-02 09:44:31.462927092 +0000 UTC m=+0.103099360 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:44:31 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:44:31 np0005541913.localdomain podman[263579]: 2025-12-02 09:44:31.50343467 +0000 UTC m=+0.140901265 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:44:31 np0005541913.localdomain podman[263579]: 2025-12-02 09:44:31.541203644 +0000 UTC m=+0.178670179 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:44:31 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:44:31 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:44:31Z|00047|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 02 09:44:33 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:33.573 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51716 DF PROTO=TCP SPT=35230 DPT=9102 SEQ=1998856449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A2A50D0000000001030307) 
Dec 02 09:44:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:44:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:44:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:44:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:44:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:44:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:44:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:44:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:44:34 np0005541913.localdomain rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 09:44:34 np0005541913.localdomain rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 02 09:44:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51717 DF PROTO=TCP SPT=35230 DPT=9102 SEQ=1998856449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A2A9250000000001030307) 
Dec 02 09:44:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19945 DF PROTO=TCP SPT=51420 DPT=9102 SEQ=20700297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A2ABE40000000001030307) 
Dec 02 09:44:35 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:35.975 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:44:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:44:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:44:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:44:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:44:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17222 "" "Go-http-client/1.1"
Dec 02 09:44:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51718 DF PROTO=TCP SPT=35230 DPT=9102 SEQ=1998856449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A2B1250000000001030307) 
Dec 02 09:44:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41773 DF PROTO=TCP SPT=34180 DPT=9102 SEQ=2874539780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A2B3E40000000001030307) 
Dec 02 09:44:38 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:38.610 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:41 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:41.012 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51719 DF PROTO=TCP SPT=35230 DPT=9102 SEQ=1998856449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A2C0E40000000001030307) 
Dec 02 09:44:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:44:41 np0005541913.localdomain podman[263629]: 2025-12-02 09:44:41.445236952 +0000 UTC m=+0.083782221 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 09:44:41 np0005541913.localdomain podman[263629]: 2025-12-02 09:44:41.48501205 +0000 UTC m=+0.123557269 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:44:41 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:44:43 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:43.665 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:44:44 np0005541913.localdomain podman[263649]: 2025-12-02 09:44:44.44495667 +0000 UTC m=+0.085222541 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 09:44:44 np0005541913.localdomain podman[263649]: 2025-12-02 09:44:44.480233927 +0000 UTC m=+0.120499818 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 09:44:44 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:44:46 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:46.058 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:48 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:48.668 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:44:49 np0005541913.localdomain podman[263666]: 2025-12-02 09:44:49.442584602 +0000 UTC m=+0.078261713 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, version=9.6)
Dec 02 09:44:49 np0005541913.localdomain podman[263666]: 2025-12-02 09:44:49.481143427 +0000 UTC m=+0.116820538 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec 02 09:44:49 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:44:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51720 DF PROTO=TCP SPT=35230 DPT=9102 SEQ=1998856449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A2E1E40000000001030307) 
Dec 02 09:44:51 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:51.097 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:51 np0005541913.localdomain sudo[263687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:44:51 np0005541913.localdomain sudo[263687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:44:51 np0005541913.localdomain sudo[263687]: pam_unix(sudo:session): session closed for user root
Dec 02 09:44:51 np0005541913.localdomain sudo[263705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:44:51 np0005541913.localdomain sudo[263705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:44:52 np0005541913.localdomain sudo[263705]: pam_unix(sudo:session): session closed for user root
Dec 02 09:44:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:44:52 np0005541913.localdomain podman[263755]: 2025-12-02 09:44:52.442829864 +0000 UTC m=+0.083690799 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:44:52 np0005541913.localdomain podman[263755]: 2025-12-02 09:44:52.451000213 +0000 UTC m=+0.091861118 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:44:52 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:44:52 np0005541913.localdomain sudo[263779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:44:52 np0005541913.localdomain sudo[263779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:44:52 np0005541913.localdomain sudo[263779]: pam_unix(sudo:session): session closed for user root
Dec 02 09:44:53 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:53.723 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:56 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:56.125 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:58 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:44:58.726 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:44:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:44:59 np0005541913.localdomain podman[263797]: 2025-12-02 09:44:59.443006389 +0000 UTC m=+0.082792374 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 02 09:44:59 np0005541913.localdomain podman[263797]: 2025-12-02 09:44:59.45902495 +0000 UTC m=+0.098810945 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:44:59 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:45:01 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:01.155 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:45:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:45:02 np0005541913.localdomain podman[263818]: 2025-12-02 09:45:02.434830146 +0000 UTC m=+0.077911373 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:45:02 np0005541913.localdomain podman[263818]: 2025-12-02 09:45:02.443093338 +0000 UTC m=+0.086174605 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:45:02 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:45:02 np0005541913.localdomain systemd[1]: tmp-crun.nFkIha.mount: Deactivated successfully.
Dec 02 09:45:02 np0005541913.localdomain podman[263819]: 2025-12-02 09:45:02.497843949 +0000 UTC m=+0.138961454 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 09:45:02 np0005541913.localdomain podman[263819]: 2025-12-02 09:45:02.581103495 +0000 UTC m=+0.222220960 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:45:02 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:45:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:45:03.028 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:45:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:45:03.028 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:45:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:45:03.030 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:45:03 np0005541913.localdomain sshd[263865]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:45:03 np0005541913.localdomain sshd[263865]: Accepted publickey for zuul from 192.168.122.30 port 55348 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:45:03 np0005541913.localdomain systemd-logind[757]: New session 60 of user zuul.
Dec 02 09:45:03 np0005541913.localdomain systemd[1]: Started Session 60 of User zuul.
Dec 02 09:45:03 np0005541913.localdomain sshd[263865]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:45:03 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:03.766 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63019 DF PROTO=TCP SPT=48422 DPT=9102 SEQ=1265174586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A31A3D0000000001030307) 
Dec 02 09:45:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:45:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:45:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:45:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:45:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:45:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:45:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:45:04 np0005541913.localdomain python3.9[263976]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:45:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63020 DF PROTO=TCP SPT=48422 DPT=9102 SEQ=1265174586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A31E640000000001030307) 
Dec 02 09:45:05 np0005541913.localdomain python3.9[264089]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:45:05 np0005541913.localdomain network[264106]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:45:05 np0005541913.localdomain network[264107]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:45:05 np0005541913.localdomain network[264108]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:45:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:05.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:05.722 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 02 09:45:05 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:05.747 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 02 09:45:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51721 DF PROTO=TCP SPT=35230 DPT=9102 SEQ=1998856449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A321E40000000001030307) 
Dec 02 09:45:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:45:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:45:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:45:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:45:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:45:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17215 "" "Go-http-client/1.1"
Dec 02 09:45:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:06.192 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:06 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:45:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63021 DF PROTO=TCP SPT=48422 DPT=9102 SEQ=1265174586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A326650000000001030307) 
Dec 02 09:45:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19946 DF PROTO=TCP SPT=51420 DPT=9102 SEQ=20700297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A329E40000000001030307) 
Dec 02 09:45:08 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:08.808 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:09.747 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:09.779 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:45:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:09.779 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:45:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:09.780 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:45:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:09.780 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:45:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:09.781 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:45:10 np0005541913.localdomain sudo[264360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcmdhgivtesrboelazrjjtnrmgukypzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668709.9002917-102-35089067462670/AnsiballZ_setup.py
Dec 02 09:45:10 np0005541913.localdomain sudo[264360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.246 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.312 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.313 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:45:10 np0005541913.localdomain python3.9[264362]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.487 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.488 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12125MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.489 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.489 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:45:10 np0005541913.localdomain sudo[264360]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.857 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.858 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.858 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:45:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:10.930 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.000 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.001 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.023 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.053 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_VOLUME_EXTEND,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,HW_CPU_X86_FMA3,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 09:45:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63022 DF PROTO=TCP SPT=48422 DPT=9102 SEQ=1265174586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A336240000000001030307) 
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.091 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:45:11 np0005541913.localdomain sudo[264425]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykvptuelfojknwtfsclirkkgntqcbvkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668709.9002917-102-35089067462670/AnsiballZ_dnf.py
Dec 02 09:45:11 np0005541913.localdomain sudo[264425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.224 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:11 np0005541913.localdomain python3.9[264427]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.521 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.529 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.549 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.551 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.551 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.552 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.552 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.752 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Triggering sync for uuid b254bb7f-2891-4b37-9c44-9700e301ce16 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.753 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.754 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.755 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:11.845 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:45:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:45:12 np0005541913.localdomain systemd[1]: tmp-crun.HOULG4.mount: Deactivated successfully.
Dec 02 09:45:12 np0005541913.localdomain podman[264452]: 2025-12-02 09:45:12.454966701 +0000 UTC m=+0.094162519 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:45:12 np0005541913.localdomain podman[264452]: 2025-12-02 09:45:12.467848858 +0000 UTC m=+0.107044736 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:45:12 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:45:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:12.793 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:13.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:13.722 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:45:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:13.722 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:45:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:13.853 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.184 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.184 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.185 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.185 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.538 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.555 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.556 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.557 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.557 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:14.723 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:14 np0005541913.localdomain sudo[264425]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:45:15 np0005541913.localdomain sudo[264578]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfqpytqhpkmchlcyigkdwdcdnqhbegno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668715.0042734-138-73733977205712/AnsiballZ_stat.py
Dec 02 09:45:15 np0005541913.localdomain sudo[264578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:15 np0005541913.localdomain podman[264579]: 2025-12-02 09:45:15.464498904 +0000 UTC m=+0.098030904 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 09:45:15 np0005541913.localdomain podman[264579]: 2025-12-02 09:45:15.500044718 +0000 UTC m=+0.133576668 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Dec 02 09:45:15 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:45:15 np0005541913.localdomain python3.9[264590]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:45:15 np0005541913.localdomain sudo[264578]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:15.718 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:15.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:16.263 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:16 np0005541913.localdomain sudo[264706]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecpomoqdbgcvixkrzofiugrtustodmjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668715.9016018-168-264122024468600/AnsiballZ_command.py
Dec 02 09:45:16 np0005541913.localdomain sudo[264706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:16 np0005541913.localdomain python3.9[264708]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:45:16 np0005541913.localdomain sudo[264706]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:16.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:17 np0005541913.localdomain sudo[264817]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsyqhytgkpjmooihuqxuscrzyltcaeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668716.8293834-198-56708539962298/AnsiballZ_stat.py
Dec 02 09:45:17 np0005541913.localdomain sudo[264817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:17 np0005541913.localdomain python3.9[264819]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:45:17 np0005541913.localdomain sudo[264817]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:17.718 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:45:18 np0005541913.localdomain sudo[264929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkojwjfzqlfyvqvnyecsfdyyfpqggfpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668717.6538463-231-61483426048747/AnsiballZ_lineinfile.py
Dec 02 09:45:18 np0005541913.localdomain sudo[264929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:18 np0005541913.localdomain python3.9[264931]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:18 np0005541913.localdomain sudo[264929]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:18.854 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:19 np0005541913.localdomain sudo[265039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdbkiwkqegnisirulsjgsiukgbejlkzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668718.5736737-258-76940183307619/AnsiballZ_systemd_service.py
Dec 02 09:45:19 np0005541913.localdomain sudo[265039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63023 DF PROTO=TCP SPT=48422 DPT=9102 SEQ=1265174586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A355E40000000001030307) 
Dec 02 09:45:19 np0005541913.localdomain python3.9[265041]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:45:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:45:20 np0005541913.localdomain systemd[1]: tmp-crun.lCJdOj.mount: Deactivated successfully.
Dec 02 09:45:20 np0005541913.localdomain podman[265043]: 2025-12-02 09:45:20.462636578 +0000 UTC m=+0.097050498 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Dec 02 09:45:20 np0005541913.localdomain podman[265043]: 2025-12-02 09:45:20.501133781 +0000 UTC m=+0.135547701 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, 
Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41)
Dec 02 09:45:20 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:45:20 np0005541913.localdomain sudo[265039]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:20 np0005541913.localdomain sudo[265173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-heqfozzxgykvpztaeeohemrzuvogxvgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668720.6905386-282-18984989228502/AnsiballZ_systemd_service.py
Dec 02 09:45:20 np0005541913.localdomain sudo[265173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:21 np0005541913.localdomain python3.9[265175]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:45:21 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:21.300 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:22 np0005541913.localdomain sudo[265173]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:23 np0005541913.localdomain sudo[265285]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndftkiptdwrkmnaosjbbklrwnuesdaid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668722.880225-315-208012741893930/AnsiballZ_service_facts.py
Dec 02 09:45:23 np0005541913.localdomain sudo[265285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:45:23 np0005541913.localdomain podman[265288]: 2025-12-02 09:45:23.248159843 +0000 UTC m=+0.086131665 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:45:23 np0005541913.localdomain podman[265288]: 2025-12-02 09:45:23.255224083 +0000 UTC m=+0.093195915 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:45:23 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:45:23 np0005541913.localdomain python3.9[265287]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:45:23 np0005541913.localdomain network[265327]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:45:23 np0005541913.localdomain network[265328]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:45:23 np0005541913.localdomain network[265329]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:45:23 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:23.908 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:45:26 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:26.313 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:26 np0005541913.localdomain sudo[265285]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:28 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:28.946 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:29 np0005541913.localdomain sudo[265561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrpphudbsiriketyhtgbeppfvcuvsbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668728.790699-345-134235471674796/AnsiballZ_file.py
Dec 02 09:45:29 np0005541913.localdomain sudo[265561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:29 np0005541913.localdomain python3.9[265563]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 09:45:29 np0005541913.localdomain sudo[265561]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:29 np0005541913.localdomain sudo[265671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-godxxpbnnqzohjlabiakourvblroywii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668729.5941505-369-175536004949341/AnsiballZ_modprobe.py
Dec 02 09:45:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:45:29 np0005541913.localdomain sudo[265671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:30 np0005541913.localdomain systemd[1]: tmp-crun.coWoKH.mount: Deactivated successfully.
Dec 02 09:45:30 np0005541913.localdomain podman[265673]: 2025-12-02 09:45:30.110872867 +0000 UTC m=+0.101666762 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Dec 02 09:45:30 np0005541913.localdomain podman[265673]: 2025-12-02 09:45:30.150128071 +0000 UTC m=+0.140921956 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Dec 02 09:45:30 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:45:30 np0005541913.localdomain python3.9[265674]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 02 09:45:30 np0005541913.localdomain sudo[265671]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:30 np0005541913.localdomain sudo[265799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmmvycfrjppvxtitaddoetqpbokaxxyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668730.4053369-393-130135626818990/AnsiballZ_stat.py
Dec 02 09:45:30 np0005541913.localdomain sudo[265799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:30 np0005541913.localdomain python3.9[265801]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:30 np0005541913.localdomain sudo[265799]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:31 np0005541913.localdomain sudo[265856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cymlafhtsplfuapwymudnwxafvamfhpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668730.4053369-393-130135626818990/AnsiballZ_file.py
Dec 02 09:45:31 np0005541913.localdomain sudo[265856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:31 np0005541913.localdomain python3.9[265858]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:31 np0005541913.localdomain sudo[265856]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:31 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:31.356 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:31 np0005541913.localdomain sudo[265966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amhmuecsbqgtvixgoxxgmdawnspimuxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668731.593038-432-52293445143586/AnsiballZ_lineinfile.py
Dec 02 09:45:31 np0005541913.localdomain sudo[265966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:32 np0005541913.localdomain python3.9[265968]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:32 np0005541913.localdomain sudo[265966]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:32 np0005541913.localdomain sudo[266076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfltlyxslqmqqujdgcdiclkcmxufgbfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668732.3283675-459-185852087126216/AnsiballZ_file.py
Dec 02 09:45:32 np0005541913.localdomain sudo[266076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:45:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:45:32 np0005541913.localdomain podman[266079]: 2025-12-02 09:45:32.661828721 +0000 UTC m=+0.059555630 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:45:32 np0005541913.localdomain systemd[1]: tmp-crun.mTSXyd.mount: Deactivated successfully.
Dec 02 09:45:32 np0005541913.localdomain podman[266080]: 2025-12-02 09:45:32.693968775 +0000 UTC m=+0.084195702 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 09:45:32 np0005541913.localdomain podman[266079]: 2025-12-02 09:45:32.719795338 +0000 UTC m=+0.117522337 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:45:32 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:45:32 np0005541913.localdomain podman[266080]: 2025-12-02 09:45:32.77195464 +0000 UTC m=+0.162181577 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:45:32 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:45:32 np0005541913.localdomain python3.9[266078]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:45:32 np0005541913.localdomain sudo[266076]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:33 np0005541913.localdomain sudo[266234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnnfraokidehqnjyxlgwypyxrpbtuizt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668733.101201-486-116573638098378/AnsiballZ_stat.py
Dec 02 09:45:33 np0005541913.localdomain sudo[266234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:33 np0005541913.localdomain python3.9[266236]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:45:33 np0005541913.localdomain sudo[266234]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31399 DF PROTO=TCP SPT=33290 DPT=9102 SEQ=3631968402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A38F6D0000000001030307) 
Dec 02 09:45:33 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:33.993 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:45:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:45:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:45:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:45:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:45:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:45:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:45:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:45:34 np0005541913.localdomain sudo[266346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsxtamgahvpajlrpcxnibrbjfgbolxht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668733.8125246-513-207974829369409/AnsiballZ_stat.py
Dec 02 09:45:34 np0005541913.localdomain sudo[266346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:34 np0005541913.localdomain python3.9[266348]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:45:34 np0005541913.localdomain sudo[266346]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:34 np0005541913.localdomain sudo[266458]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uintumtwknlxqondjuuzkqucbzwauptq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668734.5383005-540-135309479024273/AnsiballZ_command.py
Dec 02 09:45:34 np0005541913.localdomain sudo[266458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31400 DF PROTO=TCP SPT=33290 DPT=9102 SEQ=3631968402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A393640000000001030307) 
Dec 02 09:45:35 np0005541913.localdomain python3.9[266460]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:45:35 np0005541913.localdomain sudo[266458]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63024 DF PROTO=TCP SPT=48422 DPT=9102 SEQ=1265174586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A395E40000000001030307) 
Dec 02 09:45:35 np0005541913.localdomain sudo[266569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkkhcbxllftmrxhyebtlgjqjhogsrwyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668735.3053935-570-148627771695279/AnsiballZ_replace.py
Dec 02 09:45:35 np0005541913.localdomain sudo[266569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:35 np0005541913.localdomain python3.9[266571]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:35 np0005541913.localdomain sudo[266569]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:45:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:45:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:45:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:45:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:45:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17216 "" "Go-http-client/1.1"
Dec 02 09:45:36 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:36.399 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:36 np0005541913.localdomain sudo[266679]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfbbrqrhawpsmecjwrjyqubttrvbmrdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668736.1500707-597-180498040696044/AnsiballZ_lineinfile.py
Dec 02 09:45:36 np0005541913.localdomain sudo[266679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:36 np0005541913.localdomain python3.9[266681]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:36 np0005541913.localdomain sudo[266679]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31401 DF PROTO=TCP SPT=33290 DPT=9102 SEQ=3631968402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A39B640000000001030307) 
Dec 02 09:45:37 np0005541913.localdomain sudo[266789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvotclrnzasgpxqozvforqcutjtnpuyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668736.7633305-597-131062207780164/AnsiballZ_lineinfile.py
Dec 02 09:45:37 np0005541913.localdomain sudo[266789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:37 np0005541913.localdomain python3.9[266791]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:37 np0005541913.localdomain sudo[266789]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:37 np0005541913.localdomain sudo[266899]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orstjforhglsqvmofkiiyannpyjhhncr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668737.3513072-597-230300818946785/AnsiballZ_lineinfile.py
Dec 02 09:45:37 np0005541913.localdomain sudo[266899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:37 np0005541913.localdomain python3.9[266901]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:37 np0005541913.localdomain sudo[266899]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:38 np0005541913.localdomain sudo[267009]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdtwuimxtnwfinrlthvygcmitjbxfwbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668737.8876593-597-53822634728643/AnsiballZ_lineinfile.py
Dec 02 09:45:38 np0005541913.localdomain sudo[267009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:38 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51722 DF PROTO=TCP SPT=35230 DPT=9102 SEQ=1998856449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A39FE50000000001030307) 
Dec 02 09:45:38 np0005541913.localdomain python3.9[267011]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:38 np0005541913.localdomain sudo[267009]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:38 np0005541913.localdomain sudo[267119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-femajkoqdocryablctnhaopkhzdjwvse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668738.4320326-684-189672370451544/AnsiballZ_stat.py
Dec 02 09:45:38 np0005541913.localdomain sudo[267119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:38 np0005541913.localdomain python3.9[267121]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:45:38 np0005541913.localdomain sudo[267119]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:39 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:39.023 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:39 np0005541913.localdomain sudo[267231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjeizhlxypbkudgovbiwpinjftzkyscs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668739.3272061-714-240631080985906/AnsiballZ_file.py
Dec 02 09:45:39 np0005541913.localdomain sudo[267231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:39 np0005541913.localdomain python3.9[267233]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:45:39 np0005541913.localdomain sudo[267231]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:40 np0005541913.localdomain sudo[267341]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxbhknbpoebnucgxyuyxrjgmpmycfkbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668739.990462-738-216320346031049/AnsiballZ_stat.py
Dec 02 09:45:40 np0005541913.localdomain sudo[267341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:40 np0005541913.localdomain python3.9[267343]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:40 np0005541913.localdomain sudo[267341]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:40 np0005541913.localdomain sudo[267398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orinzwylgkynekerfposlxsiacuuwsza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668739.990462-738-216320346031049/AnsiballZ_file.py
Dec 02 09:45:40 np0005541913.localdomain sudo[267398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:40 np0005541913.localdomain python3.9[267400]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:45:40 np0005541913.localdomain sudo[267398]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31402 DF PROTO=TCP SPT=33290 DPT=9102 SEQ=3631968402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A3AB240000000001030307) 
Dec 02 09:45:41 np0005541913.localdomain sudo[267508]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxaxfqyvcsydhxmjpvpktyuptmpvhqcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668740.9927783-738-39661341473377/AnsiballZ_stat.py
Dec 02 09:45:41 np0005541913.localdomain sudo[267508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:41 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:41.402 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:41 np0005541913.localdomain python3.9[267510]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:41 np0005541913.localdomain sudo[267508]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:41 np0005541913.localdomain sudo[267565]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esvunkoasigcfppkczntoioftwxhyjrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668740.9927783-738-39661341473377/AnsiballZ_file.py
Dec 02 09:45:41 np0005541913.localdomain sudo[267565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:41 np0005541913.localdomain python3.9[267567]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:45:41 np0005541913.localdomain sudo[267565]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:42 np0005541913.localdomain sudo[267675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfjepikriezbkngehddjnmkaudpznfpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668742.1186657-807-117567268412561/AnsiballZ_file.py
Dec 02 09:45:42 np0005541913.localdomain sudo[267675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:42 np0005541913.localdomain python3.9[267677]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:42 np0005541913.localdomain sudo[267675]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:43 np0005541913.localdomain sudo[267785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tinytvcpliygxzarcghnxvvykqasywcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668742.7710757-831-144036035226539/AnsiballZ_stat.py
Dec 02 09:45:43 np0005541913.localdomain sudo[267785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:45:43 np0005541913.localdomain podman[267788]: 2025-12-02 09:45:43.142277642 +0000 UTC m=+0.082332683 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 02 09:45:43 np0005541913.localdomain podman[267788]: 2025-12-02 09:45:43.1530318 +0000 UTC m=+0.093086871 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 02 09:45:43 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:45:43 np0005541913.localdomain python3.9[267787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:43 np0005541913.localdomain sudo[267785]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:43 np0005541913.localdomain sudo[267862]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxxcelepunnmugmsoqdsplbiojhmwwtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668742.7710757-831-144036035226539/AnsiballZ_file.py
Dec 02 09:45:43 np0005541913.localdomain sudo[267862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:43 np0005541913.localdomain python3.9[267864]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:43 np0005541913.localdomain sudo[267862]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:44 np0005541913.localdomain sudo[267972]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhoamfxnqdlwxpndpxafqjvikgflkbof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668743.8166668-867-129775404528838/AnsiballZ_stat.py
Dec 02 09:45:44 np0005541913.localdomain sudo[267972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:44 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:44.084 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:44 np0005541913.localdomain python3.9[267974]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:44 np0005541913.localdomain sudo[267972]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:44 np0005541913.localdomain sudo[268029]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frbolipdpdoqdeylmwcmzxrkpxwdlerb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668743.8166668-867-129775404528838/AnsiballZ_file.py
Dec 02 09:45:44 np0005541913.localdomain sudo[268029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:44 np0005541913.localdomain python3.9[268031]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:44 np0005541913.localdomain sudo[268029]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:45 np0005541913.localdomain sudo[268139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjcbffvvvrsiuatpectgafkdntczooyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668744.944209-903-131830110868223/AnsiballZ_systemd.py
Dec 02 09:45:45 np0005541913.localdomain sudo[268139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:45 np0005541913.localdomain python3.9[268141]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:45:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:45:45 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:45:45 np0005541913.localdomain podman[268143]: 2025-12-02 09:45:45.926673937 +0000 UTC m=+0.099529874 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 02 09:45:45 np0005541913.localdomain systemd-sysv-generator[268189]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:45:45 np0005541913.localdomain systemd-rc-local-generator[268184]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:45:45 np0005541913.localdomain podman[268143]: 2025-12-02 09:45:45.960047113 +0000 UTC m=+0.132903070 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:45:45 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:45 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:45 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:45 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:45 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:45:46 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:46 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:46 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:46 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:46 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:45:46 np0005541913.localdomain sudo[268139]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:46 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:46.440 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:46 np0005541913.localdomain sudo[268305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvarsrdpyjjzpqwxgdsyljyhwydxmpto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668746.4008203-927-243023314112045/AnsiballZ_stat.py
Dec 02 09:45:46 np0005541913.localdomain sudo[268305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:46 np0005541913.localdomain python3.9[268307]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:46 np0005541913.localdomain sudo[268305]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:47 np0005541913.localdomain sudo[268362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsvsuwfjzjbrhuelkizrfqreybgsojzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668746.4008203-927-243023314112045/AnsiballZ_file.py
Dec 02 09:45:47 np0005541913.localdomain sudo[268362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:47 np0005541913.localdomain python3.9[268364]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:47 np0005541913.localdomain sudo[268362]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:47 np0005541913.localdomain sudo[268472]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nseowwfczqgvqeeqqxjwkvzazzkhverp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668747.5439522-963-102021455923175/AnsiballZ_stat.py
Dec 02 09:45:47 np0005541913.localdomain sudo[268472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:47 np0005541913.localdomain python3.9[268474]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:48 np0005541913.localdomain sudo[268472]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:48 np0005541913.localdomain sudo[268529]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkfbgbqijreaiseqbokuzugahepeasgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668747.5439522-963-102021455923175/AnsiballZ_file.py
Dec 02 09:45:48 np0005541913.localdomain sudo[268529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:48 np0005541913.localdomain python3.9[268531]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:48 np0005541913.localdomain sudo[268529]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:48 np0005541913.localdomain sudo[268639]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyacqvkwsaevfewsnvinbesojzhtebjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668748.6110911-999-73125527420172/AnsiballZ_systemd.py
Dec 02 09:45:48 np0005541913.localdomain sudo[268639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:49 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:49.137 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:49 np0005541913.localdomain python3.9[268641]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:45:49 np0005541913.localdomain systemd-rc-local-generator[268669]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:45:49 np0005541913.localdomain systemd-sysv-generator[268673]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31403 DF PROTO=TCP SPT=33290 DPT=9102 SEQ=3631968402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A3CBE40000000001030307) 
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:49 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:45:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:45:50 np0005541913.localdomain systemd[1]: Starting Create netns directory...
Dec 02 09:45:50 np0005541913.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 09:45:50 np0005541913.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 09:45:50 np0005541913.localdomain systemd[1]: Finished Create netns directory.
Dec 02 09:45:50 np0005541913.localdomain sudo[268639]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:50 np0005541913.localdomain podman[268682]: 2025-12-02 09:45:50.744322192 +0000 UTC m=+0.088483967 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Dec 02 09:45:50 np0005541913.localdomain podman[268682]: 2025-12-02 09:45:50.757914358 +0000 UTC m=+0.102076123 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 09:45:50 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:45:51 np0005541913.localdomain sudo[268812]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftxnphqnadgpwemfdwkqgugkhhyrgejv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668751.0419688-1029-99400959080288/AnsiballZ_file.py
Dec 02 09:45:51 np0005541913.localdomain sudo[268812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:51 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:51.441 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:51 np0005541913.localdomain python3.9[268814]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:45:51 np0005541913.localdomain sudo[268812]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:51 np0005541913.localdomain sudo[268922]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyteebfargjiemvhidwkkeasifrwfiua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668751.7111082-1053-22373776680217/AnsiballZ_stat.py
Dec 02 09:45:51 np0005541913.localdomain sudo[268922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:52 np0005541913.localdomain python3.9[268924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:52 np0005541913.localdomain sudo[268922]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:52 np0005541913.localdomain sudo[268979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beyyrvaowclvziveltzjwjhqpzrbjheo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668751.7111082-1053-22373776680217/AnsiballZ_file.py
Dec 02 09:45:52 np0005541913.localdomain sudo[268979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:52 np0005541913.localdomain python3.9[268981]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:45:52 np0005541913.localdomain sudo[268979]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:52 np0005541913.localdomain sudo[268999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:45:52 np0005541913.localdomain sudo[268999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:45:52 np0005541913.localdomain sudo[268999]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:53 np0005541913.localdomain sudo[269017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:45:53 np0005541913.localdomain sudo[269017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:45:53 np0005541913.localdomain sudo[269125]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drufzeezxixikvapfftnfcawstfhbtws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668753.0838802-1095-107481735428909/AnsiballZ_file.py
Dec 02 09:45:53 np0005541913.localdomain sudo[269125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:45:53 np0005541913.localdomain podman[269139]: 2025-12-02 09:45:53.45172521 +0000 UTC m=+0.086367241 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:45:53 np0005541913.localdomain podman[269139]: 2025-12-02 09:45:53.483255747 +0000 UTC m=+0.117897768 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:45:53 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:45:53 np0005541913.localdomain python3.9[269140]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:45:53 np0005541913.localdomain sudo[269125]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:53 np0005541913.localdomain sudo[269017]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:54 np0005541913.localdomain sudo[269289]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjotslvpsvszgznzhqptxtyhomccgevq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668753.8065307-1119-235372356328805/AnsiballZ_stat.py
Dec 02 09:45:54 np0005541913.localdomain sudo[269289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:54 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:54.182 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:54 np0005541913.localdomain python3.9[269291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:45:54 np0005541913.localdomain sudo[269289]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:54 np0005541913.localdomain sudo[269346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgrsaftprlmpatqlculevdztkgbhzxqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668753.8065307-1119-235372356328805/AnsiballZ_file.py
Dec 02 09:45:54 np0005541913.localdomain sudo[269346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:54 np0005541913.localdomain python3.9[269348]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.psqpkend recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:54 np0005541913.localdomain sudo[269346]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:55 np0005541913.localdomain sudo[269456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbyatzlclgdtqolnaalnzrmwsbqhzgyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668754.8953419-1155-206022203468007/AnsiballZ_file.py
Dec 02 09:45:55 np0005541913.localdomain sudo[269456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:55 np0005541913.localdomain python3.9[269458]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:45:55 np0005541913.localdomain sudo[269456]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:55 np0005541913.localdomain sudo[269566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzwwmuxzkorqqrorsqpzjzoopcazrvur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668755.5688133-1179-183590267000261/AnsiballZ_stat.py
Dec 02 09:45:55 np0005541913.localdomain sudo[269566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:56 np0005541913.localdomain sudo[269566]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:56 np0005541913.localdomain sudo[269623]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywdwiggyfnimrgzqdldrsfqriahwsbsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668755.5688133-1179-183590267000261/AnsiballZ_file.py
Dec 02 09:45:56 np0005541913.localdomain sudo[269623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:56 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:56.443 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:56 np0005541913.localdomain sudo[269623]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:57 np0005541913.localdomain sudo[269733]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovfzlngzydclmvjaxmheabsrkfxwkgxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668756.938464-1221-40691168339692/AnsiballZ_container_config_data.py
Dec 02 09:45:57 np0005541913.localdomain sudo[269733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:57 np0005541913.localdomain python3.9[269735]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 02 09:45:57 np0005541913.localdomain sudo[269733]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:57 np0005541913.localdomain sudo[269736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:45:57 np0005541913.localdomain sudo[269736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:45:57 np0005541913.localdomain sudo[269736]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:58 np0005541913.localdomain sudo[269861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kljgnhplpwfdwzulnjmidjsiiqgqaeav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668757.8680818-1248-268560169310088/AnsiballZ_container_config_hash.py
Dec 02 09:45:58 np0005541913.localdomain sudo[269861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:58 np0005541913.localdomain python3.9[269863]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:45:58 np0005541913.localdomain sudo[269861]: pam_unix(sudo:session): session closed for user root
Dec 02 09:45:59 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:45:59.223 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:45:59 np0005541913.localdomain sudo[269971]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojolexpuaetlipvsabakgtueptcafszi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668758.8520582-1275-23282422186062/AnsiballZ_podman_container_info.py
Dec 02 09:45:59 np0005541913.localdomain sudo[269971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:45:59 np0005541913.localdomain python3.9[269973]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 09:45:59 np0005541913.localdomain sudo[269971]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:46:00 np0005541913.localdomain systemd[1]: tmp-crun.KZUNbz.mount: Deactivated successfully.
Dec 02 09:46:00 np0005541913.localdomain podman[270015]: 2025-12-02 09:46:00.467441543 +0000 UTC m=+0.100285404 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:46:00 np0005541913.localdomain podman[270015]: 2025-12-02 09:46:00.477135744 +0000 UTC m=+0.109979625 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3)
Dec 02 09:46:00 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:46:01 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:01.446 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:46:03.028 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:46:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:46:03.030 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:46:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:46:03.031 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:46:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:46:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:46:03 np0005541913.localdomain systemd[1]: tmp-crun.2Y0RaC.mount: Deactivated successfully.
Dec 02 09:46:03 np0005541913.localdomain podman[270073]: 2025-12-02 09:46:03.44024645 +0000 UTC m=+0.076103555 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller)
Dec 02 09:46:03 np0005541913.localdomain podman[270072]: 2025-12-02 09:46:03.505571395 +0000 UTC m=+0.139923129 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:46:03 np0005541913.localdomain podman[270072]: 2025-12-02 09:46:03.520025643 +0000 UTC m=+0.154377347 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:46:03 np0005541913.localdomain podman[270073]: 2025-12-02 09:46:03.532053096 +0000 UTC m=+0.167910201 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:46:03 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:46:03 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:46:03 np0005541913.localdomain sudo[270173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjvectzoxpgqrjcnfsgzkxybuprjytcd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668763.258581-1314-152287478745654/AnsiballZ_edpm_container_manage.py
Dec 02 09:46:03 np0005541913.localdomain sudo[270173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36160 DF PROTO=TCP SPT=37952 DPT=9102 SEQ=513994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A4049E0000000001030307) 
Dec 02 09:46:03 np0005541913.localdomain python3[270175]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:46:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:46:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:46:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:46:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:46:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:46:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:46:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:46:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:04.225 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:04 np0005541913.localdomain python3[270175]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7",
                                                                    "Digest": "sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:11:02.031267563Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 249482216,
                                                                    "VirtualSize": 249482216,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:8c448567789503f6c5be645a12473dfc27734872532d528b6ee764c214f9f2f3"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:24.212273596Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:11:01.523582443Z",
                                                                              "created_by": "/bin/sh -c dnf -y install device-mapper-multipath iscsi-initiator-utils && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:11:03.162365736Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 02 09:46:04 np0005541913.localdomain sudo[270173]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:04 np0005541913.localdomain sudo[270348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aurxesntzfjxbwlzrnxktzjwqlbkokui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668764.5113604-1338-37582708765475/AnsiballZ_stat.py
Dec 02 09:46:04 np0005541913.localdomain sudo[270348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:04 np0005541913.localdomain python3.9[270350]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:46:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36161 DF PROTO=TCP SPT=37952 DPT=9102 SEQ=513994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A408A40000000001030307) 
Dec 02 09:46:04 np0005541913.localdomain sudo[270348]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:05 np0005541913.localdomain sudo[270460]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndgrlohkbpvftszmtxvtxxpybpuffgdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668765.193512-1365-138294843213162/AnsiballZ_file.py
Dec 02 09:46:05 np0005541913.localdomain sudo[270460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:05 np0005541913.localdomain python3.9[270462]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:05 np0005541913.localdomain sudo[270460]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31404 DF PROTO=TCP SPT=33290 DPT=9102 SEQ=3631968402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A40BE40000000001030307) 
Dec 02 09:46:05 np0005541913.localdomain sudo[270515]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lovvoehrkqbtvzyuhsxjmpzbmxocljur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668765.193512-1365-138294843213162/AnsiballZ_stat.py
Dec 02 09:46:05 np0005541913.localdomain sudo[270515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:46:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:46:06 np0005541913.localdomain python3.9[270517]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:46:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:46:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:46:06 np0005541913.localdomain sudo[270515]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:46:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17228 "" "Go-http-client/1.1"
Dec 02 09:46:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:06.449 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:06 np0005541913.localdomain sudo[270624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvfirujrkbtiozphcwsuhnjwicbibxje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668766.1321316-1365-260924313518560/AnsiballZ_copy.py
Dec 02 09:46:06 np0005541913.localdomain sudo[270624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:06 np0005541913.localdomain python3.9[270626]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668766.1321316-1365-260924313518560/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:06 np0005541913.localdomain sudo[270624]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36162 DF PROTO=TCP SPT=37952 DPT=9102 SEQ=513994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A410A40000000001030307) 
Dec 02 09:46:07 np0005541913.localdomain sudo[270679]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elbccgytcncijersmhwoavtoijqoxhdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668766.1321316-1365-260924313518560/AnsiballZ_systemd.py
Dec 02 09:46:07 np0005541913.localdomain sudo[270679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:07 np0005541913.localdomain python3.9[270681]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63025 DF PROTO=TCP SPT=48422 DPT=9102 SEQ=1265174586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A413E40000000001030307) 
Dec 02 09:46:08 np0005541913.localdomain sudo[270679]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:09.229 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:09 np0005541913.localdomain python3.9[270791]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:46:09 np0005541913.localdomain sudo[270899]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqqweefpvnzodezwmngfwylkxhcevubj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668769.6020646-1467-1319116174610/AnsiballZ_file.py
Dec 02 09:46:09 np0005541913.localdomain sudo[270899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:10 np0005541913.localdomain python3.9[270901]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:10 np0005541913.localdomain sudo[270899]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:10.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:10.746 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:46:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:10.746 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:46:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:10.746 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:46:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:10.747 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:46:10 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:10.747 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:46:10 np0005541913.localdomain sudo[271009]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cilvelrmondwmuhzzdlpbjscvugxywrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668770.540116-1503-242714215623603/AnsiballZ_file.py
Dec 02 09:46:10 np0005541913.localdomain sudo[271009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:10 np0005541913.localdomain python3.9[271012]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 09:46:10 np0005541913.localdomain sudo[271009]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36163 DF PROTO=TCP SPT=37952 DPT=9102 SEQ=513994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A420640000000001030307) 
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.191 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.255 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.257 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:46:11 np0005541913.localdomain sudo[271141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huqcjchtkslusapsbqmwzkzholwndctp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668771.176675-1527-55658470944725/AnsiballZ_modprobe.py
Dec 02 09:46:11 np0005541913.localdomain sudo[271141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.431 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.432 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12140MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.432 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.433 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.450 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.535 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.535 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.536 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:46:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:11.566 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:46:11 np0005541913.localdomain python3.9[271143]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 02 09:46:11 np0005541913.localdomain sudo[271141]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:12.106 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:46:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:12.113 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:46:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:12.128 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:46:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:12.131 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:46:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:12.131 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:46:12 np0005541913.localdomain sudo[271273]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvnqmzuwatqrxkezkiblxbznpzkebupo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668771.907612-1551-165061002148392/AnsiballZ_stat.py
Dec 02 09:46:12 np0005541913.localdomain sudo[271273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:12 np0005541913.localdomain python3.9[271275]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:46:12 np0005541913.localdomain sudo[271273]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:12 np0005541913.localdomain sudo[271330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdihkhyctbxtzxyrwfaxziqgrzsfnhia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668771.907612-1551-165061002148392/AnsiballZ_file.py
Dec 02 09:46:12 np0005541913.localdomain sudo[271330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:12 np0005541913.localdomain python3.9[271332]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:12 np0005541913.localdomain sudo[271330]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:13 np0005541913.localdomain sudo[271440]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnpdywnxjxpgdtaygwcwyexwkxqvdsih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668773.055854-1590-94436982380570/AnsiballZ_lineinfile.py
Dec 02 09:46:13 np0005541913.localdomain sudo[271440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:46:13 np0005541913.localdomain systemd[1]: tmp-crun.uk8zLm.mount: Deactivated successfully.
Dec 02 09:46:13 np0005541913.localdomain podman[271443]: 2025-12-02 09:46:13.382751341 +0000 UTC m=+0.080477712 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 09:46:13 np0005541913.localdomain podman[271443]: 2025-12-02 09:46:13.389502153 +0000 UTC m=+0.087228544 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 09:46:13 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:46:13 np0005541913.localdomain python3.9[271442]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:13 np0005541913.localdomain sudo[271440]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:14 np0005541913.localdomain sudo[271569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlfiaghbxypnpxagmjvqyowdakixszbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668773.859803-1617-96176830442440/AnsiballZ_dnf.py
Dec 02 09:46:14 np0005541913.localdomain sudo[271569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:14.266 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:14 np0005541913.localdomain python3.9[271571]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 09:46:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:15.133 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:15.134 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:15.134 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:46:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:15.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:15.722 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:46:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:15.723 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.098 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.113 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.114 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a89f84e-73d0-411c-b89a-7fe08893be7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.099271', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bef84baa-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.318343494, 'message_signature': '9af0cdd049b4aab894039072128f62f0e2d278bb316d1c0ffa4202d7b9f70b15'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.099271', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bef85f50-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.318343494, 'message_signature': 'da13a6892b3a6095e8944e1acc8b371cdba6acd319e502f57a970664da7e1f5d'}]}, 'timestamp': '2025-12-02 09:46:16.114841', '_unique_id': 'dbb03d2929be406db9e3579544390bb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.154 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1433516318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.155 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 164656012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cca7c219-e1ee-4324-82f7-3c03aaacb7e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1433516318, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.118038', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'befe94ba-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': '6baec9d85f340eee836a205dccbf4e75c5b1b5cb39c68fe6485f4adad2b578d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164656012, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.118038', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'befeaac2-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': 'e55978fd3b6ff0791eb251210cda871fb1ea4b951277978cfbb6935cc60c0998'}]}, 'timestamp': '2025-12-02 09:46:16.156051', '_unique_id': 'dedbd51a25f74a069799edde39cdc410'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.163 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1fa8267-7354-4c28-b0ad-4bc1688f9761', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.159096', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'beffe572-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': '38a3b426aeef1aa8bcb8e9828750a9f291a4a73c2b1e0dc13c52bbce6c4cd171'}]}, 'timestamp': '2025-12-02 09:46:16.164145', '_unique_id': 'b1a9f13a8a3f463ea07d5d1e17f3914d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.167 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43f00cc4-b792-45d3-ab0f-cce287b21020', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.167228', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf007244-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': 'a7285e08d7f8d17ed2481e6f044bfb2306297766ec45b7773e8382ffaff7e2d6'}]}, 'timestamp': '2025-12-02 09:46:16.167785', '_unique_id': 'b2fd0dc3bc9543539cfc6c619112ebb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.170 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.170 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c925dee-eb18-4702-bce6-a3b49a489439', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.170283', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf00e918-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.318343494, 'message_signature': '0cdd0e076a1105570114d15c4e5c921e228b2a9cd38645f582845a64f143db62'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.170283', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf00faac-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.318343494, 'message_signature': '0d72366fb81dc7cc991d3df80f9a15b26b951899cec612090be16affc4b2547e'}]}, 'timestamp': '2025-12-02 09:46:16.171177', '_unique_id': '8441cbe0da644678b51af211d85a018a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.173 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d125171-54f3-4f82-8065-b1d51b157568', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.173494', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf016a50-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': '3bfd6cae092e3d660b5e3613b6f58f22a021936ba9b778a3ae48b8b03dd83fad'}]}, 'timestamp': '2025-12-02 09:46:16.174109', '_unique_id': '17bf2180ca394d35b4e47023a22326c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.176 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61999be4-8de6-4ced-bcc6-59b83116ec7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.176477', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf01dc7e-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': 'ab1a35b1a8bd2f6dff692ac5b3a146630a7c5b2f606244e570e043551c859ec0'}]}, 'timestamp': '2025-12-02 09:46:16.176993', '_unique_id': 'd578022b512c43eb9374810e23c31c25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.179 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '694dee5c-c059-4744-bfd4-0be443043b65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.179259', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf0248e4-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': '0ccd689ae86b4384763ca0ff1c7120611779c953f47bc5853493e5f84e88e46b'}]}, 'timestamp': '2025-12-02 09:46:16.179800', '_unique_id': '7937268aad744c9a8cf01908c0f3ef03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.182 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af3783e6-9ff5-416a-8f42-639400209a5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.182338', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf02c300-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': '124e6914a0ca426a0fde4306cee6320f160a61d4ee1af644612f2cdff0f46da6'}]}, 'timestamp': '2025-12-02 09:46:16.182914', '_unique_id': 'fd3d18ea52374f09896e64354b81e40d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.185 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.185 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7277ee5a-f41e-4df7-8d0d-08e75bad0314', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.185193', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf032fb6-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': '1cb01a29bfe97ab941e249b885af30b3f80e114507337b1aa2a41166c9e3f323'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.185193', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf03447e-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': '6ae6b612c915a94c89b32f65e33dcead58c074793b7f7aaaa82cb884a1d900cb'}]}, 'timestamp': '2025-12-02 09:46:16.186196', '_unique_id': '2dcbd544b3fa4dc1977ad44b4ca50000'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ae40175-bb13-4eca-ae66-7fad19c6e3b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.188576', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf03b5a8-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': '9b22c1617d1f39de4fc76df8dcac7e80dd0c568de6c811d31a762d50b166c022'}]}, 'timestamp': '2025-12-02 09:46:16.189184', '_unique_id': 'a2eb66c25a464bc190fca6efc0867a6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.191 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.192 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.192 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b51a201-4e13-4962-b639-9be423dc7bb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.192013', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf043a32-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': '3bb3847b5457056d16f9270869004b87b50b5bce6d88bb826bac8b2b136bf7d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.192013', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf044c7a-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': 'f73aeaf5a6b390541ae3e6c659371d9ee21528699089749ba1cef060a501701d'}]}, 'timestamp': '2025-12-02 09:46:16.192949', '_unique_id': 'd97d60f31a83446cab66a79489de3161'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.195 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.195 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c25e9341-44a8-49c2-b6a3-687e8a6e9825', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.195352', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf04bca0-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.318343494, 'message_signature': '3833870e022007cf49fd1fcdb46cf08a87e485e813216079fceafb1a577f60b4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.195352', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf04cee8-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.318343494, 'message_signature': '093fa427caf6deec27a2c5513efa6db0f92378b2678423d4635c89dd299c614d'}]}, 'timestamp': '2025-12-02 09:46:16.196287', '_unique_id': '1c1fb258faa5466fa5409c6dc4989c6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.198 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80a1acff-5a02-479f-82ea-e8e72b9bfe5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.198690', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf053f54-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': 'f4276e0a7762fc091ef6a8f9bdcecbb49a7e1618d00f31b165309ceacba2a862'}]}, 'timestamp': '2025-12-02 09:46:16.199193', '_unique_id': '7b608551bec64c40be1df7d01bb866d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.201 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1c8cd0e-2bb8-4dad-b754-93038f11f825', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.201451', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf05acbe-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': '88e1d7c1c66663385d21967e1702e56d146e98eb0de58adbd2961777151f4354'}]}, 'timestamp': '2025-12-02 09:46:16.202031', '_unique_id': '4a7fb775c75c458db64a099498633fd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.204 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.205 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.205 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.205 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.227 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.228 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.229 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.229 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.231 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 57840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f301483e-db00-4f9c-88fe-199e3013336d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57840000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:46:16.205881', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bf0a564c-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.45076119, 'message_signature': '76c3c117b3b0961f249bc40e56abb4de2ceead051972a23042c0a7c101acb1fe'}]}, 'timestamp': '2025-12-02 09:46:16.232573', '_unique_id': '7ced1302c25341c7a665520ff1a3e9e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.235 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 9229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c204161-71fc-409c-8bd7-f1dc5f912501', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9229, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:46:16.235221', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'bf0acfc8-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.378196052, 'message_signature': 'ac2bd590559422053dc293cc4d3f4ac2ca438bd3fbdff417929b410398137d65'}]}, 'timestamp': '2025-12-02 09:46:16.235579', '_unique_id': '7ff483cab4c84f0bbea3d04581db8a1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.237 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 286697561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.238 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 39228582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd18122a2-217e-41fd-9ef4-92a007e6bedd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 286697561, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.237587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf0b2cca-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': '1305c263a2656256622c364f7edd4876f903f95aeeeea2f08d6cbf1ecb33093f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39228582, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.237587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf0b3b34-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': 'eb347ca8174340243e17699f0f5ddf30cd2e14822975de9bc4820b31d72f20be'}]}, 'timestamp': '2025-12-02 09:46:16.238306', '_unique_id': 'a3a788cabf0a4f7e881f5f1bcc0bd54b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.240 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.240 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad20faac-7c9e-463e-95e4-c9aa3f344f95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.240215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf0b920a-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': '8cac713482d996f439efff7e6c7d45f7d6fc0df7a54c81e75e2e75224856e8f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.240215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf0b9dfe-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': '060159a64c9bb125e04afa3cfd9906129a48bef582c5c5c280876da640edd675'}]}, 'timestamp': '2025-12-02 09:46:16.240832', '_unique_id': '58e38f5131ed4774944815a654201c99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.241 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.242 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.242 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4aadf9ab-317d-4004-910a-f104a78bd646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:46:16.242351', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'bf0be584-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.45076119, 'message_signature': 'dfa86aa0121ca26ad45045baba5bac654a9f07c37e3f09599c2c84cbd8a5dead'}]}, 'timestamp': '2025-12-02 09:46:16.242704', '_unique_id': 'd1857f461de74254b001a2951a4def49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.243 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.244 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.244 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ff4aacd-067b-4c10-b341-bb5d83bd96c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:46:16.244130', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf0c2abc-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': '9c1223cebe164f3e230f99f7d641b9634bcc535c5940b30ef176572e1b2ec9d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:46:16.244130', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf0c35b6-cf63-11f0-a0da-fa163e3f40cc', 'monotonic_time': 10938.337128519, 'message_signature': 'e0480ce6e86e246bde5418b5873bca78d2524ea6d6404b84be66135c10fd2e73'}]}, 'timestamp': '2025-12-02 09:46:16.244737', '_unique_id': 'd3e593f656844adab751edaacd692d8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:46:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:46:16.245 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:46:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.453 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:16 np0005541913.localdomain podman[271574]: 2025-12-02 09:46:16.458714487 +0000 UTC m=+0.089845334 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 02 09:46:16 np0005541913.localdomain podman[271574]: 2025-12-02 09:46:16.491114748 +0000 UTC m=+0.122245585 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:46:16 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.608 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.623 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.623 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.624 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:16.624 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:17.619 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:17.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:17 np0005541913.localdomain sudo[271569]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:18 np0005541913.localdomain python3.9[271699]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 09:46:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:18.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:46:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36164 DF PROTO=TCP SPT=37952 DPT=9102 SEQ=513994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A43FE40000000001030307) 
Dec 02 09:46:19 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:19.303 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:19 np0005541913.localdomain sudo[271811]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uknxpwohrjuazoqqtskncumuflqcyfnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668779.2121089-1669-154360257117722/AnsiballZ_file.py
Dec 02 09:46:19 np0005541913.localdomain sudo[271811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:19 np0005541913.localdomain python3.9[271813]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:19 np0005541913.localdomain sudo[271811]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:20 np0005541913.localdomain sudo[271921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asodypmlqqifbdserajnjknqptwckack ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668780.2033694-1702-139951632984255/AnsiballZ_systemd_service.py
Dec 02 09:46:20 np0005541913.localdomain sudo[271921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:20 np0005541913.localdomain python3.9[271923]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:46:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:46:20 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:46:20 np0005541913.localdomain systemd-sysv-generator[271964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:46:20 np0005541913.localdomain systemd-rc-local-generator[271960]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:46:20 np0005541913.localdomain podman[271925]: 2025-12-02 09:46:20.956268805 +0000 UTC m=+0.101053315 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm)
Dec 02 09:46:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:46:20 np0005541913.localdomain podman[271925]: 2025-12-02 09:46:20.999102036 +0000 UTC m=+0.143886526 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.buildah.version=1.33.7)
Dec 02 09:46:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:21 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:46:21 np0005541913.localdomain sudo[271921]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:21 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:21.456 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:21 np0005541913.localdomain python3.9[272086]: ansible-ansible.builtin.service_facts Invoked
Dec 02 09:46:21 np0005541913.localdomain network[272103]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 09:46:21 np0005541913.localdomain network[272104]: 'network-scripts' will be removed from distribution in near future.
Dec 02 09:46:21 np0005541913.localdomain network[272105]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 09:46:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:46:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:46:24 np0005541913.localdomain podman[272143]: 2025-12-02 09:46:24.079517042 +0000 UTC m=+0.080068162 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:46:24 np0005541913.localdomain podman[272143]: 2025-12-02 09:46:24.116025973 +0000 UTC m=+0.116577093 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:46:24 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:46:24 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:24.336 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:26 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:26.461 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:27 np0005541913.localdomain sudo[272360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzxzhobyufgcuxrkalarldgbbjbwysek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668787.1646738-1759-90905796296718/AnsiballZ_systemd_service.py
Dec 02 09:46:27 np0005541913.localdomain sudo[272360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:27 np0005541913.localdomain python3.9[272362]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:28 np0005541913.localdomain sudo[272360]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:29 np0005541913.localdomain sudo[272471]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryngvrgzwmlkjzctidtsdjslarsviupq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668788.9389079-1759-55176898168888/AnsiballZ_systemd_service.py
Dec 02 09:46:29 np0005541913.localdomain sudo[272471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:29 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:29.373 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:29 np0005541913.localdomain python3.9[272473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:29 np0005541913.localdomain sudo[272471]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:29 np0005541913.localdomain sudo[272582]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdfvnjswciuhxbmpwllzifskfykctwmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668789.7250743-1759-281023818677015/AnsiballZ_systemd_service.py
Dec 02 09:46:29 np0005541913.localdomain sudo[272582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:30 np0005541913.localdomain python3.9[272584]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:30 np0005541913.localdomain sudo[272582]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:30 np0005541913.localdomain sudo[272693]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnpyyqusvfamontescfegstzrawcpukx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668790.4717705-1759-166795631165877/AnsiballZ_systemd_service.py
Dec 02 09:46:30 np0005541913.localdomain sudo[272693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:46:30 np0005541913.localdomain podman[272695]: 2025-12-02 09:46:30.882218863 +0000 UTC m=+0.093722018 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 02 09:46:30 np0005541913.localdomain podman[272695]: 2025-12-02 09:46:30.89513317 +0000 UTC m=+0.106636385 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:46:30 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:46:31 np0005541913.localdomain python3.9[272696]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:31 np0005541913.localdomain sudo[272693]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:31 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:31.465 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:31 np0005541913.localdomain sudo[272824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpeojztbkaihzedenogsohvquztagcvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668791.2302666-1759-131252619089640/AnsiballZ_systemd_service.py
Dec 02 09:46:31 np0005541913.localdomain sudo[272824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:31 np0005541913.localdomain python3.9[272826]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:31 np0005541913.localdomain sudo[272824]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:32 np0005541913.localdomain sudo[272935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duyqrajihudiubkpeejtukredoujjuht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668791.9555748-1759-275230088568859/AnsiballZ_systemd_service.py
Dec 02 09:46:32 np0005541913.localdomain sudo[272935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:32 np0005541913.localdomain python3.9[272937]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:32 np0005541913.localdomain sudo[272935]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:32 np0005541913.localdomain sudo[273046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqwsjrwkmagrtldewvdsgltsdkzuwkmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668792.7186627-1759-112653731301328/AnsiballZ_systemd_service.py
Dec 02 09:46:32 np0005541913.localdomain sudo[273046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:33 np0005541913.localdomain python3.9[273048]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:33 np0005541913.localdomain sudo[273046]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:33 np0005541913.localdomain sudo[273157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejkaorzalaabyfwxlprbsbduarytiljs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668793.466937-1759-259344838465056/AnsiballZ_systemd_service.py
Dec 02 09:46:33 np0005541913.localdomain sudo[273157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:46:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:46:33 np0005541913.localdomain podman[273161]: 2025-12-02 09:46:33.8280887 +0000 UTC m=+0.083065981 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 09:46:33 np0005541913.localdomain podman[273160]: 2025-12-02 09:46:33.881704293 +0000 UTC m=+0.138707248 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:46:33 np0005541913.localdomain podman[273160]: 2025-12-02 09:46:33.888938916 +0000 UTC m=+0.145941851 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:46:33 np0005541913.localdomain podman[273161]: 2025-12-02 09:46:33.897062983 +0000 UTC m=+0.152040304 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 09:46:33 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:46:33 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:46:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46422 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2936571885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A479CE0000000001030307) 
Dec 02 09:46:34 np0005541913.localdomain python3.9[273159]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:46:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:46:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:46:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:46:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:46:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:46:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:46:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:46:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:46:34 np0005541913.localdomain sudo[273157]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:34 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:34.418 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46423 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2936571885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A47DE40000000001030307) 
Dec 02 09:46:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36165 DF PROTO=TCP SPT=37952 DPT=9102 SEQ=513994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A47FE40000000001030307) 
Dec 02 09:46:35 np0005541913.localdomain sudo[273316]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsxrpgzstczhpodmvrkymygaceqqstbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668795.340058-1936-265101319487315/AnsiballZ_file.py
Dec 02 09:46:35 np0005541913.localdomain sudo[273316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:35 np0005541913.localdomain python3.9[273318]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:35 np0005541913.localdomain sudo[273316]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:46:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:46:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:46:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:46:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:46:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17224 "" "Go-http-client/1.1"
Dec 02 09:46:36 np0005541913.localdomain sudo[273426]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrxkuotjqjxjevnqxjnzgqxxjuttzdub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668795.8892972-1936-264624424580362/AnsiballZ_file.py
Dec 02 09:46:36 np0005541913.localdomain sudo[273426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:36 np0005541913.localdomain python3.9[273428]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:36 np0005541913.localdomain sudo[273426]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:36 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:36.468 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:36 np0005541913.localdomain sudo[273536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvirmpibmqfkxklveicriwkxxslrefqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668796.497971-1936-156592021636418/AnsiballZ_file.py
Dec 02 09:46:36 np0005541913.localdomain sudo[273536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:36 np0005541913.localdomain python3.9[273538]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:36 np0005541913.localdomain sudo[273536]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46424 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2936571885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A485E40000000001030307) 
Dec 02 09:46:37 np0005541913.localdomain sudo[273646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjanjakrkkeqomcbpxqxlokysmibppkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668797.1035986-1936-110876888940923/AnsiballZ_file.py
Dec 02 09:46:37 np0005541913.localdomain sudo[273646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:37 np0005541913.localdomain python3.9[273648]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:37 np0005541913.localdomain sudo[273646]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:37 np0005541913.localdomain sudo[273756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaszppfnpigiwuciqxrtskalbmpcgiil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668797.639647-1936-67800382581920/AnsiballZ_file.py
Dec 02 09:46:37 np0005541913.localdomain sudo[273756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:38 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31405 DF PROTO=TCP SPT=33290 DPT=9102 SEQ=3631968402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A489E40000000001030307) 
Dec 02 09:46:38 np0005541913.localdomain python3.9[273758]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:38 np0005541913.localdomain sudo[273756]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:38 np0005541913.localdomain sudo[273866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efwzayhpciepomdcggtcejlnjgumvyjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668798.2220092-1936-198079878095921/AnsiballZ_file.py
Dec 02 09:46:38 np0005541913.localdomain sudo[273866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:38 np0005541913.localdomain python3.9[273868]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:38 np0005541913.localdomain sudo[273866]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:39 np0005541913.localdomain sudo[273976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmnqixcdjcsunfhwafyaymdkxkvpizyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668798.7976108-1936-87513469305688/AnsiballZ_file.py
Dec 02 09:46:39 np0005541913.localdomain sudo[273976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:39 np0005541913.localdomain python3.9[273978]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:39 np0005541913.localdomain sudo[273976]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:39 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:39.470 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:39 np0005541913.localdomain sudo[274086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glelexfhimejkfeswlqnqfyhxoapgxqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668799.3712285-1936-225917446229147/AnsiballZ_file.py
Dec 02 09:46:39 np0005541913.localdomain sudo[274086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:39 np0005541913.localdomain python3.9[274088]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:39 np0005541913.localdomain sudo[274086]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:40 np0005541913.localdomain sudo[274196]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccfnszqxecvdlnwvcbzknufkfllodcnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668800.0762727-2107-109028887487653/AnsiballZ_file.py
Dec 02 09:46:40 np0005541913.localdomain sudo[274196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:40 np0005541913.localdomain python3.9[274198]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:40 np0005541913.localdomain sudo[274196]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:40 np0005541913.localdomain sudo[274306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndbawyfjcnbzbtwwfkkrnrqfxsllknnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668800.674531-2107-93953174730382/AnsiballZ_file.py
Dec 02 09:46:40 np0005541913.localdomain sudo[274306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46425 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2936571885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A495A40000000001030307) 
Dec 02 09:46:41 np0005541913.localdomain python3.9[274308]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:41 np0005541913.localdomain sudo[274306]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:41 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:41.471 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:41 np0005541913.localdomain sudo[274416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drabktzaddntndbvpbyrwyzmhgvgerro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668801.2562726-2107-221125781972021/AnsiballZ_file.py
Dec 02 09:46:41 np0005541913.localdomain sudo[274416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:41 np0005541913.localdomain python3.9[274418]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:41 np0005541913.localdomain sudo[274416]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:42 np0005541913.localdomain sudo[274526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eecvnuywkhqvgshezcttcvebdhrjuwky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668801.8356574-2107-237681742695932/AnsiballZ_file.py
Dec 02 09:46:42 np0005541913.localdomain sudo[274526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:42 np0005541913.localdomain python3.9[274528]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:42 np0005541913.localdomain sudo[274526]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:42 np0005541913.localdomain sudo[274636]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyxjocqxlofprpwifimjbcghkizolrje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668802.3925953-2107-155629574029259/AnsiballZ_file.py
Dec 02 09:46:42 np0005541913.localdomain sudo[274636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:42 np0005541913.localdomain python3.9[274638]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:42 np0005541913.localdomain sudo[274636]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:43 np0005541913.localdomain sudo[274746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olpkucdciwxzxuwjitxxhzdexkteoijh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668802.955515-2107-190781052137430/AnsiballZ_file.py
Dec 02 09:46:43 np0005541913.localdomain sudo[274746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:43 np0005541913.localdomain python3.9[274748]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:43 np0005541913.localdomain sudo[274746]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:43 np0005541913.localdomain sudo[274856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ephkajkmcrvukalcwnpmddyzuffcvzbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668803.4659503-2107-132180199963342/AnsiballZ_file.py
Dec 02 09:46:43 np0005541913.localdomain sudo[274856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:46:43 np0005541913.localdomain python3.9[274858]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:43 np0005541913.localdomain sudo[274856]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:43 np0005541913.localdomain podman[274859]: 2025-12-02 09:46:43.957382657 +0000 UTC m=+0.228419576 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 09:46:44 np0005541913.localdomain podman[274859]: 2025-12-02 09:46:44.040082846 +0000 UTC m=+0.311119775 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:46:44 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:46:44 np0005541913.localdomain sudo[274985]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqbtzeoctazkargpamewexllxzajvipe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668804.0490382-2107-138712841441535/AnsiballZ_file.py
Dec 02 09:46:44 np0005541913.localdomain sudo[274985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:44 np0005541913.localdomain python3.9[274987]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:46:44 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:44.509 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:44 np0005541913.localdomain sudo[274985]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:45 np0005541913.localdomain sudo[275095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uupdsttfpwvlhuayxoddtdexyyxahzuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668804.9325645-2281-207128173940768/AnsiballZ_command.py
Dec 02 09:46:45 np0005541913.localdomain sudo[275095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:45 np0005541913.localdomain python3.9[275097]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:45 np0005541913.localdomain sudo[275095]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:46 np0005541913.localdomain python3.9[275207]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 09:46:46 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:46.476 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:46 np0005541913.localdomain sudo[275315]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxhzqzeymyyqzhibvtufldxtkfiuotvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668806.5851648-2335-249471742337885/AnsiballZ_systemd_service.py
Dec 02 09:46:46 np0005541913.localdomain sudo[275315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:46:46 np0005541913.localdomain podman[275318]: 2025-12-02 09:46:46.976914658 +0000 UTC m=+0.089211884 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 09:46:46 np0005541913.localdomain podman[275318]: 2025-12-02 09:46:46.982773176 +0000 UTC m=+0.095070382 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:46:46 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:46:47 np0005541913.localdomain python3.9[275317]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:46:47 np0005541913.localdomain systemd-rc-local-generator[275361]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:46:47 np0005541913.localdomain systemd-sysv-generator[275364]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:47 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:46:47 np0005541913.localdomain sudo[275315]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:47 np0005541913.localdomain sudo[275479]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srebfhpoixoikipkdvvwyvnrvolulkjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668807.752101-2359-225371957500574/AnsiballZ_command.py
Dec 02 09:46:47 np0005541913.localdomain sudo[275479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:48 np0005541913.localdomain python3.9[275481]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:48 np0005541913.localdomain sudo[275479]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:48 np0005541913.localdomain sudo[275590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lytfkccdtionkakhhsaybczfidqajrfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668808.309031-2359-142085406691736/AnsiballZ_command.py
Dec 02 09:46:48 np0005541913.localdomain sudo[275590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:48 np0005541913.localdomain python3.9[275592]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:48 np0005541913.localdomain sudo[275590]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:49 np0005541913.localdomain sudo[275701]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awufdfztpwwgqhonrocxdtxvxwarqxps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668808.926374-2359-146720618754671/AnsiballZ_command.py
Dec 02 09:46:49 np0005541913.localdomain sudo[275701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46426 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2936571885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A4B5E50000000001030307) 
Dec 02 09:46:49 np0005541913.localdomain python3.9[275703]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:49 np0005541913.localdomain sudo[275701]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:49 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:49.536 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:49 np0005541913.localdomain sudo[275812]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhtgxngkbwzdyvojvamqwjhebdjryuhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668809.5098512-2359-138637013024029/AnsiballZ_command.py
Dec 02 09:46:49 np0005541913.localdomain sudo[275812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:49 np0005541913.localdomain python3.9[275814]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:50 np0005541913.localdomain sudo[275812]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:50 np0005541913.localdomain sudo[275923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iosvkewmnwmeshtmzfppzjecdoqbmllu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668810.10716-2359-143515702608547/AnsiballZ_command.py
Dec 02 09:46:50 np0005541913.localdomain sudo[275923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:50 np0005541913.localdomain python3.9[275925]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:50 np0005541913.localdomain sudo[275923]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:50 np0005541913.localdomain sudo[276034]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swdstxnxvrsozcaesmckodesbqdjhark ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668810.7219417-2359-160467600100805/AnsiballZ_command.py
Dec 02 09:46:50 np0005541913.localdomain sudo[276034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:51 np0005541913.localdomain python3.9[276036]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:51 np0005541913.localdomain sudo[276034]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:46:51 np0005541913.localdomain podman[276109]: 2025-12-02 09:46:51.428708938 +0000 UTC m=+0.070518266 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, release=1755695350)
Dec 02 09:46:51 np0005541913.localdomain podman[276109]: 2025-12-02 09:46:51.440924195 +0000 UTC m=+0.082733483 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc.)
Dec 02 09:46:51 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:46:51 np0005541913.localdomain sudo[276164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikttwnrtuevbntdqlgigpsjlzwqgxgth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668811.2509997-2359-116720659879503/AnsiballZ_command.py
Dec 02 09:46:51 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:51.477 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:51 np0005541913.localdomain sudo[276164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:51 np0005541913.localdomain python3.9[276166]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:52 np0005541913.localdomain sudo[276164]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:53 np0005541913.localdomain sudo[276275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viiebzqtmfhzwjzpwldognkvysghsres ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668812.795629-2359-211856987405353/AnsiballZ_command.py
Dec 02 09:46:53 np0005541913.localdomain sudo[276275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:53 np0005541913.localdomain python3.9[276277]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:46:53 np0005541913.localdomain sudo[276275]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:46:54 np0005541913.localdomain systemd[1]: tmp-crun.waDUHa.mount: Deactivated successfully.
Dec 02 09:46:54 np0005541913.localdomain podman[276296]: 2025-12-02 09:46:54.433447536 +0000 UTC m=+0.073166046 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:46:54 np0005541913.localdomain podman[276296]: 2025-12-02 09:46:54.44667904 +0000 UTC m=+0.086397570 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:46:54 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:46:54 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:54.593 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:56 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:56.481 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:57 np0005541913.localdomain sudo[276409]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daiincavuulkpouuhtrmzxtpghipzrmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668816.8788784-2566-114950445349901/AnsiballZ_file.py
Dec 02 09:46:57 np0005541913.localdomain sudo[276409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:57 np0005541913.localdomain python3.9[276411]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:46:57 np0005541913.localdomain sudo[276409]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:57 np0005541913.localdomain sudo[276519]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzjbsufzqfmjdovtuyghpmwpzeeftkcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668817.3651688-2566-170041965372911/AnsiballZ_file.py
Dec 02 09:46:57 np0005541913.localdomain sudo[276519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:57 np0005541913.localdomain sudo[276522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:46:57 np0005541913.localdomain sudo[276522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:46:57 np0005541913.localdomain sudo[276522]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:57 np0005541913.localdomain python3.9[276521]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:46:57 np0005541913.localdomain sudo[276519]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:57 np0005541913.localdomain sudo[276540]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:46:57 np0005541913.localdomain sudo[276540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:46:58 np0005541913.localdomain sudo[276681]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lztoxbpbvgwlxvhpynephihfdzztguyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668817.964239-2566-57785826364362/AnsiballZ_file.py
Dec 02 09:46:58 np0005541913.localdomain sudo[276681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:58 np0005541913.localdomain python3.9[276683]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:46:58 np0005541913.localdomain sudo[276681]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:58 np0005541913.localdomain sudo[276540]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:58 np0005541913.localdomain sudo[276734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:46:58 np0005541913.localdomain sudo[276734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:46:58 np0005541913.localdomain sudo[276734]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:58 np0005541913.localdomain sudo[276777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 09:46:58 np0005541913.localdomain sudo[276777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:46:59 np0005541913.localdomain sudo[276844]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkexltfqwyrasywdvcrpsbcpcpxfpthf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668818.7613823-2632-218917650339487/AnsiballZ_file.py
Dec 02 09:46:59 np0005541913.localdomain sudo[276844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:59 np0005541913.localdomain python3.9[276846]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:46:59 np0005541913.localdomain sudo[276844]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:59 np0005541913.localdomain sudo[276777]: pam_unix(sudo:session): session closed for user root
Dec 02 09:46:59 np0005541913.localdomain sudo[276974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmglnfwukfogcwhgjswqdwacfsbszxfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668819.337232-2632-53640144687098/AnsiballZ_file.py
Dec 02 09:46:59 np0005541913.localdomain sudo[276974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:46:59 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:46:59.630 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:46:59 np0005541913.localdomain python3.9[276976]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:46:59 np0005541913.localdomain sudo[276974]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:00 np0005541913.localdomain sudo[277084]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghrpzuttqetdtlknfrhfjjinffosohed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668819.9224482-2632-191037049717219/AnsiballZ_file.py
Dec 02 09:47:00 np0005541913.localdomain sudo[277084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:00 np0005541913.localdomain python3.9[277086]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:00 np0005541913.localdomain sudo[277084]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:00 np0005541913.localdomain sudo[277194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovcebymoooumnzzkxhhkedtjfbfzfdvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668820.4504926-2632-239892957733538/AnsiballZ_file.py
Dec 02 09:47:00 np0005541913.localdomain sudo[277194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:00 np0005541913.localdomain python3.9[277196]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:00 np0005541913.localdomain sudo[277194]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:01 np0005541913.localdomain sudo[277304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpnfudgjbqjhrlmeksodaxttoefaapyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668821.0614657-2632-155850746618716/AnsiballZ_file.py
Dec 02 09:47:01 np0005541913.localdomain sudo[277304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:47:01 np0005541913.localdomain systemd[1]: tmp-crun.Gj3kKJ.mount: Deactivated successfully.
Dec 02 09:47:01 np0005541913.localdomain podman[277307]: 2025-12-02 09:47:01.43369143 +0000 UTC m=+0.091433075 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 09:47:01 np0005541913.localdomain podman[277307]: 2025-12-02 09:47:01.443182063 +0000 UTC m=+0.100923708 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 09:47:01 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:47:01 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:01.483 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:01 np0005541913.localdomain python3.9[277306]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:01 np0005541913.localdomain sudo[277304]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:02 np0005541913.localdomain sudo[277432]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rovgmfltwiazzqqdqpcokupgmhvamccw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668821.8765795-2632-124787078572860/AnsiballZ_file.py
Dec 02 09:47:02 np0005541913.localdomain sudo[277432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:02 np0005541913.localdomain sudo[277435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:47:02 np0005541913.localdomain sudo[277435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:47:02 np0005541913.localdomain sudo[277435]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:02 np0005541913.localdomain python3.9[277434]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:02 np0005541913.localdomain sudo[277432]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:02 np0005541913.localdomain sudo[277560]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdkzifymfjxcecrcbjodfqvzixlaqljv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668822.4860957-2632-217392789628222/AnsiballZ_file.py
Dec 02 09:47:02 np0005541913.localdomain sudo[277560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:02 np0005541913.localdomain python3.9[277562]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:02 np0005541913.localdomain sudo[277560]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:47:03.031 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:47:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:47:03.031 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:47:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:47:03.033 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:47:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37563 DF PROTO=TCP SPT=42372 DPT=9102 SEQ=1588680873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A4EEFD0000000001030307) 
Dec 02 09:47:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:47:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:47:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:47:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:47:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:47:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:47:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:47:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:47:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:47:04 np0005541913.localdomain podman[277580]: 2025-12-02 09:47:04.441485271 +0000 UTC m=+0.081286973 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:47:04 np0005541913.localdomain podman[277580]: 2025-12-02 09:47:04.448038756 +0000 UTC m=+0.087840488 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:47:04 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:47:04 np0005541913.localdomain systemd[1]: tmp-crun.bLoULd.mount: Deactivated successfully.
Dec 02 09:47:04 np0005541913.localdomain podman[277581]: 2025-12-02 09:47:04.497826386 +0000 UTC m=+0.137837934 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:47:04 np0005541913.localdomain podman[277581]: 2025-12-02 09:47:04.577962747 +0000 UTC m=+0.217974325 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:47:04 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:47:04 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:04.633 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37564 DF PROTO=TCP SPT=42372 DPT=9102 SEQ=1588680873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A4F3240000000001030307) 
Dec 02 09:47:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46427 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2936571885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A4F5E50000000001030307) 
Dec 02 09:47:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:47:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:47:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:47:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:47:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:47:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17219 "" "Go-http-client/1.1"
Dec 02 09:47:06 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:06.489 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37565 DF PROTO=TCP SPT=42372 DPT=9102 SEQ=1588680873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A4FB240000000001030307) 
Dec 02 09:47:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36166 DF PROTO=TCP SPT=37952 DPT=9102 SEQ=513994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A4FDE40000000001030307) 
Dec 02 09:47:08 np0005541913.localdomain sudo[277719]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrgwjcnubstipepsxocaaafscjayudtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668828.1911752-2957-138897291517895/AnsiballZ_getent.py
Dec 02 09:47:08 np0005541913.localdomain sudo[277719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:08 np0005541913.localdomain python3.9[277721]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 02 09:47:08 np0005541913.localdomain sudo[277719]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:09 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:09.680 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:09 np0005541913.localdomain sshd[277740]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:47:09 np0005541913.localdomain sshd[277740]: Accepted publickey for zuul from 192.168.122.30 port 54824 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:47:09 np0005541913.localdomain systemd-logind[757]: New session 61 of user zuul.
Dec 02 09:47:09 np0005541913.localdomain systemd[1]: Started Session 61 of User zuul.
Dec 02 09:47:10 np0005541913.localdomain sshd[277740]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:47:10 np0005541913.localdomain sshd[277743]: Received disconnect from 192.168.122.30 port 54824:11: disconnected by user
Dec 02 09:47:10 np0005541913.localdomain sshd[277743]: Disconnected from user zuul 192.168.122.30 port 54824
Dec 02 09:47:10 np0005541913.localdomain sshd[277740]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:47:10 np0005541913.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Dec 02 09:47:10 np0005541913.localdomain systemd-logind[757]: Session 61 logged out. Waiting for processes to exit.
Dec 02 09:47:10 np0005541913.localdomain systemd-logind[757]: Removed session 61.
Dec 02 09:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:47:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 4776 writes, 21K keys, 4776 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4776 writes, 569 syncs, 8.39 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:47:10 np0005541913.localdomain python3.9[277851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:47:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37566 DF PROTO=TCP SPT=42372 DPT=9102 SEQ=1588680873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A50AE40000000001030307) 
Dec 02 09:47:11 np0005541913.localdomain python3.9[277937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668830.3757043-3038-59879810286393/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:11 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:11.490 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:12 np0005541913.localdomain python3.9[278045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:47:12 np0005541913.localdomain python3.9[278100]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:12.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:12.744 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:47:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:12.744 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:47:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:12.744 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:47:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:12.744 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:47:12 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:12.745 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:47:13 np0005541913.localdomain python3.9[278209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.185 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.240 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.240 230641 DEBUG nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.454 230641 WARNING nova.virt.libvirt.driver [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.456 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12130MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.456 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.456 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:47:13 np0005541913.localdomain python3.9[278316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668832.5736573-3038-229553464159819/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.555 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.556 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.556 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:47:13 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:13.600 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:47:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:14.059 230641 DEBUG oslo_concurrency.processutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:47:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:14.065 230641 DEBUG nova.compute.provider_tree [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:47:14 np0005541913.localdomain python3.9[278444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:47:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:47:14 np0005541913.localdomain systemd[1]: tmp-crun.I1W1G7.mount: Deactivated successfully.
Dec 02 09:47:14 np0005541913.localdomain podman[278533]: 2025-12-02 09:47:14.466777966 +0000 UTC m=+0.099493700 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:47:14 np0005541913.localdomain podman[278533]: 2025-12-02 09:47:14.478003866 +0000 UTC m=+0.110719650 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:47:14 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:47:14 np0005541913.localdomain python3.9[278532]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668833.6509578-3038-46550452600796/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=be0176be25a535cff695cce5406adb3d3b53bef4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:14.624 230641 DEBUG nova.scheduler.client.report [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:47:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:14.626 230641 DEBUG nova.compute.resource_tracker [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:47:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:14.626 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:47:14 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:14.720 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:47:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.2 total, 600.0 interval
                                                          Cumulative writes: 5722 writes, 25K keys, 5722 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5722 writes, 780 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:47:15 np0005541913.localdomain python3.9[278659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:47:15 np0005541913.localdomain python3.9[278745]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668834.6595895-3038-101659954237068/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:15.626 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:15.627 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:15.627 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:47:15 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:15.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:16 np0005541913.localdomain python3.9[278853]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:47:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:16.493 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:16.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:16.722 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:47:16 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:16.723 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:47:16 np0005541913.localdomain python3.9[278939]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764668835.7094193-3038-89830932834428/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:17.324 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:47:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:17.325 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:47:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:17.325 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:47:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:17.326 230641 DEBUG nova.objects.instance [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:47:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:47:17 np0005541913.localdomain podman[278962]: 2025-12-02 09:47:17.476264342 +0000 UTC m=+0.091880376 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 09:47:17 np0005541913.localdomain podman[278962]: 2025-12-02 09:47:17.48331318 +0000 UTC m=+0.098929184 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 02 09:47:17 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:47:17 np0005541913.localdomain sudo[279065]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agthfeyyxneamzfmxxsmfbychhqgkzhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668837.3939922-3287-54175333077069/AnsiballZ_file.py
Dec 02 09:47:17 np0005541913.localdomain sudo[279065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:17 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:17.801 230641 DEBUG nova.network.neutron [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:47:17 np0005541913.localdomain python3.9[279067]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:47:17 np0005541913.localdomain sudo[279065]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:18.027 230641 DEBUG oslo_concurrency.lockutils [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:47:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:18.027 230641 DEBUG nova.compute.manager [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:47:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:18.028 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:18.029 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:18 np0005541913.localdomain sudo[279175]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfaammmxdbwmjjikqxivxcgmanvyxych ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668838.0758123-3311-87262902392616/AnsiballZ_copy.py
Dec 02 09:47:18 np0005541913.localdomain sudo[279175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:18 np0005541913.localdomain python3.9[279177]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:47:18 np0005541913.localdomain sudo[279175]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:18.721 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:18 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:18.722 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:19 np0005541913.localdomain sudo[279285]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsjsppcaguvuchpulumvkfuvqssavnls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668838.776077-3335-87375882887206/AnsiballZ_stat.py
Dec 02 09:47:19 np0005541913.localdomain sudo[279285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:19 np0005541913.localdomain python3.9[279287]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:47:19 np0005541913.localdomain sudo[279285]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37567 DF PROTO=TCP SPT=42372 DPT=9102 SEQ=1588680873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A52BE40000000001030307) 
Dec 02 09:47:19 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:19.717 230641 DEBUG oslo_service.periodic_task [None req-b1c85115-e3ce-415a-91e1-26304922cec6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:47:19 np0005541913.localdomain sudo[279397]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mulvcyayojdskcbioyeycjhhzlhgcgwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668839.5035598-3362-73177405998728/AnsiballZ_file.py
Dec 02 09:47:19 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:19.771 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:19 np0005541913.localdomain sudo[279397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:19 np0005541913.localdomain python3.9[279399]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:47:20 np0005541913.localdomain sudo[279397]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:20 np0005541913.localdomain python3.9[279507]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:47:21 np0005541913.localdomain python3.9[279617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:47:21 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:21.496 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:21 np0005541913.localdomain python3.9[279672]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:47:22 np0005541913.localdomain podman[279781]: 2025-12-02 09:47:22.444959356 +0000 UTC m=+0.085079245 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Dec 02 09:47:22 np0005541913.localdomain podman[279781]: 2025-12-02 09:47:22.464105498 +0000 UTC m=+0.104225357 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:47:22 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:47:22 np0005541913.localdomain python3.9[279780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 09:47:22 np0005541913.localdomain python3.9[279855]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 09:47:23 np0005541913.localdomain sudo[279963]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxptkkmawhqigkovjyngorqsinfxxuxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668843.3919885-3491-9763856437661/AnsiballZ_container_config_data.py
Dec 02 09:47:23 np0005541913.localdomain sudo[279963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:23 np0005541913.localdomain python3.9[279965]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 02 09:47:23 np0005541913.localdomain sudo[279963]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:24 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:24.811 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:24 np0005541913.localdomain sudo[280073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epbcpdndpfdlnmdovhbpzillwjmkvebx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668844.560017-3518-56511995854841/AnsiballZ_container_config_hash.py
Dec 02 09:47:24 np0005541913.localdomain sudo[280073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:47:24 np0005541913.localdomain systemd[1]: tmp-crun.kpveXJ.mount: Deactivated successfully.
Dec 02 09:47:24 np0005541913.localdomain podman[280076]: 2025-12-02 09:47:24.95895259 +0000 UTC m=+0.103157738 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:47:24 np0005541913.localdomain podman[280076]: 2025-12-02 09:47:24.970298873 +0000 UTC m=+0.114503951 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:47:24 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:47:25 np0005541913.localdomain python3.9[280075]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:47:25 np0005541913.localdomain sudo[280073]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:25 np0005541913.localdomain sudo[280206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxmdbqobsfqrshxmqcacrtmuubsbkwhu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668845.5204058-3548-226556314780854/AnsiballZ_edpm_container_manage.py
Dec 02 09:47:25 np0005541913.localdomain sudo[280206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:26 np0005541913.localdomain python3[280208]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:47:26 np0005541913.localdomain python3[280208]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 02 09:47:26 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:26.499 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:26 np0005541913.localdomain sudo[280206]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:27 np0005541913.localdomain sudo[280378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqylkarlidxvoginljwbbrbbtqzwfguu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668846.7246985-3572-148172241128634/AnsiballZ_stat.py
Dec 02 09:47:27 np0005541913.localdomain sudo[280378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:27 np0005541913.localdomain python3.9[280380]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:47:27 np0005541913.localdomain sudo[280378]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:28 np0005541913.localdomain sudo[280490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhauvwahcayykvbidpaoszbfpsczxnds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668848.067012-3608-147957531823746/AnsiballZ_container_config_data.py
Dec 02 09:47:28 np0005541913.localdomain sudo[280490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:28 np0005541913.localdomain python3.9[280492]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 02 09:47:28 np0005541913.localdomain sudo[280490]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:29 np0005541913.localdomain sudo[280600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlcvtqbtglgmbcjcndxdevtmfdnorpia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668848.8581164-3635-175038266276255/AnsiballZ_container_config_hash.py
Dec 02 09:47:29 np0005541913.localdomain sudo[280600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:29 np0005541913.localdomain python3.9[280602]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 09:47:29 np0005541913.localdomain sudo[280600]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:29 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:29.857 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:30 np0005541913.localdomain sudo[280710]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwdtevbjushfupdewhpbiwunsifotlpo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764668849.7221034-3665-248483894319932/AnsiballZ_edpm_container_manage.py
Dec 02 09:47:30 np0005541913.localdomain sudo[280710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:30 np0005541913.localdomain python3[280712]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 09:47:30 np0005541913.localdomain python3[280712]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 02 09:47:30 np0005541913.localdomain sudo[280710]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:31 np0005541913.localdomain sudo[280881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypmtxqyymfschmycndfsglouszaiueqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668850.8609798-3689-161365323417987/AnsiballZ_stat.py
Dec 02 09:47:31 np0005541913.localdomain sudo[280881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:31 np0005541913.localdomain python3.9[280883]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:47:31 np0005541913.localdomain sudo[280881]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:31 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:31.503 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:31 np0005541913.localdomain sudo[280993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnloosqrphbupcgodskhevpvqekpdjpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668851.6498704-3716-67595672583807/AnsiballZ_file.py
Dec 02 09:47:31 np0005541913.localdomain sudo[280993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:47:32 np0005541913.localdomain systemd[1]: tmp-crun.NSpZzR.mount: Deactivated successfully.
Dec 02 09:47:32 np0005541913.localdomain podman[280996]: 2025-12-02 09:47:32.056765331 +0000 UTC m=+0.102036737 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:47:32 np0005541913.localdomain podman[280996]: 2025-12-02 09:47:32.069940973 +0000 UTC m=+0.115212409 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:47:32 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:47:32 np0005541913.localdomain python3.9[280995]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:47:32 np0005541913.localdomain sudo[280993]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:32 np0005541913.localdomain sudo[281121]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkoekplhrnawehtaflvzesdaaedowjzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668852.2002988-3716-129887121747905/AnsiballZ_copy.py
Dec 02 09:47:32 np0005541913.localdomain sudo[281121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:32 np0005541913.localdomain python3.9[281123]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764668852.2002988-3716-129887121747905/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:47:32 np0005541913.localdomain sudo[281121]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:33 np0005541913.localdomain sudo[281176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqkoimzgrluruskeybwcbglxxmtyebdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668852.2002988-3716-129887121747905/AnsiballZ_systemd.py
Dec 02 09:47:33 np0005541913.localdomain sudo[281176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:33 np0005541913.localdomain python3.9[281178]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:47:33 np0005541913.localdomain sudo[281176]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55248 DF PROTO=TCP SPT=40112 DPT=9102 SEQ=3432569286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A5642E0000000001030307) 
Dec 02 09:47:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:47:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:47:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:47:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:47:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:47:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:47:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:47:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:47:34 np0005541913.localdomain python3.9[281288]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:47:34 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:34.894 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55249 DF PROTO=TCP SPT=40112 DPT=9102 SEQ=3432569286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A568240000000001030307) 
Dec 02 09:47:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:47:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:47:35 np0005541913.localdomain podman[281397]: 2025-12-02 09:47:35.441526016 +0000 UTC m=+0.072685463 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:47:35 np0005541913.localdomain python3.9[281396]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:47:35 np0005541913.localdomain podman[281397]: 2025-12-02 09:47:35.454072041 +0000 UTC m=+0.085231498 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:47:35 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:47:35 np0005541913.localdomain podman[281398]: 2025-12-02 09:47:35.507856088 +0000 UTC m=+0.134579027 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 09:47:35 np0005541913.localdomain podman[281398]: 2025-12-02 09:47:35.545203897 +0000 UTC m=+0.171926846 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 02 09:47:35 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:47:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37568 DF PROTO=TCP SPT=42372 DPT=9102 SEQ=1588680873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A56BE40000000001030307) 
Dec 02 09:47:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:47:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:47:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:47:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:47:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:47:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Dec 02 09:47:36 np0005541913.localdomain python3.9[281549]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 09:47:36 np0005541913.localdomain nova_compute[230637]: 2025-12-02 09:47:36.506 230641 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55250 DF PROTO=TCP SPT=40112 DPT=9102 SEQ=3432569286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A570240000000001030307) 
Dec 02 09:47:37 np0005541913.localdomain sudo[281657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihmcctxcqmkwtktufjxvyxtdikhkwsnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668856.6428885-3886-166072411767814/AnsiballZ_podman_container.py
Dec 02 09:47:37 np0005541913.localdomain sudo[281657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:37 np0005541913.localdomain python3.9[281659]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 02 09:47:37 np0005541913.localdomain sudo[281657]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:37 np0005541913.localdomain systemd-journald[47611]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 119.8 (399 of 333 items), suggesting rotation.
Dec 02 09:47:37 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 09:47:37 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:47:37 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:47:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46428 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2936571885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A573E40000000001030307) 
Dec 02 09:47:37 np0005541913.localdomain sudo[281790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnhfakhgvkeewsiedebmcvejyaxlfwum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668857.7236614-3908-220591422734210/AnsiballZ_systemd.py
Dec 02 09:47:37 np0005541913.localdomain sudo[281790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:38 np0005541913.localdomain python3.9[281792]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 09:47:38 np0005541913.localdomain systemd[1]: Stopping nova_compute container...
Dec 02 09:47:38 np0005541913.localdomain virtqemud[203664]: End of file while reading data: Input/output error
Dec 02 09:47:38 np0005541913.localdomain systemd[1]: libpod-a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f.scope: Deactivated successfully.
Dec 02 09:47:38 np0005541913.localdomain systemd[1]: libpod-a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f.scope: Consumed 20.354s CPU time.
Dec 02 09:47:38 np0005541913.localdomain podman[281796]: 2025-12-02 09:47:38.405853834 +0000 UTC m=+0.088948358 container died a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 02 09:47:38 np0005541913.localdomain podman[281796]: 2025-12-02 09:47:38.577173403 +0000 UTC m=+0.260267867 container cleanup a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 09:47:38 np0005541913.localdomain podman[281796]: nova_compute
Dec 02 09:47:38 np0005541913.localdomain podman[281837]: error opening file `/run/crun/a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f/status`: No such file or directory
Dec 02 09:47:38 np0005541913.localdomain podman[281825]: 2025-12-02 09:47:38.687410538 +0000 UTC m=+0.077528202 container cleanup a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:47:38 np0005541913.localdomain podman[281825]: nova_compute
Dec 02 09:47:38 np0005541913.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 02 09:47:38 np0005541913.localdomain systemd[1]: Stopped nova_compute container.
Dec 02 09:47:38 np0005541913.localdomain systemd[1]: Starting nova_compute container...
Dec 02 09:47:38 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:47:38 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 02 09:47:38 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 09:47:38 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 09:47:38 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 09:47:38 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599d32757aa561883618730f7ad2a353ae4158b524af51217e1c260ed80653f0/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 09:47:38 np0005541913.localdomain podman[281839]: 2025-12-02 09:47:38.850496347 +0000 UTC m=+0.125414652 container init a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 02 09:47:38 np0005541913.localdomain podman[281839]: 2025-12-02 09:47:38.860444983 +0000 UTC m=+0.135363288 container start a094c2e6ac4b91ea01dadd1aec120ace7c4876478b7e6f2c2a6086a977536c7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 09:47:38 np0005541913.localdomain podman[281839]: nova_compute
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + sudo -E kolla_set_configs
Dec 02 09:47:38 np0005541913.localdomain systemd[1]: Started nova_compute container.
Dec 02 09:47:38 np0005541913.localdomain sudo[281790]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Validating config file
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying service configuration files
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /etc/ceph
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Creating directory /etc/ceph
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/ceph
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Writing out command to execute
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: ++ cat /run_command
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + CMD=nova-compute
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + ARGS=
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + sudo kolla_copy_cacerts
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + [[ ! -n '' ]]
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + . kolla_extend_start
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: Running command: 'nova-compute'
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + echo 'Running command: '\''nova-compute'\'''
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + umask 0022
Dec 02 09:47:38 np0005541913.localdomain nova_compute[281854]: + exec nova-compute
Dec 02 09:47:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:40.673 281858 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:47:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:40.673 281858 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:47:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:40.673 281858 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 02 09:47:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:40.673 281858 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 02 09:47:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:40.795 281858 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:47:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:40.818 281858 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:47:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:40.819 281858 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 02 09:47:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55251 DF PROTO=TCP SPT=40112 DPT=9102 SEQ=3432569286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A57FE50000000001030307) 
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.222 281858 INFO nova.virt.driver [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.338 281858 INFO nova.compute.provider_config [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.346 281858 DEBUG oslo_concurrency.lockutils [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.346 281858 DEBUG oslo_concurrency.lockutils [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.346 281858 DEBUG oslo_concurrency.lockutils [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.347 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.347 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.347 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.347 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.347 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.348 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.348 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.348 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.348 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.348 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.348 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.348 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.349 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.349 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.349 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.349 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.349 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.349 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.350 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.350 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] console_host                   = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.350 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.350 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.350 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.350 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.350 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.351 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.351 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.351 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.351 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.351 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.351 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.351 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.352 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.352 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.352 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.352 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.352 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.352 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.352 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] host                           = np0005541913.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.353 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.353 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.353 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.353 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.353 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.354 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.354 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.354 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.354 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.354 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.354 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.354 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.355 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.355 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.355 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.355 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.355 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.355 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.355 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.356 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.356 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.356 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.356 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.356 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.356 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.356 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.356 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.357 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.357 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.357 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.357 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.357 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.357 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.357 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.358 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.358 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.358 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.358 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.358 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.358 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.358 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.359 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.359 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.359 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.359 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.359 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.359 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.359 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.360 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.360 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.360 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.360 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.360 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.360 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.360 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.360 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.361 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.361 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.361 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.361 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.361 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.361 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.361 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.362 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.362 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.362 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.362 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.362 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.362 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.362 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.362 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.363 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.363 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.363 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.363 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.363 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.363 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.363 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.364 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.364 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.364 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.364 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.364 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.364 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.364 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.364 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.365 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.365 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.365 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.365 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.365 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.365 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.365 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.366 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.366 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.366 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.366 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.366 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.366 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.366 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.367 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.367 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.367 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.367 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.367 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.367 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.367 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.367 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.368 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.368 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.368 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.368 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.368 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.368 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.368 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.369 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.369 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.369 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.369 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.369 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.369 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.370 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.370 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.370 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.370 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.370 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.370 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.370 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.370 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.371 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.371 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.371 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.371 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.371 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.371 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.371 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.372 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.372 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.372 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.372 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.372 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.372 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.372 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.373 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.373 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.373 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.373 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.373 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.373 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.373 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.374 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.374 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.374 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.374 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.374 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.374 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.374 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.375 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.375 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.375 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.375 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.375 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.375 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.375 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.376 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.376 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.376 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.376 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.376 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.376 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.376 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.376 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.377 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.377 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.377 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.377 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.377 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.377 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.377 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.378 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.378 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.378 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.378 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.378 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.379 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.379 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.379 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.379 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.379 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.379 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.379 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.379 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.380 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.380 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.380 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.380 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.380 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.380 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.381 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.381 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.381 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.381 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.381 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.381 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.381 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.382 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.382 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.382 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.382 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.382 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.382 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.382 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.383 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.383 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.383 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.383 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.383 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.383 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.383 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.384 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.384 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.384 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.384 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.384 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.384 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.384 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.385 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.385 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.385 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.385 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.385 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.385 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.385 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.386 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.386 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.386 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.386 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.386 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.386 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.386 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.387 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.387 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.387 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.387 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.387 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.387 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.387 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.388 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.388 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.388 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.388 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.388 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.388 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.388 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.389 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.389 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.389 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.389 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.389 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.389 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.389 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.389 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.390 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.390 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.390 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.390 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.390 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.390 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.390 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.391 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.391 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.391 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.391 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.391 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.391 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.391 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.391 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.392 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.392 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.392 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.392 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.392 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.392 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.392 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.393 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.393 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.393 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.393 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.393 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.393 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.393 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.394 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.394 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.394 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.394 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.394 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.394 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.394 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.394 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.395 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.395 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.395 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.395 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.395 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.395 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.395 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.396 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.396 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.396 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.396 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.396 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.396 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.397 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.397 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.397 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.397 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.397 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.397 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.398 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.398 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.398 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.398 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.398 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.398 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.398 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.399 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.399 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.399 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.399 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.399 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.399 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.399 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.400 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.400 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.400 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.400 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.400 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.400 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.400 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.401 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.401 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.401 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.401 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.401 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.401 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.401 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.401 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.402 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.402 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.402 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.402 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.402 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.402 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.402 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.403 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.403 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.403 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.403 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.403 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.403 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.403 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.404 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.404 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.404 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.404 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.404 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.404 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.404 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.405 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.405 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.405 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.405 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.405 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.405 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.405 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.405 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.406 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.406 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.406 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.406 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.406 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.406 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.406 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.407 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.407 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.407 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.407 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.407 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.407 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.408 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.408 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.408 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.408 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.408 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.408 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.408 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.409 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.409 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.409 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.409 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.409 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.409 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.409 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.410 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.410 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.410 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.410 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.410 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.410 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.410 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.410 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.411 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.411 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.411 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.411 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.411 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.411 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.411 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.412 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.412 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.412 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.412 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.412 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.412 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.412 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.413 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.413 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.413 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.413 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.413 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.413 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.413 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.414 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.414 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.414 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.414 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.414 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.414 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.414 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.415 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.415 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.415 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.415 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.415 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.415 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.415 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.415 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.416 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.416 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.416 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.416 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.416 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.416 281858 WARNING oslo_config.cfg [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: and ``live_migration_inbound_addr`` respectively.
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: ).  Its value may be silently ignored in the future.
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.417 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
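The deprecation warning above says `live_migration_uri` should be replaced by `live_migration_scheme` and `live_migration_inbound_addr`. A minimal sketch of the non-deprecated equivalent in `nova.conf`, assuming the same SSH transport as the logged URI (the placeholder address is illustrative, not from this deployment):

```ini
# Hedged sketch: replacing the deprecated live_migration_uri logged above.
# "ssh" here yields a qemu+ssh:// connection URI, matching the old value.
[libvirt]
live_migration_scheme = ssh
# Address (or hostname) the target host listens on for incoming migrations;
# <target-host-address> is a placeholder, not a value from these logs.
live_migration_inbound_addr = <target-host-address>
```

Note the `?keyfile=/var/lib/nova/.ssh/ssh-privatekey` query parameter of the old URI has no direct counterpart in these two options; presumably the key would instead be supplied via the SSH client configuration of the nova user (assumption, not confirmed by this log).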
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.417 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.417 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.417 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.417 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.417 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.418 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.418 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.418 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.418 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.418 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.418 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.418 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.419 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.419 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.419 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.419 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.419 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.419 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rbd_secret_uuid        = c7c8e171-a193-56fb-95fa-8879fcfa7074 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.419 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.420 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.420 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.420 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.420 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.420 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.420 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.420 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.421 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.421 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.421 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.421 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.421 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.421 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.421 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.422 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.422 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.422 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.422 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.422 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.422 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.422 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.423 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.423 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.423 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.423 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.423 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.423 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.423 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.424 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.424 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.424 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.424 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.424 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.424 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.424 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.425 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.425 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.425 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.425 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.425 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.425 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.425 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.426 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.426 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.426 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.426 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.426 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.426 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.426 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.427 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.427 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.427 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.427 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.427 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.427 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.427 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.428 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.428 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.428 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.428 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.428 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.428 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.428 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.428 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.429 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.429 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.429 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.429 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.429 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.429 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.430 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.430 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.430 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.430 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.430 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.430 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.430 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.430 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.431 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.431 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.431 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.431 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.431 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.431 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.431 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.432 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.432 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.432 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.432 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.432 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.432 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.432 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.433 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.433 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.433 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.433 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.433 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.433 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.433 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.434 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.434 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.434 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.434 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.434 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.434 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.434 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.435 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.435 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.435 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.435 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.435 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.435 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.435 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.436 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.436 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.436 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.436 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.436 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.436 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.437 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.437 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.437 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.437 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.437 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.437 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.438 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.438 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.438 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.438 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.438 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.438 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.438 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.439 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.439 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.439 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.439 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.439 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.439 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.439 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.440 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.440 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.440 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.440 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.440 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.440 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.441 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.441 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.441 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.441 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.442 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.442 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.442 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.442 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.442 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.443 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.443 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.443 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.443 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.443 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.444 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.444 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.444 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.444 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.445 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.445 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.445 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.445 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.446 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.446 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.446 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.446 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.446 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.447 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.447 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.447 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.447 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.448 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.448 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.448 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.448 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.449 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.449 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.449 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.449 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.450 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.450 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.450 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.450 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.450 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.451 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.451 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.451 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.451 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.451 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.452 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.452 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.452 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.452 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.453 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.453 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.453 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.453 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.453 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.454 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.454 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.454 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.454 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.455 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.455 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.455 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.455 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.455 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.456 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.456 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.456 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.456 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.456 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.457 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.457 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.457 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.457 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.457 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.458 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.458 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.458 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.458 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.459 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.459 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.459 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.459 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.460 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.460 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.460 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.460 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.461 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.461 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.461 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.461 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.462 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.462 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.462 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.462 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.462 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.463 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.463 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.463 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.463 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.463 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.464 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.464 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.464 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.464 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.464 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.465 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.465 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.465 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.465 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.465 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.465 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.465 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.466 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.466 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.466 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.466 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.466 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.466 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.466 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.467 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.467 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.467 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.467 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.467 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.467 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.467 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.468 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.468 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.468 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.468 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.468 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.468 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.469 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.469 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.469 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.469 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.469 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.469 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.469 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.470 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.470 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.470 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.470 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.470 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.470 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.470 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.470 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.471 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.471 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.471 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.471 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.471 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.471 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.471 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.472 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.472 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.472 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.472 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.472 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.472 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.472 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.473 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.473 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.473 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.473 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.473 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.473 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.473 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.474 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.474 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.474 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.474 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.474 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.474 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.474 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.475 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.475 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.475 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.475 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.475 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.475 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.475 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.475 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.476 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.476 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.476 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.476 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.476 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.476 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.476 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.477 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.477 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.477 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.477 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.477 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.477 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.477 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.477 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.478 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.478 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.478 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.478 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.478 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.478 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.478 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.478 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.479 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.479 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.479 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.479 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.479 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.479 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.479 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.480 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.480 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.480 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.480 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.480 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.480 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.480 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.480 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.481 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.481 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.481 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.481 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.481 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.481 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.482 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.482 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.482 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.482 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.482 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.482 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.482 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.482 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.483 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.483 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.483 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.483 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.483 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.483 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.483 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.484 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.484 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.484 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.484 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.484 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.484 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.484 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.484 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.485 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.485 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.485 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.485 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.485 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.485 281858 DEBUG oslo_service.service [None req-bfdcfc7a-e0b9-4ff0-98cd-7e9e403fb20a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.486 281858 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.502 281858 INFO nova.virt.node [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Determined node identity c79215b2-6762-4f7f-a322-f44db2b0b9bd from /var/lib/nova/compute_id
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.503 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.503 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.503 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.504 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.515 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7f6f4bb070> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.518 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7f6f4bb070> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.518 281858 INFO nova.virt.libvirt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Connection event '1' reason 'None'
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.523 281858 INFO nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Libvirt host capabilities <capabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <host>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <uuid>f041467c-26d0-44b9-832e-8db5f9b7a49d</uuid>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <arch>x86_64</arch>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model>EPYC-Rome-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <vendor>AMD</vendor>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <microcode version='16777317'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <signature family='23' model='49' stepping='0'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='x2apic'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='tsc-deadline'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='osxsave'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='hypervisor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='tsc_adjust'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='spec-ctrl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='stibp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='arch-capabilities'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='cmp_legacy'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='topoext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='virt-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='lbrv'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='tsc-scale'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='vmcb-clean'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='pause-filter'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='pfthreshold'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='svme-addr-chk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='rdctl-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='skip-l1dfl-vmentry'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='mds-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature name='pschange-mc-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <pages unit='KiB' size='4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <pages unit='KiB' size='2048'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <pages unit='KiB' size='1048576'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <power_management>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <suspend_mem/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <suspend_disk/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <suspend_hybrid/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </power_management>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <iommu support='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <migration_features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <live/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <uri_transports>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <uri_transport>tcp</uri_transport>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <uri_transport>rdma</uri_transport>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </uri_transports>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </migration_features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <topology>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <cells num='1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <cell id='0'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:           <memory unit='KiB'>16116612</memory>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:           <pages unit='KiB' size='4'>4029153</pages>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:           <pages unit='KiB' size='2048'>0</pages>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:           <distances>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <sibling id='0' value='10'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:           </distances>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:           <cpus num='8'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:           </cpus>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         </cell>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </cells>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </topology>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <cache>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </cache>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <secmodel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model>selinux</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <doi>0</doi>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </secmodel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <secmodel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model>dac</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <doi>0</doi>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </secmodel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </host>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <guest>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <os_type>hvm</os_type>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <arch name='i686'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <wordsize>32</wordsize>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <domain type='qemu'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <domain type='kvm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </arch>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <pae/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <nonpae/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <acpi default='on' toggle='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <apic default='on' toggle='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <cpuselection/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <deviceboot/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <disksnapshot default='on' toggle='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <externalSnapshot/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </guest>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <guest>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <os_type>hvm</os_type>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <arch name='x86_64'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <wordsize>64</wordsize>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <domain type='qemu'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <domain type='kvm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </arch>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <acpi default='on' toggle='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <apic default='on' toggle='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <cpuselection/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <deviceboot/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <disksnapshot default='on' toggle='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <externalSnapshot/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </guest>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: </capabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.529 281858 DEBUG nova.virt.libvirt.volume.mount [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.535 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.541 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: <domainCapabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <domain>kvm</domain>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <arch>i686</arch>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <vcpu max='240'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <iothreads supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <os supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <enum name='firmware'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <loader supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>rom</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pflash</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='readonly'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>yes</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>no</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='secure'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>no</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </loader>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </os>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>on</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>off</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='maximum' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='maximumMigratable'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>on</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>off</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='host-model' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <vendor>AMD</vendor>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='x2apic'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='stibp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='succor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='lbrv'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='custom' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Dhyana-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Genoa'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='auto-ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='auto-ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-128'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-256'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-512'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='KnightsMill'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4fmaps'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4vnniw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512er'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512pf'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='KnightsMill-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4fmaps'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4vnniw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512er'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512pf'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tbm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tbm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SierraForest'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ne-convert'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cmpccxadd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SierraForest-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ne-convert'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cmpccxadd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='athlon'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='athlon-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='core2duo'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='core2duo-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='coreduo'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='coreduo-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='n270'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='n270-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='phenom'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='phenom-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <memoryBacking supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <enum name='sourceType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>file</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>anonymous</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>memfd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </memoryBacking>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <devices>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <disk supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='diskDevice'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>disk</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>cdrom</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>floppy</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>lun</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='bus'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ide</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>fdc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>scsi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>sata</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-non-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <graphics supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vnc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>egl-headless</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dbus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </graphics>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <video supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='modelType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vga</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>cirrus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>none</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>bochs</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ramfb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </video>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <hostdev supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='mode'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>subsystem</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='startupPolicy'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>default</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>mandatory</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>requisite</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>optional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='subsysType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pci</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>scsi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='capsType'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='pciBackend'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </hostdev>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <rng supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-non-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>random</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>egd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>builtin</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </rng>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <filesystem supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='driverType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>path</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>handle</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtiofs</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </filesystem>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <tpm supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tpm-tis</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tpm-crb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>emulator</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>external</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendVersion'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>2.0</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </tpm>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <redirdev supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='bus'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </redirdev>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <channel supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pty</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>unix</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </channel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <crypto supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>qemu</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>builtin</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </crypto>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <interface supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>default</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>passt</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </interface>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <panic supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>isa</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>hyperv</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </panic>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <console supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>null</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pty</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dev</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>file</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pipe</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>stdio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>udp</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tcp</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>unix</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>qemu-vdagent</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dbus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </console>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </devices>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <gic supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <vmcoreinfo supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <genid supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <backingStoreInput supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <backup supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <async-teardown supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <ps2 supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <sev supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <sgx supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <hyperv supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='features'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>relaxed</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vapic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>spinlocks</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vpindex</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>runtime</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>synic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>stimer</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>reset</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vendor_id</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>frequencies</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>reenlightenment</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tlbflush</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ipi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>avic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>emsr_bitmap</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>xmm_input</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <defaults>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <spinlocks>4095</spinlocks>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <stimer_direct>on</stimer_direct>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </defaults>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </hyperv>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <launchSecurity supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='sectype'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tdx</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </launchSecurity>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: </domainCapabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.548 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: <domainCapabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <domain>kvm</domain>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <arch>i686</arch>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <vcpu max='1024'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <iothreads supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <os supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <enum name='firmware'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <loader supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>rom</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pflash</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='readonly'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>yes</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>no</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='secure'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>no</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </loader>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </os>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>on</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>off</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='maximum' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='maximumMigratable'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>on</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>off</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='host-model' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <vendor>AMD</vendor>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='x2apic'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='stibp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='succor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='lbrv'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='custom' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Dhyana-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Genoa'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='auto-ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='auto-ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-128'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-256'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-512'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='KnightsMill'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4fmaps'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4vnniw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512er'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512pf'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='KnightsMill-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4fmaps'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4vnniw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512er'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512pf'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tbm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tbm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SierraForest'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ne-convert'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cmpccxadd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SierraForest-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ne-convert'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cmpccxadd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='athlon'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='athlon-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='core2duo'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='core2duo-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='coreduo'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='coreduo-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='n270'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='n270-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='phenom'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='phenom-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <memoryBacking supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <enum name='sourceType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>file</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>anonymous</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>memfd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </memoryBacking>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <devices>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <disk supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='diskDevice'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>disk</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>cdrom</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>floppy</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>lun</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='bus'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>fdc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>scsi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>sata</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-non-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <graphics supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vnc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>egl-headless</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dbus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </graphics>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <video supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='modelType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vga</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>cirrus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>none</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>bochs</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ramfb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </video>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <hostdev supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='mode'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>subsystem</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='startupPolicy'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>default</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>mandatory</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>requisite</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>optional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='subsysType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pci</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>scsi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='capsType'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='pciBackend'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </hostdev>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <rng supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-non-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>random</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>egd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>builtin</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </rng>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <filesystem supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='driverType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>path</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>handle</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtiofs</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </filesystem>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <tpm supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tpm-tis</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tpm-crb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>emulator</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>external</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendVersion'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>2.0</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </tpm>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <redirdev supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='bus'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </redirdev>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <channel supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pty</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>unix</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </channel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <crypto supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>qemu</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>builtin</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </crypto>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <interface supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>default</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>passt</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </interface>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <panic supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>isa</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>hyperv</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </panic>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <console supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>null</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pty</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dev</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>file</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pipe</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>stdio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>udp</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tcp</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>unix</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>qemu-vdagent</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dbus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </console>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </devices>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <gic supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <vmcoreinfo supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <genid supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <backingStoreInput supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <backup supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <async-teardown supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <ps2 supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <sev supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <sgx supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <hyperv supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='features'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>relaxed</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vapic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>spinlocks</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vpindex</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>runtime</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>synic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>stimer</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>reset</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vendor_id</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>frequencies</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>reenlightenment</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tlbflush</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ipi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>avic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>emsr_bitmap</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>xmm_input</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <defaults>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <spinlocks>4095</spinlocks>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <stimer_direct>on</stimer_direct>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </defaults>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </hyperv>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <launchSecurity supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='sectype'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tdx</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </launchSecurity>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: </domainCapabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.594 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.599 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: <domainCapabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <domain>kvm</domain>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <arch>x86_64</arch>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <vcpu max='240'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <iothreads supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <os supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <enum name='firmware'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <loader supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>rom</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pflash</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='readonly'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>yes</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>no</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='secure'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>no</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </loader>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </os>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>on</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>off</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='maximum' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='maximumMigratable'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>on</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>off</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='host-model' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <vendor>AMD</vendor>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='x2apic'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='stibp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='succor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='lbrv'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='custom' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Dhyana-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Genoa'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='auto-ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='auto-ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-128'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-256'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-512'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='KnightsMill'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4fmaps'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4vnniw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512er'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512pf'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='KnightsMill-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4fmaps'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4vnniw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512er'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512pf'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tbm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tbm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SierraForest'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ne-convert'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cmpccxadd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SierraForest-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ne-convert'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cmpccxadd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='athlon'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='athlon-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='core2duo'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='core2duo-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='coreduo'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='coreduo-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='n270'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='n270-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='phenom'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='phenom-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <memoryBacking supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <enum name='sourceType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>file</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>anonymous</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>memfd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </memoryBacking>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <devices>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <disk supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='diskDevice'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>disk</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>cdrom</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>floppy</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>lun</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='bus'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ide</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>fdc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>scsi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>sata</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-non-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <graphics supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vnc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>egl-headless</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dbus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </graphics>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <video supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='modelType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vga</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>cirrus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>none</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>bochs</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ramfb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </video>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <hostdev supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='mode'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>subsystem</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='startupPolicy'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>default</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>mandatory</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>requisite</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>optional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='subsysType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pci</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>scsi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='capsType'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='pciBackend'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </hostdev>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <rng supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-non-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>random</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>egd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>builtin</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </rng>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <filesystem supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='driverType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>path</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>handle</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtiofs</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </filesystem>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <tpm supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tpm-tis</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tpm-crb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>emulator</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>external</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendVersion'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>2.0</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </tpm>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <redirdev supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='bus'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </redirdev>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <channel supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pty</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>unix</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </channel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <crypto supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>qemu</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>builtin</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </crypto>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <interface supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>default</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>passt</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </interface>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <panic supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>isa</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>hyperv</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </panic>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <console supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>null</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pty</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dev</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>file</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pipe</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>stdio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>udp</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tcp</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>unix</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>qemu-vdagent</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dbus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </console>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </devices>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <gic supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <vmcoreinfo supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <genid supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <backingStoreInput supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <backup supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <async-teardown supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <ps2 supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <sev supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <sgx supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <hyperv supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='features'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>relaxed</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vapic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>spinlocks</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vpindex</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>runtime</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>synic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>stimer</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>reset</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vendor_id</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>frequencies</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>reenlightenment</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tlbflush</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ipi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>avic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>emsr_bitmap</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>xmm_input</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <defaults>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <spinlocks>4095</spinlocks>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <stimer_direct>on</stimer_direct>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </defaults>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </hyperv>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <launchSecurity supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='sectype'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tdx</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </launchSecurity>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: </domainCapabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.693 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: <domainCapabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <domain>kvm</domain>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <arch>x86_64</arch>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <vcpu max='1024'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <iothreads supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <os supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <enum name='firmware'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>efi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <loader supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>rom</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pflash</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='readonly'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>yes</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>no</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='secure'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>yes</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>no</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </loader>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </os>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='host-passthrough' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='hostPassthroughMigratable'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>on</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>off</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='maximum' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='maximumMigratable'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>on</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>off</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='host-model' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <vendor>AMD</vendor>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='x2apic'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='hypervisor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='stibp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='overflow-recov'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='succor'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='lbrv'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='tsc-scale'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='pause-filter'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='pfthreshold'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <feature policy='disable' name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <mode name='custom' supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Broadwell-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Cooperlake-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Denverton-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Dhyana-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Genoa'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='auto-ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='auto-ibrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Milan-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amd-psfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='no-nested-data-bp'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='null-sel-clr-base'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='stibp-always-on'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-Rome-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='EPYC-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='GraniteRapids-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-128'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-256'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx10-512'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='prefetchiti'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Haswell-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v6'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Icelake-Server-v7'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='IvyBridge-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='KnightsMill'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4fmaps'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4vnniw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512er'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512pf'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='KnightsMill-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4fmaps'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-4vnniw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512er'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512pf'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G4-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tbm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Opteron_G5-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fma4'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tbm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xop'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SapphireRapids-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='amx-tile'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-bf16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-fp16'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512-vpopcntdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bitalg'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vbmi2'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrc'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fzrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='la57'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='taa-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='tsx-ldtrk'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xfd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SierraForest'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ne-convert'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cmpccxadd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='SierraForest-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ifma'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-ne-convert'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx-vnni-int8'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='bus-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cmpccxadd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fbsdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='fsrs'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ibrs-all'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mcdt-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pbrsb-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='psdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='sbdr-ssdp-no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='serialize'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vaes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='vpclmulqdq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Client-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='hle'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='rtm'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Skylake-Server-v5'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512bw'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512cd'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512dq'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512f'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='avx512vl'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='invpcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pcid'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='pku'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='mpx'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v2'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v3'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='core-capability'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='split-lock-detect'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='Snowridge-v4'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='cldemote'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='erms'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='gfni'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdir64b'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='movdiri'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='xsaves'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='athlon'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='athlon-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='core2duo'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='core2duo-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='coreduo'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='coreduo-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='n270'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='n270-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='ss'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='phenom'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <blockers model='phenom-v1'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnow'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <feature name='3dnowext'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </blockers>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </mode>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </cpu>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <memoryBacking supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <enum name='sourceType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>file</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>anonymous</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <value>memfd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </memoryBacking>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <devices>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <disk supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='diskDevice'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>disk</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>cdrom</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>floppy</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>lun</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='bus'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>fdc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>scsi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>sata</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-non-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <graphics supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vnc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>egl-headless</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dbus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </graphics>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <video supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='modelType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vga</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>cirrus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>none</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>bochs</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ramfb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </video>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <hostdev supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='mode'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>subsystem</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='startupPolicy'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>default</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>mandatory</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>requisite</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>optional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='subsysType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pci</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>scsi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='capsType'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='pciBackend'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </hostdev>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <rng supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtio-non-transitional</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>random</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>egd</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>builtin</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </rng>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <filesystem supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='driverType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>path</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>handle</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>virtiofs</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </filesystem>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <tpm supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tpm-tis</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tpm-crb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>emulator</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>external</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendVersion'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>2.0</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </tpm>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <redirdev supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='bus'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>usb</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </redirdev>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <channel supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pty</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>unix</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </channel>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <crypto supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>qemu</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendModel'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>builtin</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </crypto>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <interface supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='backendType'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>default</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>passt</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </interface>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <panic supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='model'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>isa</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>hyperv</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </panic>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <console supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='type'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>null</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vc</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pty</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dev</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>file</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>pipe</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>stdio</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>udp</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tcp</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>unix</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>qemu-vdagent</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>dbus</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </console>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </devices>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   <features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <gic supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <vmcoreinfo supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <genid supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <backingStoreInput supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <backup supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <async-teardown supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <ps2 supported='yes'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <sev supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <sgx supported='no'/>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <hyperv supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='features'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>relaxed</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vapic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>spinlocks</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vpindex</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>runtime</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>synic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>stimer</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>reset</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>vendor_id</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>frequencies</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>reenlightenment</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tlbflush</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>ipi</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>avic</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>emsr_bitmap</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>xmm_input</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <defaults>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <spinlocks>4095</spinlocks>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <stimer_direct>on</stimer_direct>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <tlbflush_direct>off</tlbflush_direct>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <tlbflush_extended>off</tlbflush_extended>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </defaults>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </hyperv>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     <launchSecurity supported='yes'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       <enum name='sectype'>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:         <value>tdx</value>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:       </enum>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:     </launchSecurity>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:   </features>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: </domainCapabilities>
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.757 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.757 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.758 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.758 281858 INFO nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Secure Boot support detected
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.762 281858 INFO nova.virt.libvirt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.762 281858 INFO nova.virt.libvirt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.777 281858 DEBUG nova.virt.libvirt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.804 281858 INFO nova.virt.node [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Determined node identity c79215b2-6762-4f7f-a322-f44db2b0b9bd from /var/lib/nova/compute_id
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.822 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Verified node c79215b2-6762-4f7f-a322-f44db2b0b9bd matches my host np0005541913.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.862 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.868 281858 DEBUG nova.virt.libvirt.vif [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-02T08:31:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005541913.localdomain',hostname='test',id=2,image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T08:31:55Z,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005541913.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e2d97696ab6749899bb8ba5ce29a3de2',ramdisk_id='',reservation_id='r-6ofcfgb6',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T08:31:55Z,user_data=None,user_id='cb8b7d2a63b642aa999db12e17eeb9e4',uuid=b254bb7f-2891-4b37-9c44-9700e301ce16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.868 281858 DEBUG nova.network.os_vif_util [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Converting VIF {"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.869 281858 DEBUG nova.network.os_vif_util [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.870 281858 DEBUG os_vif [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.916 281858 DEBUG ovsdbapp.backend.ovs_idl [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.917 281858 DEBUG ovsdbapp.backend.ovs_idl [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.917 281858 DEBUG ovsdbapp.backend.ovs_idl [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.917 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.918 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.918 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.918 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.920 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.921 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.937 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.938 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.938 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:47:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:41.939 281858 INFO oslo.privsep.daemon [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp0bjjh037/privsep.sock']
Dec 02 09:47:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:42.709 281858 INFO oslo.privsep.daemon [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Spawned new privsep daemon via rootwrap
Dec 02 09:47:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:42.571 281913 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 09:47:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:42.577 281913 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 09:47:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:42.580 281913 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 02 09:47:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:42.581 281913 INFO oslo.privsep.daemon [-] privsep daemon running as pid 281913
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.010 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.010 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a318f6a-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.011 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a318f6a-b3, col_values=(('external_ids', {'iface-id': '4a318f6a-b3c1-4690-8246-f7d046ccd64a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:b2:03', 'vm-uuid': 'b254bb7f-2891-4b37-9c44-9700e301ce16'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.012 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.012 281858 INFO os_vif [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3')
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.013 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.016 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.017 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.180 281858 DEBUG oslo_concurrency.lockutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.180 281858 DEBUG oslo_concurrency.lockutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.181 281858 DEBUG oslo_concurrency.lockutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.181 281858 DEBUG nova.compute.resource_tracker [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.182 281858 DEBUG oslo_concurrency.processutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:47:43 np0005541913.localdomain sudo[282027]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrgugurxdhezledjxqfifsfifdevnufc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764668863.2463427-3935-117862045235641/AnsiballZ_podman_container.py
Dec 02 09:47:43 np0005541913.localdomain sudo[282027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.667 281858 DEBUG oslo_concurrency.processutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.731 281858 DEBUG nova.virt.libvirt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.732 281858 DEBUG nova.virt.libvirt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:47:43 np0005541913.localdomain python3.9[282029]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.966 281858 WARNING nova.virt.libvirt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.968 281858 DEBUG nova.compute.resource_tracker [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12138MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.968 281858 DEBUG oslo_concurrency.lockutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:47:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:43.968 281858 DEBUG oslo_concurrency.lockutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.128 281858 DEBUG nova.compute.resource_tracker [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.129 281858 DEBUG nova.compute.resource_tracker [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.129 281858 DEBUG nova.compute.resource_tracker [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:47:44 np0005541913.localdomain systemd[1]: Started libpod-conmon-ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a.scope.
Dec 02 09:47:44 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:47:44 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 02 09:47:44 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 02 09:47:44 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.207 281858 DEBUG nova.scheduler.client.report [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 09:47:44 np0005541913.localdomain podman[282057]: 2025-12-02 09:47:44.217907025 +0000 UTC m=+0.149062345 container init ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, io.buildah.version=1.41.3)
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.229 281858 DEBUG nova.scheduler.client.report [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.229 281858 DEBUG nova.compute.provider_tree [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:47:44 np0005541913.localdomain podman[282057]: 2025-12-02 09:47:44.235221758 +0000 UTC m=+0.166377078 container start ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3)
Dec 02 09:47:44 np0005541913.localdomain python3.9[282029]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.245 281858 DEBUG nova.scheduler.client.report [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.273 281858 DEBUG nova.scheduler.client.report [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_ABM,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Applying nova statedir ownership
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16/
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16 already 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16 to system_u:object_r:container_file_t:s0
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16/console.log
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/4ee0f3f792b433d78f415a6f600ca9c7d9f0adb3
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-4ee0f3f792b433d78f415a6f600ca9c7d9f0adb3
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/20273498b7380904530133bcb3f720bd45f4f00b810dc4597d81d23acd8f9673
Dec 02 09:47:44 np0005541913.localdomain nova_compute_init[282077]: INFO:nova_statedir:Nova statedir ownership complete
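The nova_compute_init messages above trace `nova_statedir_ownership.py` walking `/var/lib/nova`, chowning anything not already owned by the nova uid/gid (42436:42436), relabeling it `container_file_t`, and honoring `NOVA_STATEDIR_OWNERSHIP_SKIP`. A minimal dry-run sketch of that walk (illustrative only — the real script chowns in place and also sets the SELinux context; the function name here is hypothetical):

```python
import os

TARGET_UID, TARGET_GID = 42436, 42436  # nova uid/gid, as seen in the log


def plan_ownership_changes(statedir, skip_paths=()):
    """Walk statedir and report which paths would need a chown to the
    target uid/gid, mirroring the log's 'Checking uid: ... path: ...' /
    'Changing ownership ...' pattern. Dry run: nothing is modified."""
    changes = []
    for dirpath, dirnames, filenames in os.walk(statedir):
        # "" stands for the directory itself, then its files.
        for name in [""] + filenames:
            path = os.path.join(dirpath, name) if name else dirpath
            if path in skip_paths:  # e.g. /var/lib/nova/compute_id
                continue
            st = os.lstat(path)
            if (st.st_uid, st.st_gid) != (TARGET_UID, TARGET_GID):
                changes.append(path)  # real script: os.chown(path, uid, gid)
    return changes
```

The production script additionally relabels each path to `system_u:object_r:container_file_t:s0` (the "Setting selinux context" lines), which requires the SELinux bindings and is omitted from this sketch.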
Dec 02 09:47:44 np0005541913.localdomain systemd[1]: libpod-ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a.scope: Deactivated successfully.
Dec 02 09:47:44 np0005541913.localdomain podman[282078]: 2025-12-02 09:47:44.302694361 +0000 UTC m=+0.047451589 container died ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3)
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.309 281858 DEBUG oslo_concurrency.processutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:47:44 np0005541913.localdomain podman[282090]: 2025-12-02 09:47:44.388872744 +0000 UTC m=+0.071946024 container cleanup ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 09:47:44 np0005541913.localdomain systemd[1]: libpod-conmon-ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a.scope: Deactivated successfully.
Dec 02 09:47:44 np0005541913.localdomain sudo[282027]: pam_unix(sudo:session): session closed for user root
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.755 281858 DEBUG oslo_concurrency.processutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
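The `ceph df --format=json` call above is how an RBD-backed libvirt driver sizes its pool before reporting DISK_GB inventory. A sketch of pulling the totals out of that JSON (the sample payload below is fabricated for illustration; real output carries many more fields):

```python
import json


def disk_gb_from_ceph_df(ceph_df_json):
    """Derive (total GiB, available GiB) from `ceph df --format=json`
    output, roughly how an RBD backend sizes its DISK_GB inventory."""
    stats = json.loads(ceph_df_json)["stats"]
    gib = 1024 ** 3
    return stats["total_bytes"] // gib, stats["total_avail_bytes"] // gib


# Abridged sample payload; key names as in recent Ceph releases.
SAMPLE = json.dumps({"stats": {"total_bytes": 41 * 1024 ** 3,
                               "total_used_bytes": 1 * 1024 ** 3,
                               "total_avail_bytes": 40 * 1024 ** 3}})
```

With this sample, the 41 GiB total matches the `DISK_GB: {'total': 41, ...}` inventory the scheduler client logs below.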
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.764 281858 DEBUG nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.764 281858 INFO nova.virt.libvirt.host [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] kernel doesn't support AMD SEV
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.766 281858 DEBUG nova.compute.provider_tree [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.766 281858 DEBUG nova.virt.libvirt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.797 281858 DEBUG nova.scheduler.client.report [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
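The inventory dict in the line above is what Placement uses to compute schedulable capacity per resource class: `(total - reserved) * allocation_ratio`. A quick check against the logged values (VCPU: 8 at ratio 16.0; MEMORY_MB: 15738 with 512 reserved):

```python
def placement_capacity(total, reserved, allocation_ratio):
    """Schedulable capacity for one resource class, as the Placement
    service computes it: (total - reserved) * allocation_ratio."""
    return int((total - reserved) * allocation_ratio)
```

For this host that yields 128 schedulable VCPUs, 15226 MB of RAM, and 40 GB of disk.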
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.835 281858 DEBUG nova.compute.resource_tracker [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.835 281858 DEBUG oslo_concurrency.lockutils [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.867s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.836 281858 DEBUG nova.service [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.865 281858 DEBUG nova.service [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 02 09:47:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:44.866 281858 DEBUG nova.servicegroup.drivers.db [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] DB_Driver: join new ServiceGroup member np0005541913.localdomain to the compute group, service = <Service: host=np0005541913.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 02 09:47:45 np0005541913.localdomain sshd[263865]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:47:45 np0005541913.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Dec 02 09:47:45 np0005541913.localdomain systemd[1]: session-60.scope: Consumed 1min 31.314s CPU time.
Dec 02 09:47:45 np0005541913.localdomain systemd-logind[757]: Session 60 logged out. Waiting for processes to exit.
Dec 02 09:47:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:47:45 np0005541913.localdomain systemd-logind[757]: Removed session 60.
Dec 02 09:47:45 np0005541913.localdomain systemd[1]: tmp-crun.olc6Vo.mount: Deactivated successfully.
Dec 02 09:47:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eb1eee259f81f330f1ee0081bc4f7673956ff5103e0f4825b5784a1732364fc7-merged.mount: Deactivated successfully.
Dec 02 09:47:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ced4ea4dc33686779bdf74a001de2acc50ced170aea28ab3292bc675d820599a-userdata-shm.mount: Deactivated successfully.
Dec 02 09:47:45 np0005541913.localdomain podman[282156]: 2025-12-02 09:47:45.204895581 +0000 UTC m=+0.102288725 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 02 09:47:45 np0005541913.localdomain podman[282156]: 2025-12-02 09:47:45.216127891 +0000 UTC m=+0.113520985 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 09:47:45 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:47:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:46.512 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:46.922 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:47:48 np0005541913.localdomain podman[282175]: 2025-12-02 09:47:48.44834888 +0000 UTC m=+0.082970619 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 02 09:47:48 np0005541913.localdomain podman[282175]: 2025-12-02 09:47:48.482191014 +0000 UTC m=+0.116812753 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:47:48 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:47:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55252 DF PROTO=TCP SPT=40112 DPT=9102 SEQ=3432569286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A59FE40000000001030307) 
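The kernel `DROPPING:` line above is netfilter LOG output: a SYN from 192.168.122.10 to TCP port 9102 was dropped on br-ex. These messages are space-separated `KEY=VALUE` pairs (some, like `OUT=`, empty), so a small regex recovers the fields; the format here is inferred from the line itself:

```python
import re

# KEY=VALUE pairs; \S* allows empty values such as OUT=
FIELD = re.compile(r"(\w+)=(\S*)")


def parse_drop_log(line):
    """Parse KEY=VALUE pairs from a netfilter LOG-style kernel message.
    Bare flags without '=' (e.g. SYN, DF) are not captured."""
    return dict(FIELD.findall(line))


line = ("DROPPING: IN=br-ex OUT= SRC=192.168.122.10 DST=192.168.122.107 "
        "PROTO=TCP SPT=40112 DPT=9102 SYN")
```

Filtering on `DPT` and `IN` this way is a common first step when auditing which dropped flows a firewall rule is actually catching.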
Dec 02 09:47:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:51.514 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:51.925 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:47:53 np0005541913.localdomain podman[282193]: 2025-12-02 09:47:53.458780507 +0000 UTC m=+0.092481613 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public)
Dec 02 09:47:53 np0005541913.localdomain podman[282193]: 2025-12-02 09:47:53.500951934 +0000 UTC m=+0.134653040 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6)
Dec 02 09:47:53 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:47:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:47:55 np0005541913.localdomain systemd[1]: tmp-crun.ueAFtY.mount: Deactivated successfully.
Dec 02 09:47:55 np0005541913.localdomain podman[282213]: 2025-12-02 09:47:55.455336303 +0000 UTC m=+0.096681455 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:47:55 np0005541913.localdomain podman[282213]: 2025-12-02 09:47:55.46344696 +0000 UTC m=+0.104792072 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:47:55 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:47:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:56.517 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:47:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:47:56.927 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:01.521 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:01.929 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:48:02 np0005541913.localdomain sudo[282237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:48:02 np0005541913.localdomain sudo[282237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:48:02 np0005541913.localdomain sudo[282237]: pam_unix(sudo:session): session closed for user root
Dec 02 09:48:02 np0005541913.localdomain podman[282243]: 2025-12-02 09:48:02.463812998 +0000 UTC m=+0.097784744 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:48:02 np0005541913.localdomain podman[282243]: 2025-12-02 09:48:02.478014917 +0000 UTC m=+0.111986663 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 09:48:02 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:48:02 np0005541913.localdomain sudo[282268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:48:02 np0005541913.localdomain sudo[282268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:48:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:03.032 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:48:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:03.033 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:48:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:03.035 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:48:03 np0005541913.localdomain sudo[282268]: pam_unix(sudo:session): session closed for user root
Dec 02 09:48:03 np0005541913.localdomain sudo[282323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:48:03 np0005541913.localdomain sudo[282323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:48:03 np0005541913.localdomain sudo[282323]: pam_unix(sudo:session): session closed for user root
Dec 02 09:48:03 np0005541913.localdomain sudo[282341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- inventory --format=json-pretty --filter-for-batch
Dec 02 09:48:03 np0005541913.localdomain sudo[282341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:48:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22740 DF PROTO=TCP SPT=49628 DPT=9102 SEQ=420584671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A5D95E0000000001030307) 
Dec 02 09:48:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:48:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:48:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:48:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:48:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:48:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:48:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:48:04 np0005541913.localdomain podman[282401]: 
Dec 02 09:48:04 np0005541913.localdomain podman[282401]: 2025-12-02 09:48:04.129987984 +0000 UTC m=+0.084174651 container create 25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_goldstine, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:48:04 np0005541913.localdomain systemd[1]: Started libpod-conmon-25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243.scope.
Dec 02 09:48:04 np0005541913.localdomain podman[282401]: 2025-12-02 09:48:04.098186934 +0000 UTC m=+0.052373631 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:48:04 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:48:04 np0005541913.localdomain podman[282401]: 2025-12-02 09:48:04.228527898 +0000 UTC m=+0.182714565 container init 25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_goldstine, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Dec 02 09:48:04 np0005541913.localdomain podman[282401]: 2025-12-02 09:48:04.241767581 +0000 UTC m=+0.195954248 container start 25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_goldstine, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1763362218, version=7, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:48:04 np0005541913.localdomain podman[282401]: 2025-12-02 09:48:04.242133292 +0000 UTC m=+0.196319989 container attach 25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_goldstine, version=7, release=1763362218, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 09:48:04 np0005541913.localdomain lucid_goldstine[282415]: 167 167
Dec 02 09:48:04 np0005541913.localdomain systemd[1]: libpod-25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243.scope: Deactivated successfully.
Dec 02 09:48:04 np0005541913.localdomain podman[282401]: 2025-12-02 09:48:04.25220193 +0000 UTC m=+0.206388597 container died 25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_goldstine, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:48:04 np0005541913.localdomain podman[282420]: 2025-12-02 09:48:04.347348293 +0000 UTC m=+0.080146783 container remove 25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_goldstine, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public)
Dec 02 09:48:04 np0005541913.localdomain systemd[1]: libpod-conmon-25e48a37bc847410d76883af3a0a8bdc7b07c94cfd8c846b2e92aca008e4d243.scope: Deactivated successfully.
Dec 02 09:48:04 np0005541913.localdomain podman[282442]: 
Dec 02 09:48:04 np0005541913.localdomain podman[282442]: 2025-12-02 09:48:04.575678575 +0000 UTC m=+0.072689563 container create 4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_chebyshev, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_BRANCH=main, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 02 09:48:04 np0005541913.localdomain systemd[1]: Started libpod-conmon-4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47.scope.
Dec 02 09:48:04 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:48:04 np0005541913.localdomain podman[282442]: 2025-12-02 09:48:04.539150029 +0000 UTC m=+0.036160937 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:48:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b686dc17f23e0315f34b7cc4f7018493dbaebddc656563916366a73d5a1742ce/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 09:48:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b686dc17f23e0315f34b7cc4f7018493dbaebddc656563916366a73d5a1742ce/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 09:48:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b686dc17f23e0315f34b7cc4f7018493dbaebddc656563916366a73d5a1742ce/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 09:48:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b686dc17f23e0315f34b7cc4f7018493dbaebddc656563916366a73d5a1742ce/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 09:48:04 np0005541913.localdomain podman[282442]: 2025-12-02 09:48:04.648880291 +0000 UTC m=+0.145891199 container init 4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_chebyshev, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 02 09:48:04 np0005541913.localdomain podman[282442]: 2025-12-02 09:48:04.660279816 +0000 UTC m=+0.157290724 container start 4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_chebyshev, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:48:04 np0005541913.localdomain podman[282442]: 2025-12-02 09:48:04.660510542 +0000 UTC m=+0.157521510 container attach 4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_chebyshev, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, version=7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:48:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22741 DF PROTO=TCP SPT=49628 DPT=9102 SEQ=420584671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A5DD650000000001030307) 
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-1c72a26b6b6ad18fa93a0fd1675b8dcf633d0b072b12660cc6d8a09e43e9dfea-merged.mount: Deactivated successfully.
Dec 02 09:48:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55253 DF PROTO=TCP SPT=40112 DPT=9102 SEQ=3432569286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A5DFE40000000001030307) 
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]: [
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:     {
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         "available": false,
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         "ceph_device": false,
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         "lsm_data": {},
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         "lvs": [],
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         "path": "/dev/sr0",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         "rejected_reasons": [
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "Has a FileSystem",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "Insufficient space (<5GB)"
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         ],
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         "sys_api": {
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "actuators": null,
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "device_nodes": "sr0",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "human_readable_size": "482.00 KB",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "id_bus": "ata",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "model": "QEMU DVD-ROM",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "nr_requests": "2",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "partitions": {},
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "path": "/dev/sr0",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "removable": "1",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "rev": "2.5+",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "ro": "0",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "rotational": "1",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "sas_address": "",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "sas_device_handle": "",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "scheduler_mode": "mq-deadline",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "sectors": 0,
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "sectorsize": "2048",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "size": 493568.0,
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "support_discard": "0",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "type": "disk",
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:             "vendor": "QEMU"
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:         }
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]:     }
Dec 02 09:48:05 np0005541913.localdomain blissful_chebyshev[282457]: ]
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: libpod-4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47.scope: Deactivated successfully.
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: libpod-4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47.scope: Consumed 1.085s CPU time.
Dec 02 09:48:05 np0005541913.localdomain podman[282442]: 2025-12-02 09:48:05.706857084 +0000 UTC m=+1.203867992 container died 4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_chebyshev, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, ceph=True, architecture=x86_64, build-date=2025-11-26T19:44:28Z)
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:48:05 np0005541913.localdomain podman[284297]: 2025-12-02 09:48:05.838243155 +0000 UTC m=+0.099058428 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b686dc17f23e0315f34b7cc4f7018493dbaebddc656563916366a73d5a1742ce-merged.mount: Deactivated successfully.
Dec 02 09:48:05 np0005541913.localdomain podman[284291]: 2025-12-02 09:48:05.869344256 +0000 UTC m=+0.147752649 container remove 4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_chebyshev, ceph=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: libpod-conmon-4c6ea5adc25334a5169f7005a9991657a3b03f99c8e523e217a37408b43a7b47.scope: Deactivated successfully.
Dec 02 09:48:05 np0005541913.localdomain podman[284298]: 2025-12-02 09:48:05.805799509 +0000 UTC m=+0.069354415 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:48:05 np0005541913.localdomain sudo[282341]: pam_unix(sudo:session): session closed for user root
Dec 02 09:48:05 np0005541913.localdomain podman[284297]: 2025-12-02 09:48:05.923117444 +0000 UTC m=+0.183932667 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:48:05 np0005541913.localdomain podman[284298]: 2025-12-02 09:48:05.943297203 +0000 UTC m=+0.206852049 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:48:05 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:48:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:48:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:48:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:48:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148825 "" "Go-http-client/1.1"
Dec 02 09:48:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:48:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17215 "" "Go-http-client/1.1"
Dec 02 09:48:06 np0005541913.localdomain sudo[284349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:48:06 np0005541913.localdomain sudo[284349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:48:06 np0005541913.localdomain sudo[284349]: pam_unix(sudo:session): session closed for user root
Dec 02 09:48:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:06.524 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:06.963 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22742 DF PROTO=TCP SPT=49628 DPT=9102 SEQ=420584671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A5E5640000000001030307) 
Dec 02 09:48:08 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37569 DF PROTO=TCP SPT=42372 DPT=9102 SEQ=1588680873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A5E9E50000000001030307) 
Dec 02 09:48:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:08.318 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 09:48:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:08.320 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 09:48:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:08.364 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22743 DF PROTO=TCP SPT=49628 DPT=9102 SEQ=420584671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A5F5240000000001030307) 
Dec 02 09:48:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:11.527 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:11.997 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:14.322 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:48:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:48:15 np0005541913.localdomain podman[284367]: 2025-12-02 09:48:15.443138376 +0000 UTC m=+0.086065422 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:48:15 np0005541913.localdomain podman[284367]: 2025-12-02 09:48:15.481196783 +0000 UTC m=+0.124123819 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 09:48:15 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.101 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.105 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf7794d1-1fb1-472e-a0fa-db85a6da187b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.101966', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '067da1fa-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': 'a3afb7a36447b56f91819bbb341c4aea7285e8efad95e2704509d49457e82e61'}]}, 'timestamp': '2025-12-02 09:48:16.106354', '_unique_id': 'c134ad952f364f0eb0d2c2cfce35c3b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.107 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.145 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.145 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f971dd93-304a-447b-8849-cd486efae039', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.108514', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0683a6c2-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': 'bb380173e34ef89dab37cc4801ba889ec107d89944f3b5914706ec253627090e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.108514', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0683b4aa-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': '6efeb5763f31f0ca34b8a060186f85c92e7b7ab1b472e00237999cbc4c3f9ad8'}]}, 'timestamp': '2025-12-02 09:48:16.146081', '_unique_id': '6e79939c99d74fe19baea6b37a93e624'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.146 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.160 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.160 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd518270d-bb9f-4977-b09e-a69dd26637c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.147871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0685e5fe-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.366935341, 'message_signature': 'c7ea3f6bdca6866d789b9ed88c0e235ca81bb996f09af537a4396ef13b1f45fd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.147871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0685ee96-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.366935341, 'message_signature': '7208b7ff9601c2cee205e47ba7cced047802a93adeb6ed93856605e5f6d58cfc'}]}, 'timestamp': '2025-12-02 09:48:16.160659', '_unique_id': '30226740ef96457dbdb18dfadbf8f2a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.161 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82d38609-915c-4476-a7c0-40d846c97d51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.161703', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '068620be-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': '8e827bc9bb15bbd26e95e573eac53565e2e0de56b994a386d8f2e59fc29eb572'}]}, 'timestamp': '2025-12-02 09:48:16.161932', '_unique_id': '6a85fcb9b4ed474da18d1f54269f8e2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.162 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 196 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '027e1845-e938-4ce7-bd91-4c5c3ddc8eef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 196, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.162884', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '06864eae-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': '78064fd3a992352eac982b2922f929862d81288249d1f1a160794adbb2a8bf2b'}]}, 'timestamp': '2025-12-02 09:48:16.163092', '_unique_id': '7a65e6d1173f4db1b437a88e6940f216'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2adea395-f14f-4bfa-9108-58ff2b78145f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.164023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06867afa-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': 'f72b8903d73cb4d4f59ea24c77129b79135453a5a630c895097e344d19568c10'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.164023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '068681f8-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': 'ec24df6c38c59655f7de4ca3ed88394f6bbe3631733dbc7127e9deecf505742d'}]}, 'timestamp': '2025-12-02 09:48:16.164391', '_unique_id': '81800868100547a2b06da2bd1a4d781c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 11468 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63ade8cb-9927-4767-a450-e61bad7ed309', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11468, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.165356', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '0686af2a-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': '026a8cca430826ffeb78133547e926b15937654058cab852b89a833cc2b3a979'}]}, 'timestamp': '2025-12-02 09:48:16.165561', '_unique_id': '9a69d4279d404dda906c463d00e01e5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.166 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.166 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2060b8cb-326f-4002-b9ae-dc6e30af26d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.166487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0686db3a-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.366935341, 'message_signature': '7032598a6188ae4f9ca05b388eb980bb6468fdf3a85d53440512f143a97ac802'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.166487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0686e2ec-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.366935341, 'message_signature': 'fe4538f5a0c674bf8c876ebcdc29fe25a901e02c79d4bbd6dca65beb94aac193'}]}, 'timestamp': '2025-12-02 09:48:16.166873', '_unique_id': 'a8159618c7a744a7ad2ec9069aff17f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.167 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '254d8390-0c11-4b09-a74c-d1a30f7d6592', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.167817', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '06870f42-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': '12624cf682925ac19618af4c349f8b6a13ec4caf758f50cf1ab2244111e5a231'}]}, 'timestamp': '2025-12-02 09:48:16.168021', '_unique_id': 'f10ac44f8ed744b48571838a14adb11b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.168 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '455389c2-1a76-4386-b51b-5b8cd574d4c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.168955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06873cc4-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': 'ab658e0182105320ca732198ecbece5d25f71401677d973e3ac35cb169958b59'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.168955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0687461a-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': 'c2c4e7726fa2115dd0563f44db4af5a3d34b7cc6cf6fd7fe05b27615200fcd86'}]}, 'timestamp': '2025-12-02 09:48:16.169446', '_unique_id': '3eedf128a6e74343b2b62ffca1b738f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.170 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 196 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91f763e3-6e69-490c-bb08-89d4e306549b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 196, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.170900', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '068788f0-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': 'b058f1ba24338b9b1b1e896dda9cbcb706d83b63608896c8e32366c924f6ca24'}]}, 'timestamp': '2025-12-02 09:48:16.171181', '_unique_id': '9f82aef4816d4c159404867ab7f82fcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.172 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.192 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d6d6c0d-e44e-4d1d-8443-5b654436b33b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:48:16.172437', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '068ac66e-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.411057499, 'message_signature': 'ba98e5962066bd60e0920bf45ba5e3c53781aae309d3c5c280b792840a1c64c3'}]}, 'timestamp': '2025-12-02 09:48:16.192419', '_unique_id': '9b8f0ce8920e4446974204909c87aa6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '310d61ea-cf45-4a08-a439-7fb4173c6110', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.194025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '068b1060-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': '35654fae5e48963456276ec7388093352e08a6a0042cb9ddae2b0e1183fa46a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.194025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '068b19de-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': 'db380c5b3eb09e516568e9c735818f03b923c44408c6ab82b4814a1657351cf8'}]}, 'timestamp': '2025-12-02 09:48:16.194529', '_unique_id': 'cba6f88da0da4b93a737976f52dfe744'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.196 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 59090000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9deab18e-0c55-435d-85ba-754accf74b71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59090000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:48:16.196142', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '068b62e0-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.411057499, 'message_signature': '688952417cae1626d599c639e5d717d290001a75b452334499e6020caffcec7b'}]}, 'timestamp': '2025-12-02 09:48:16.196403', '_unique_id': 'ec760b6241be4b81bc1ff74e41d3b06e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.197 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 9425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5418429-a254-4783-943a-47cb0be53f29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9425, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.197840', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '068ba598-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': '58b83a4170b81a3e8e74367d60eef6513f753066179fe5fed3a2295084461ca5'}]}, 'timestamp': '2025-12-02 09:48:16.198125', '_unique_id': '276736bfee8f4d069e832fd30983a755'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64c4b715-aa19-4f6a-9fb9-7811583cf400', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.199365', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '068be0da-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': '129b3c9a031226f0ef25d52853bde03f6afc60cb391b34f838ebac37aa12a030'}]}, 'timestamp': '2025-12-02 09:48:16.199655', '_unique_id': 'ae78b229860544278397d127328e7b4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3517292e-f157-41f6-a3b8-fb2e6af8aa50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.200927', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '068c1dde-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': '42d5e9d2764b4074b22a23fe021d8324c59a0665bf6fadd00f9b449ed312dae0'}]}, 'timestamp': '2025-12-02 09:48:16.201202', '_unique_id': 'f35af534e87140e4a076ce32ccd0c103'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.202 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1433516318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.202 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 164656012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0dd16b95-9d1c-48fb-9ecc-199c8f7fa029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1433516318, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.202444', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '068c58f8-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': '80c3ea3a870043e5aef8514fa8770a3d7e67b7a8f7a50783b9fbcf909b33aff7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164656012, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.202444', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '068c6348-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': '7f0fe3b965ceab1b928e4f6ea0d9022d232d9280cc797860f82a25aa8d024638'}]}, 'timestamp': '2025-12-02 09:48:16.202969', '_unique_id': 'c95b82a958a745e4a807df071bd7e3de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.204 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 89 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '375d7814-f891-41c1-bdfd-7cfd695b75f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 89, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:48:16.204385', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '068ca510-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.321025073, 'message_signature': '486a65b11808b928f88783bf877244308df59dee43b7f5a72264faaea2d92cc6'}]}, 'timestamp': '2025-12-02 09:48:16.204677', '_unique_id': 'c37f7d706c7349fcb08e2814a536d5ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.205 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.206 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c32735f-2056-4730-9263-26c68babfa76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.206156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '068ce9ee-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.366935341, 'message_signature': 'cbd6e592e2b225a5cc7cf0f1aec8c17170789dca2930754d8ed058e69a0e40ef'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.206156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '068cf3bc-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.366935341, 'message_signature': '3272e9a018488cc05d202a63b6ee4e9912023af197dc29a8fcdd18f0a07ae522'}]}, 'timestamp': '2025-12-02 09:48:16.206685', '_unique_id': 'd80de2a1ab7b47f2b3927a467f48b631'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.207 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 286697561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 39228582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2e05042-d184-43b9-81d5-e87d0f322ee8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 286697561, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:48:16.207934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '068d2f58-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': 'c3b30f94d5c1b0c0d587feb341a76153e005454ab78c1101e65adba2051bbbd1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39228582, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:48:16.207934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '068d38ae-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11058.327607669, 'message_signature': 'd502c616b6a2d42484a87e2d07bcf3d2b6c08fde50d37fc887231e73c63ccf5f'}]}, 'timestamp': '2025-12-02 09:48:16.208422', '_unique_id': '20ca740e4fd64c53be59527ba7d62cdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:48:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:48:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:48:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:16.528 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:17.034 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:17.869 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:17.895 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Triggering sync for uuid b254bb7f-2891-4b37-9c44-9700e301ce16 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 02 09:48:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:17.896 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:48:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:17.897 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:48:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:17.897 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:17.921 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:48:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:48:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22744 DF PROTO=TCP SPT=49628 DPT=9102 SEQ=420584671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A615E40000000001030307) 
Dec 02 09:48:19 np0005541913.localdomain podman[284387]: 2025-12-02 09:48:19.454828075 +0000 UTC m=+0.084950991 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 09:48:19 np0005541913.localdomain podman[284387]: 2025-12-02 09:48:19.484699724 +0000 UTC m=+0.114822600 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:48:19 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:48:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:21.531 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:22.037 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:23.123 281858 DEBUG nova.compute.manager [None req-39f4a9bf-0492-4be9-985b-94a3d1c1b88a cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:48:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:23.128 281858 INFO nova.compute.manager [None req-39f4a9bf-0492-4be9-985b-94a3d1c1b88a cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Retrieving diagnostics
Dec 02 09:48:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:48:24 np0005541913.localdomain podman[284406]: 2025-12-02 09:48:24.441719392 +0000 UTC m=+0.081523309 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41)
Dec 02 09:48:24 np0005541913.localdomain podman[284406]: 2025-12-02 09:48:24.452734697 +0000 UTC m=+0.092538624 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 09:48:24 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:48:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:48:26 np0005541913.localdomain podman[284428]: 2025-12-02 09:48:26.435987306 +0000 UTC m=+0.072656793 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:48:26 np0005541913.localdomain podman[284428]: 2025-12-02 09:48:26.443438005 +0000 UTC m=+0.080107522 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:48:26 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:48:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:26.535 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:27.084 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:28.681 281858 DEBUG oslo_concurrency.lockutils [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:48:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:28.682 281858 DEBUG oslo_concurrency.lockutils [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:48:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:28.683 281858 DEBUG nova.compute.manager [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:48:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:28.687 281858 DEBUG nova.compute.manager [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Dec 02 09:48:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:28.692 281858 DEBUG nova.objects.instance [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lazy-loading 'flavor' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:48:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:28.732 281858 DEBUG nova.virt.libvirt.driver [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 02 09:48:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:31.544 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:31.752 281858 INFO nova.virt.libvirt.driver [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Instance shutdown successfully after 3 seconds.
Dec 02 09:48:32 np0005541913.localdomain kernel: device tap4a318f6a-b3 left promiscuous mode
Dec 02 09:48:32 np0005541913.localdomain NetworkManager[5965]: <info>  [1764668912.0417] device (tap4a318f6a-b3): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00048|binding|INFO|Releasing lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a from this chassis (sb_readonly=0)
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00049|binding|INFO|Setting lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a down in Southbound
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00050|binding|INFO|Removing iface tap4a318f6a-b3 ovn-installed in OVS
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.048 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.052 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:32.058 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b2:03 192.168.0.102'], port_security=['fa:16:3e:26:b2:03 192.168.0.102'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.102/24', 'neutron:device_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005541913.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-595e1c9b-709c-41d2-9212-0b18b13291a8', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1725f11b-f73c-4c4f-b3d3-772d68fcc09e 23293c48-39ca-43a0-a462-ebc8626a7f6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23d69817-a35d-4528-880f-f329bfbd969c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4a318f6a-b3c1-4690-8246-f7d046ccd64a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 09:48:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:32.061 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4a318f6a-b3c1-4690-8246-f7d046ccd64a in datapath 595e1c9b-709c-41d2-9212-0b18b13291a8 unbound from our chassis
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00051|ovn_bfd|INFO|Disabled BFD on interface ovn-be95dc-0
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-2587fe-0
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-4d166c-0
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.061 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00054|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.063 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:32.065 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 595e1c9b-709c-41d2-9212-0b18b13291a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.068 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:32.074 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[efbcb7d0-e20d-48fe-b61d-8bddbcf37534]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:32.077 160221 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8 namespace which is not needed anymore
Dec 02 09:48:32 np0005541913.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 02 09:48:32 np0005541913.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 4min 7.353s CPU time.
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.086 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.089 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain systemd-machined[84262]: Machine qemu-1-instance-00000002 terminated.
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00055|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.099 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain kernel: device tap4a318f6a-b3 entered promiscuous mode
Dec 02 09:48:32 np0005541913.localdomain NetworkManager[5965]: <info>  [1764668912.1682] manager: (tap4a318f6a-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/15)
Dec 02 09:48:32 np0005541913.localdomain kernel: device tap4a318f6a-b3 left promiscuous mode
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00056|binding|INFO|Claiming lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a for this chassis.
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00057|binding|INFO|4a318f6a-b3c1-4690-8246-f7d046ccd64a: Claiming fa:16:3e:26:b2:03 192.168.0.102
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.170 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.177 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:32.184 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b2:03 192.168.0.102'], port_security=['fa:16:3e:26:b2:03 192.168.0.102'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.102/24', 'neutron:device_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005541913.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-595e1c9b-709c-41d2-9212-0b18b13291a8', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1725f11b-f73c-4c4f-b3d3-772d68fcc09e 23293c48-39ca-43a0-a462-ebc8626a7f6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23d69817-a35d-4528-880f-f329bfbd969c, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4a318f6a-b3c1-4690-8246-f7d046ccd64a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00058|ovn_bfd|INFO|Enabled BFD on interface ovn-be95dc-0
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00059|ovn_bfd|INFO|Enabled BFD on interface ovn-2587fe-0
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-4d166c-0
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.188 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.197 281858 INFO nova.virt.libvirt.driver [-] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Instance destroyed successfully.
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.199 281858 DEBUG nova.objects.instance [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lazy-loading 'numa_topology' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.201 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.209 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.214 281858 DEBUG nova.compute.manager [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00061|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00062|binding|INFO|Releasing lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a from this chassis (sb_readonly=0)
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.219 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:32.229 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b2:03 192.168.0.102'], port_security=['fa:16:3e:26:b2:03 192.168.0.102'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.102/24', 'neutron:device_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005541913.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-595e1c9b-709c-41d2-9212-0b18b13291a8', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1725f11b-f73c-4c4f-b3d3-772d68fcc09e 23293c48-39ca-43a0-a462-ebc8626a7f6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23d69817-a35d-4528-880f-f329bfbd969c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4a318f6a-b3c1-4690-8246-f7d046ccd64a) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.236 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00063|ovn_bfd|INFO|Disabled BFD on interface ovn-be95dc-0
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00064|ovn_bfd|INFO|Disabled BFD on interface ovn-2587fe-0
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00065|ovn_bfd|INFO|Disabled BFD on interface ovn-4d166c-0
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.240 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:48:32Z|00066|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.297 281858 DEBUG oslo_concurrency.lockutils [None req-e3730b24-38b8-4992-a4b0-220655a1395e cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.481 281858 DEBUG nova.compute.manager [req-4d6c6eb6-ae7a-4fc2-93af-2b34924fdc51 req-fbe73d02-414c-480b-8f2f-545135c3c23f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Received event network-vif-unplugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.482 281858 DEBUG oslo_concurrency.lockutils [req-4d6c6eb6-ae7a-4fc2-93af-2b34924fdc51 req-fbe73d02-414c-480b-8f2f-545135c3c23f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.483 281858 DEBUG oslo_concurrency.lockutils [req-4d6c6eb6-ae7a-4fc2-93af-2b34924fdc51 req-fbe73d02-414c-480b-8f2f-545135c3c23f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.484 281858 DEBUG oslo_concurrency.lockutils [req-4d6c6eb6-ae7a-4fc2-93af-2b34924fdc51 req-fbe73d02-414c-480b-8f2f-545135c3c23f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.484 281858 DEBUG nova.compute.manager [req-4d6c6eb6-ae7a-4fc2-93af-2b34924fdc51 req-fbe73d02-414c-480b-8f2f-545135c3c23f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] No waiting events found dispatching network-vif-unplugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 09:48:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:32.485 281858 WARNING nova.compute.manager [req-4d6c6eb6-ae7a-4fc2-93af-2b34924fdc51 req-fbe73d02-414c-480b-8f2f-545135c3c23f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Received unexpected event network-vif-unplugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a for instance with vm_state stopped and task_state None.
Dec 02 09:48:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:48:33 np0005541913.localdomain podman[284505]: 2025-12-02 09:48:33.450773028 +0000 UTC m=+0.089027091 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 02 09:48:33 np0005541913.localdomain podman[284505]: 2025-12-02 09:48:33.464990887 +0000 UTC m=+0.103244940 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:48:33 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:48:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30372 DF PROTO=TCP SPT=40134 DPT=9102 SEQ=1121546869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A64E8D0000000001030307) 
Dec 02 09:48:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:48:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:48:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:48:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:48:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:48:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.511 281858 DEBUG nova.compute.manager [None req-e93c46bf-57e3-49d5-992e-2d67c7c6f1a5 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.527 281858 DEBUG nova.compute.manager [req-10a7b066-1348-4a94-8649-94365cb4398f req-8185f3e7-6ea6-4b45-aaab-316a66806064 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Received event network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.528 281858 DEBUG oslo_concurrency.lockutils [req-10a7b066-1348-4a94-8649-94365cb4398f req-8185f3e7-6ea6-4b45-aaab-316a66806064 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.528 281858 DEBUG oslo_concurrency.lockutils [req-10a7b066-1348-4a94-8649-94365cb4398f req-8185f3e7-6ea6-4b45-aaab-316a66806064 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.529 281858 DEBUG oslo_concurrency.lockutils [req-10a7b066-1348-4a94-8649-94365cb4398f req-8185f3e7-6ea6-4b45-aaab-316a66806064 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.530 281858 DEBUG nova.compute.manager [req-10a7b066-1348-4a94-8649-94365cb4398f req-8185f3e7-6ea6-4b45-aaab-316a66806064 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] No waiting events found dispatching network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.530 281858 WARNING nova.compute.manager [req-10a7b066-1348-4a94-8649-94365cb4398f req-8185f3e7-6ea6-4b45-aaab-316a66806064 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Received unexpected event network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a for instance with vm_state stopped and task_state None.
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server [None req-e93c46bf-57e3-49d5-992e-2d67c7c6f1a5 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance b254bb7f-2891-4b37-9c44-9700e301ce16 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     raise self.value
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     raise self.value
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance b254bb7f-2891-4b37-9c44-9700e301ce16 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Dec 02 09:48:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:34.564 281858 ERROR oslo_messaging.rpc.server 
Dec 02 09:48:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30373 DF PROTO=TCP SPT=40134 DPT=9102 SEQ=1121546869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A652A40000000001030307) 
Dec 02 09:48:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22745 DF PROTO=TCP SPT=49628 DPT=9102 SEQ=420584671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A655E40000000001030307) 
Dec 02 09:48:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:48:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:48:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:48:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148826 "" "Go-http-client/1.1"
Dec 02 09:48:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:48:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17104 "" "Go-http-client/1.1"
Dec 02 09:48:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:48:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:48:36 np0005541913.localdomain podman[284526]: 2025-12-02 09:48:36.455344121 +0000 UTC m=+0.090093949 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 09:48:36 np0005541913.localdomain podman[284526]: 2025-12-02 09:48:36.495021761 +0000 UTC m=+0.129771629 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 02 09:48:36 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:48:36 np0005541913.localdomain podman[284525]: 2025-12-02 09:48:36.505792739 +0000 UTC m=+0.141992896 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:48:36 np0005541913.localdomain podman[284525]: 2025-12-02 09:48:36.542091209 +0000 UTC m=+0.178291296 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:48:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:36.545 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:36 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:48:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30374 DF PROTO=TCP SPT=40134 DPT=9102 SEQ=1121546869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A65AA50000000001030307) 
Dec 02 09:48:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:37.088 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55254 DF PROTO=TCP SPT=40112 DPT=9102 SEQ=3432569286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A65DE40000000001030307) 
Dec 02 09:48:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:40.877 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:40.877 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:40.878 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:48:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:40.878 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:48:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30375 DF PROTO=TCP SPT=40134 DPT=9102 SEQ=1121546869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A66A640000000001030307) 
Dec 02 09:48:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:41.579 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:42.090 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:42.327 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:48:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:42.327 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:48:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:42.328 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:48:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:42.328 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:48:42 np0005541913.localdomain podman[284488]: 2025-12-02 09:48:42.365811307 +0000 UTC m=+10.145274427 container stop 7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:48:42 np0005541913.localdomain systemd[1]: libpod-7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0.scope: Deactivated successfully.
Dec 02 09:48:42 np0005541913.localdomain podman[284488]: 2025-12-02 09:48:42.374024384 +0000 UTC m=+10.153487494 container died 7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, release=1761123044, batch=17.1_20251118.1)
Dec 02 09:48:42 np0005541913.localdomain podman[284488]: 2025-12-02 09:48:42.563589973 +0000 UTC m=+10.343053063 container cleanup 7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64)
Dec 02 09:48:42 np0005541913.localdomain podman[284574]: 2025-12-02 09:48:42.5812966 +0000 UTC m=+0.197018436 container cleanup 7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 02 09:48:42 np0005541913.localdomain systemd[1]: libpod-conmon-7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0.scope: Deactivated successfully.
Dec 02 09:48:42 np0005541913.localdomain podman[284590]: 2025-12-02 09:48:42.663338574 +0000 UTC m=+0.076171871 container remove 7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.668 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[639379d6-f901-43fa-9586-c26625e173a1]: (4, ('Tue Dec  2 09:48:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8 (7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0)\n7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0\nTue Dec  2 09:48:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8 (7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0)\n7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0\n', 'time="2025-12-02T09:48:42Z" level=warning msg="StopSignal SIGTERM failed to stop container neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8 in 10 seconds, resorting to SIGKILL"\n', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.671 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[34ee909e-09fb-437b-9138-eba6356640a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.672 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap595e1c9b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:48:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:42.719 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:42 np0005541913.localdomain kernel: device tap595e1c9b-70 left promiscuous mode
Dec 02 09:48:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:42.732 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.735 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[f88c8799-0820-4a5b-abf4-90c06fbb73a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.753 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[eba3154b-9c17-49bb-9a14-d12b8833a0b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.754 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7f51dc-9761-4ec2-ab28-794b475bdb89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.770 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[0c15aafb-1c13-4c9c-b6b3-addea7869f3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647713, 'reachable_time': 41307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284614, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.789 160371 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.791 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3f1b00-ba00-4181-8cc3-351d508cfa5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.793 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4a318f6a-b3c1-4690-8246-f7d046ccd64a in datapath 595e1c9b-709c-41d2-9212-0b18b13291a8 unbound from our chassis
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.796 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 595e1c9b-709c-41d2-9212-0b18b13291a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.797 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[726a56ec-b375-4e1b-abf5-7de95fa1f034]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.799 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4a318f6a-b3c1-4690-8246-f7d046ccd64a in datapath 595e1c9b-709c-41d2-9212-0b18b13291a8 unbound from our chassis
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.801 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 595e1c9b-709c-41d2-9212-0b18b13291a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 09:48:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:48:42.802 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5dae17ac-18fc-4de3-96de-27fc7b3b009c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:48:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-9f61ced3f88a0be87d665800e8e7cb17559a616ee2c3a746c87a603ddb5549d7-merged.mount: Deactivated successfully.
Dec 02 09:48:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e98e9f24e52d3758fb7e26858bb0e13707794227fcf30ccf3a4aafe11bccfd0-userdata-shm.mount: Deactivated successfully.
Dec 02 09:48:43 np0005541913.localdomain systemd[1]: run-netns-ovnmeta\x2d595e1c9b\x2d709c\x2d41d2\x2d9212\x2d0b18b13291a8.mount: Deactivated successfully.
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.446 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.472 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.473 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.474 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.475 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.475 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.476 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.477 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.477 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.478 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.478 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.498 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.498 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.499 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.499 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.500 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:48:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:43.954 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.473 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.474 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.649 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.651 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12579MB free_disk=41.83708190917969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.651 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.652 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.839 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.840 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.840 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:48:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:44.879 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:48:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:45.328 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:48:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:45.335 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:48:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:45.554 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:48:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:45.576 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:48:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:45.577 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.925s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:48:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:48:46 np0005541913.localdomain podman[284660]: 2025-12-02 09:48:46.438718408 +0000 UTC m=+0.079322032 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:48:46 np0005541913.localdomain podman[284660]: 2025-12-02 09:48:46.452780869 +0000 UTC m=+0.093384463 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:48:46 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:48:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:46.610 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:47.092 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:47.192 281858 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764668912.1907604, b254bb7f-2891-4b37-9c44-9700e301ce16 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 09:48:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:47.192 281858 INFO nova.compute.manager [-] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] VM Stopped (Lifecycle Event)
Dec 02 09:48:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:47.211 281858 DEBUG nova.compute.manager [None req-cd504abc-ed71-4db4-ab92-243250a2a677 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:48:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:47.214 281858 DEBUG nova.compute.manager [None req-cd504abc-ed71-4db4-ab92-243250a2a677 - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 02 09:48:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30376 DF PROTO=TCP SPT=40134 DPT=9102 SEQ=1121546869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A689E40000000001030307) 
Dec 02 09:48:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:48:50 np0005541913.localdomain podman[284677]: 2025-12-02 09:48:50.451366079 +0000 UTC m=+0.088358561 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 09:48:50 np0005541913.localdomain podman[284677]: 2025-12-02 09:48:50.484912103 +0000 UTC m=+0.121904535 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:48:50 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:48:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:51.612 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.093 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.555 281858 DEBUG nova.compute.manager [None req-e509d44f-e02c-4355-b6e6-d768bc766666 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server [None req-e509d44f-e02c-4355-b6e6-d768bc766666 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance b254bb7f-2891-4b37-9c44-9700e301ce16 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     raise self.value
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     raise self.value
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance b254bb7f-2891-4b37-9c44-9700e301ce16 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Dec 02 09:48:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:52.575 281858 ERROR oslo_messaging.rpc.server 
Dec 02 09:48:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:48:55 np0005541913.localdomain podman[284695]: 2025-12-02 09:48:55.433525403 +0000 UTC m=+0.078066349 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 09:48:55 np0005541913.localdomain podman[284695]: 2025-12-02 09:48:55.451271161 +0000 UTC m=+0.095812057 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_id=edpm, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 09:48:55 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:48:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:56.615 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:57.131 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:48:57 np0005541913.localdomain podman[284715]: 2025-12-02 09:48:57.44292931 +0000 UTC m=+0.086943844 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:48:57 np0005541913.localdomain podman[284715]: 2025-12-02 09:48:57.479420512 +0000 UTC m=+0.123435056 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:48:57 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:48:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:58.441 281858 DEBUG nova.objects.instance [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lazy-loading 'flavor' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:48:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:58.458 281858 DEBUG oslo_concurrency.lockutils [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:48:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:58.459 281858 DEBUG oslo_concurrency.lockutils [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:48:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:58.459 281858 DEBUG nova.network.neutron [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 02 09:48:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:58.460 281858 DEBUG nova.objects.instance [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.518 281858 DEBUG nova.network.neutron [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.531 281858 DEBUG oslo_concurrency.lockutils [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.555 281858 INFO nova.virt.libvirt.driver [-] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Instance destroyed successfully.
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.555 281858 DEBUG nova.objects.instance [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lazy-loading 'numa_topology' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.567 281858 DEBUG nova.objects.instance [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lazy-loading 'resources' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.579 281858 DEBUG nova.virt.libvirt.vif [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-02T08:31:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005541913.localdomain',hostname='test',id=2,image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T08:31:55Z,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005541913.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e2d97696ab6749899bb8ba5ce29a3de2',ramdisk_id='',reservation_id='r-6ofcfgb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256
='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T09:48:32Z,user_data=None,user_id='cb8b7d2a63b642aa999db12e17eeb9e4',uuid=b254bb7f-2891-4b37-9c44-9700e301ce16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.579 281858 DEBUG nova.network.os_vif_util [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Converting VIF {"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.580 281858 DEBUG nova.network.os_vif_util [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.581 281858 DEBUG os_vif [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.584 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.584 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a318f6a-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.587 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.589 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.593 281858 INFO os_vif [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3')
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.596 281858 DEBUG nova.virt.libvirt.host [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.596 281858 INFO nova.virt.libvirt.host [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] UEFI support detected
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.604 281858 DEBUG nova.virt.libvirt.driver [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Start _get_guest_xml network_info=[{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'size': 0, 'encryption_options': None, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}], 'ephemerals': [{'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'size': 1, 'encryption_options': None, 'encrypted': False, 'device_name': '/dev/vdb', 'encryption_secret_uuid': None, 'disk_bus': 'virtio'}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.608 281858 WARNING nova.virt.libvirt.driver [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.611 281858 DEBUG nova.virt.libvirt.host [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Searching host: 'np0005541913.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.611 281858 DEBUG nova.virt.libvirt.host [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.613 281858 DEBUG nova.virt.libvirt.host [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Searching host: 'np0005541913.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.614 281858 DEBUG nova.virt.libvirt.host [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.615 281858 DEBUG nova.virt.libvirt.driver [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.615 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T08:30:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='45a99238-6f19-4f9e-be82-6ef3af1dcb31',id=2,is_public=True,memory_mb=512,name='m1.small',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.616 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.616 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.617 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.617 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.618 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.618 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.619 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.619 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.619 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.620 281858 DEBUG nova.virt.hardware [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.620 281858 DEBUG nova.objects.instance [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.636 281858 DEBUG nova.privsep.utils [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 02 09:48:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:48:59.637 281858 DEBUG oslo_concurrency.processutils [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.125 281858 DEBUG oslo_concurrency.processutils [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.126 281858 DEBUG oslo_concurrency.processutils [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.592 281858 DEBUG oslo_concurrency.processutils [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.594 281858 DEBUG nova.virt.libvirt.vif [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-02T08:31:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005541913.localdomain',hostname='test',id=2,image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T08:31:55Z,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005541913.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e2d97696ab6749899bb8ba5ce29a3de2',ramdisk_id='',reservation_id='r-6ofcfgb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256
='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T09:48:32Z,user_data=None,user_id='cb8b7d2a63b642aa999db12e17eeb9e4',uuid=b254bb7f-2891-4b37-9c44-9700e301ce16,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.594 281858 DEBUG nova.network.os_vif_util [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Converting VIF {"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.595 281858 DEBUG nova.network.os_vif_util [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.597 281858 DEBUG nova.objects.instance [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Lazy-loading 'pci_devices' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.611 281858 DEBUG nova.virt.libvirt.driver [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] End _get_guest_xml xml=<domain type="kvm">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <uuid>b254bb7f-2891-4b37-9c44-9700e301ce16</uuid>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <name>instance-00000002</name>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <memory>524288</memory>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <vcpu>1</vcpu>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <metadata>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <nova:name>test</nova:name>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <nova:creationTime>2025-12-02 09:48:59</nova:creationTime>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <nova:flavor name="m1.small">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <nova:memory>512</nova:memory>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <nova:disk>1</nova:disk>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <nova:swap>0</nova:swap>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <nova:ephemeral>1</nova:ephemeral>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <nova:vcpus>1</nova:vcpus>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       </nova:flavor>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <nova:owner>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <nova:user uuid="cb8b7d2a63b642aa999db12e17eeb9e4">admin</nova:user>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <nova:project uuid="e2d97696ab6749899bb8ba5ce29a3de2">admin</nova:project>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       </nova:owner>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <nova:root type="image" uuid="6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <nova:ports>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <nova:port uuid="4a318f6a-b3c1-4690-8246-f7d046ccd64a">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:           <nova:ip type="fixed" address="192.168.0.102" ipVersion="4"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         </nova:port>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       </nova:ports>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </nova:instance>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   </metadata>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <sysinfo type="smbios">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <system>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <entry name="manufacturer">RDO</entry>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <entry name="product">OpenStack Compute</entry>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <entry name="serial">b254bb7f-2891-4b37-9c44-9700e301ce16</entry>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <entry name="uuid">b254bb7f-2891-4b37-9c44-9700e301ce16</entry>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <entry name="family">Virtual Machine</entry>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </system>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   </sysinfo>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <os>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <type arch="x86_64" machine="pc-q35-rhel9.0.0">hvm</type>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <boot dev="hd"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <smbios mode="sysinfo"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   </os>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <features>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <acpi/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <apic/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <vmcoreinfo/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   </features>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <clock offset="utc">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <timer name="hpet" present="no"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   </clock>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <cpu mode="host-model" match="exact">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   </cpu>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   <devices>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <disk type="network" device="disk">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <driver type="raw" cache="none"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <source protocol="rbd" name="vms/b254bb7f-2891-4b37-9c44-9700e301ce16_disk">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.103" port="6789"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.105" port="6789"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.104" port="6789"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       </source>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <auth username="openstack">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <secret type="ceph" uuid="c7c8e171-a193-56fb-95fa-8879fcfa7074"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       </auth>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <target dev="vda" bus="virtio"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <disk type="network" device="disk">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <driver type="raw" cache="none"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <source protocol="rbd" name="vms/b254bb7f-2891-4b37-9c44-9700e301ce16_disk.eph0">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.103" port="6789"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.105" port="6789"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.104" port="6789"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       </source>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <auth username="openstack">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:         <secret type="ceph" uuid="c7c8e171-a193-56fb-95fa-8879fcfa7074"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       </auth>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <target dev="vdb" bus="virtio"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <interface type="ethernet">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <mac address="fa:16:3e:26:b2:03"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <model type="virtio"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <mtu size="1292"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <target dev="tap4a318f6a-b3"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </interface>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <serial type="pty">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <log file="/var/lib/nova/instances/b254bb7f-2891-4b37-9c44-9700e301ce16/console.log" append="off"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </serial>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <video>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <model type="virtio"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </video>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <input type="tablet" bus="usb"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <input type="keyboard" bus="usb"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <rng model="virtio">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <backend model="random">/dev/urandom</backend>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </rng>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <controller type="usb" index="0"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     <memballoon model="virtio">
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:       <stats period="10"/>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:     </memballoon>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:   </devices>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: </domain>
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.612 281858 DEBUG nova.virt.libvirt.driver [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.612 281858 DEBUG nova.virt.libvirt.driver [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.613 281858 DEBUG nova.virt.libvirt.vif [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-02T08:31:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005541913.localdomain',hostname='test',id=2,image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T08:31:55Z,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005541913.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='e2d97696ab6749899bb8ba5ce29a3de2',ramdisk_id='',reservation_id='r-6ofcfgb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T09:48:32Z,user_data=None,user_id='cb8b7d2a63b642aa999db12e17eeb9e4',uuid=b254bb7f-2891-4b37-9c44-9700e301ce16,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.613 281858 DEBUG nova.network.os_vif_util [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Converting VIF {"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.613 281858 DEBUG nova.network.os_vif_util [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.614 281858 DEBUG os_vif [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.614 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.615 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.615 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.617 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.617 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a318f6a-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.618 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a318f6a-b3, col_values=(('external_ids', {'iface-id': '4a318f6a-b3c1-4690-8246-f7d046ccd64a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:b2:03', 'vm-uuid': 'b254bb7f-2891-4b37-9c44-9700e301ce16'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.619 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.621 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.624 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.625 281858 INFO os_vif [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:03,bridge_name='br-int',has_traffic_filtering=True,id=4a318f6a-b3c1-4690-8246-f7d046ccd64a,network=Network(595e1c9b-709c-41d2-9212-0b18b13291a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a318f6a-b3')
Dec 02 09:49:00 np0005541913.localdomain systemd[1]: Started libvirt secret daemon.
Dec 02 09:49:00 np0005541913.localdomain kernel: device tap4a318f6a-b3 entered promiscuous mode
Dec 02 09:49:00 np0005541913.localdomain NetworkManager[5965]: <info>  [1764668940.7296] manager: (tap4a318f6a-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/16)
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.731 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:00Z|00067|binding|INFO|Claiming lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a for this chassis.
Dec 02 09:49:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:00Z|00068|binding|INFO|4a318f6a-b3c1-4690-8246-f7d046ccd64a: Claiming fa:16:3e:26:b2:03 192.168.0.102
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.736 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain systemd-udevd[284811]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.742 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:00Z|00069|ovn_bfd|INFO|Enabled BFD on interface ovn-be95dc-0
Dec 02 09:49:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:00Z|00070|ovn_bfd|INFO|Enabled BFD on interface ovn-2587fe-0
Dec 02 09:49:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:00Z|00071|ovn_bfd|INFO|Enabled BFD on interface ovn-4d166c-0
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.751 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.747 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b2:03 192.168.0.102'], port_security=['fa:16:3e:26:b2:03 192.168.0.102'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.102/24', 'neutron:device_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-595e1c9b-709c-41d2-9212-0b18b13291a8', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1725f11b-f73c-4c4f-b3d3-772d68fcc09e 23293c48-39ca-43a0-a462-ebc8626a7f6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23d69817-a35d-4528-880f-f329bfbd969c, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4a318f6a-b3c1-4690-8246-f7d046ccd64a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.749 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4a318f6a-b3c1-4690-8246-f7d046ccd64a in datapath 595e1c9b-709c-41d2-9212-0b18b13291a8 bound to our chassis
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.751 160221 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 595e1c9b-709c-41d2-9212-0b18b13291a8
Dec 02 09:49:00 np0005541913.localdomain NetworkManager[5965]: <info>  [1764668940.7554] device (tap4a318f6a-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 02 09:49:00 np0005541913.localdomain NetworkManager[5965]: <info>  [1764668940.7559] device (tap4a318f6a-b3): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.761 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[640f5e76-ce0a-454e-94c9-f7a1e0c19217]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.762 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap595e1c9b-71 in ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.764 160340 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap595e1c9b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.764 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[55b1b839-68f7-4b71-83e4-81738cb153e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.766 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[bed7f676-ce4b-4c7b-b633-bc4bd2b3416b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.776 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.779 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0aac6a-9db1-4c40-b9fd-23a8c7ae6dbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain systemd-machined[84262]: New machine qemu-2-instance-00000002.
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.795 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:00Z|00072|binding|INFO|Setting lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a ovn-installed in OVS
Dec 02 09:49:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:00Z|00073|binding|INFO|Setting lport 4a318f6a-b3c1-4690-8246-f7d046ccd64a up in Southbound
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.803 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.821 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[989fc9af-7c82-492b-a23c-ad91846466fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:00.832 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.851 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[123467d3-e9bf-47b5-a513-6ea458c6ea15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain systemd-udevd[284812]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.860 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5622ad-6a0f-4d6c-b01c-6a3f2a7a026e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain NetworkManager[5965]: <info>  [1764668940.8630] manager: (tap595e1c9b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/17)
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.891 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[379a235f-0d02-4c7e-a741-884babad1ea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.895 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[06cc937f-d64a-4a80-aca1-9be6bd2822c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap595e1c9b-71: link becomes ready
Dec 02 09:49:00 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap595e1c9b-70: link becomes ready
Dec 02 09:49:00 np0005541913.localdomain NetworkManager[5965]: <info>  [1764668940.9127] device (tap595e1c9b-70): carrier: link connected
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.915 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[8d8a5265-7337-4c36-a823-d26b81e98bdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.932 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[566e0857-e806-4b50-8ea9-40e7765328e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap595e1c9b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e8:5a:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1110306, 'reachable_time': 17832, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284849, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.948 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0efd23-c238-4ecb-b90b-f25c4c0c6f21]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:5a19'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1110306, 'tstamp': 1110306}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284857, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.964 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[69ff36fa-7278-4a50-8625-d9f6c0614621]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap595e1c9b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e8:5a:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1110306, 'reachable_time': 17832, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 284866, 'error': None, 'target': 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:00.994 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[7b4faabd-cac2-471b-b0d1-c2c3c2e1311a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.058 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6bcc03f3-8883-485c-800b-f2a001ea4f68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.060 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap595e1c9b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.061 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.061 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap595e1c9b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.063 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:01 np0005541913.localdomain kernel: device tap595e1c9b-70 entered promiscuous mode
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.068 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.070 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap595e1c9b-70, col_values=(('external_ids', {'iface-id': 'd6e7da3f-8574-49e0-8ba1-2f642b3cec92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.071 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:01Z|00074|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.080 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.081 160221 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/595e1c9b-709c-41d2-9212-0b18b13291a8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/595e1c9b-709c-41d2-9212-0b18b13291a8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.083 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[15eb07b8-fc6b-4faf-98b7-3541fe483d8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.085 160221 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: global
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     log         /dev/log local0 debug
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     log-tag     haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     user        root
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     group       root
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     maxconn     1024
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     pidfile     /var/lib/neutron/external/pids/595e1c9b-709c-41d2-9212-0b18b13291a8.pid.haproxy
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     daemon
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: defaults
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     log global
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     mode http
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     option httplog
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     option dontlognull
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     option http-server-close
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     option forwardfor
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     retries                 3
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-request    30s
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout connect         30s
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout client          32s
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout server          32s
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-keep-alive 30s
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: listen listener
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     bind 169.254.169.254:80
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:     http-request add-header X-OVN-Network-ID 595e1c9b-709c-41d2-9212-0b18b13291a8
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 02 09:49:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:01.088 160221 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8', 'env', 'PROCESS_TAG=haproxy-595e1c9b-709c-41d2-9212-0b18b13291a8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/595e1c9b-709c-41d2-9212-0b18b13291a8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.177 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764668941.1768134, b254bb7f-2891-4b37-9c44-9700e301ce16 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.178 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] VM Resumed (Lifecycle Event)
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.189 281858 DEBUG nova.compute.manager [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.208 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.214 281858 INFO nova.virt.libvirt.driver [-] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Instance rebooted successfully.
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.215 281858 DEBUG nova.compute.manager [None req-d355f445-1690-4948-9eb0-b65105b7c945 cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.216 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.261 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] During sync_power_state the instance has a pending task (powering-on). Skip.
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.262 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764668941.178452, b254bb7f-2891-4b37-9c44-9700e301ce16 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.262 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] VM Started (Lifecycle Event)
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.293 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.302 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.516 281858 DEBUG nova.compute.manager [req-445e87c6-82f2-445d-ae68-1a346e15005b req-888e82d1-4cf6-43ea-a253-3119ad0798f9 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Received event network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.518 281858 DEBUG oslo_concurrency.lockutils [req-445e87c6-82f2-445d-ae68-1a346e15005b req-888e82d1-4cf6-43ea-a253-3119ad0798f9 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.519 281858 DEBUG oslo_concurrency.lockutils [req-445e87c6-82f2-445d-ae68-1a346e15005b req-888e82d1-4cf6-43ea-a253-3119ad0798f9 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.520 281858 DEBUG oslo_concurrency.lockutils [req-445e87c6-82f2-445d-ae68-1a346e15005b req-888e82d1-4cf6-43ea-a253-3119ad0798f9 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.521 281858 DEBUG nova.compute.manager [req-445e87c6-82f2-445d-ae68-1a346e15005b req-888e82d1-4cf6-43ea-a253-3119ad0798f9 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] No waiting events found dispatching network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.521 281858 WARNING nova.compute.manager [req-445e87c6-82f2-445d-ae68-1a346e15005b req-888e82d1-4cf6-43ea-a253-3119ad0798f9 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Received unexpected event network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a for instance with vm_state active and task_state None.
Dec 02 09:49:01 np0005541913.localdomain podman[284926]: 
Dec 02 09:49:01 np0005541913.localdomain podman[284926]: 2025-12-02 09:49:01.57803463 +0000 UTC m=+0.105719869 container create 4bf88e5dd3d6887471d25c63df52897e585725af1f4acc121cd653bb392a20e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 02 09:49:01 np0005541913.localdomain systemd[1]: Started libpod-conmon-4bf88e5dd3d6887471d25c63df52897e585725af1f4acc121cd653bb392a20e9.scope.
Dec 02 09:49:01 np0005541913.localdomain podman[284926]: 2025-12-02 09:49:01.523200124 +0000 UTC m=+0.050885353 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 09:49:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:01Z|00075|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.650 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:01 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:49:01 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3a163f6cf84ad26a81b0d772b07c48e80c4153a70edcd04a5d728b961dd8a35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 09:49:01 np0005541913.localdomain podman[284926]: 2025-12-02 09:49:01.694984904 +0000 UTC m=+0.222670133 container init 4bf88e5dd3d6887471d25c63df52897e585725af1f4acc121cd653bb392a20e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 09:49:01 np0005541913.localdomain podman[284926]: 2025-12-02 09:49:01.708398517 +0000 UTC m=+0.236083746 container start 4bf88e5dd3d6887471d25c63df52897e585725af1f4acc121cd653bb392a20e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.712 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:01Z|00076|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:49:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:01Z|00077|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:49:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:01.738 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:01 np0005541913.localdomain neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8[284940]: [NOTICE]   (284944) : New worker (284946) forked
Dec 02 09:49:01 np0005541913.localdomain neutron-haproxy-ovnmeta-595e1c9b-709c-41d2-9212-0b18b13291a8[284940]: [NOTICE]   (284944) : Loading success.
Dec 02 09:49:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:02Z|00078|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:49:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:02.607 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:02Z|00079|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 09:49:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:02.734 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:02 np0005541913.localdomain snmpd[69635]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB.
Dec 02 09:49:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:03.033 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:49:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:03.034 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:49:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:03.034 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:49:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:03.555 281858 DEBUG nova.compute.manager [req-c342436f-186e-40d9-8d18-cccac88faf73 req-5e3a6970-2563-45d0-ac97-f54bb8a5ce1b dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Received event network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 09:49:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:03.555 281858 DEBUG oslo_concurrency.lockutils [req-c342436f-186e-40d9-8d18-cccac88faf73 req-5e3a6970-2563-45d0-ac97-f54bb8a5ce1b dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:49:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:03.556 281858 DEBUG oslo_concurrency.lockutils [req-c342436f-186e-40d9-8d18-cccac88faf73 req-5e3a6970-2563-45d0-ac97-f54bb8a5ce1b dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:49:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:03.556 281858 DEBUG oslo_concurrency.lockutils [req-c342436f-186e-40d9-8d18-cccac88faf73 req-5e3a6970-2563-45d0-ac97-f54bb8a5ce1b dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:49:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:03.556 281858 DEBUG nova.compute.manager [req-c342436f-186e-40d9-8d18-cccac88faf73 req-5e3a6970-2563-45d0-ac97-f54bb8a5ce1b dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] No waiting events found dispatching network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 09:49:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:03.556 281858 WARNING nova.compute.manager [req-c342436f-186e-40d9-8d18-cccac88faf73 req-5e3a6970-2563-45d0-ac97-f54bb8a5ce1b dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Received unexpected event network-vif-plugged-4a318f6a-b3c1-4690-8246-f7d046ccd64a for instance with vm_state active and task_state None.
Dec 02 09:49:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48716 DF PROTO=TCP SPT=45224 DPT=9102 SEQ=2646252626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A6C3BD0000000001030307) 
Dec 02 09:49:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:49:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:49:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:49:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:49:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:49:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:49:04 np0005541913.localdomain podman[284955]: 2025-12-02 09:49:04.494636769 +0000 UTC m=+0.120303053 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:49:04 np0005541913.localdomain podman[284955]: 2025-12-02 09:49:04.512212832 +0000 UTC m=+0.137879146 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:49:04 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:49:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48717 DF PROTO=TCP SPT=45224 DPT=9102 SEQ=2646252626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A6C7E40000000001030307) 
Dec 02 09:49:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30377 DF PROTO=TCP SPT=40134 DPT=9102 SEQ=1121546869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A6C9E40000000001030307) 
Dec 02 09:49:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:05.619 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:49:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:49:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:49:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147735 "" "Go-http-client/1.1"
Dec 02 09:49:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:49:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17216 "" "Go-http-client/1.1"
Dec 02 09:49:06 np0005541913.localdomain sudo[284974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:49:06 np0005541913.localdomain sudo[284974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:49:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:49:06 np0005541913.localdomain sudo[284974]: pam_unix(sudo:session): session closed for user root
Dec 02 09:49:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:49:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:06.685 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:06 np0005541913.localdomain sudo[285003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:49:06 np0005541913.localdomain sudo[285003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:49:06 np0005541913.localdomain podman[284993]: 2025-12-02 09:49:06.742424522 +0000 UTC m=+0.092550962 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 02 09:49:06 np0005541913.localdomain podman[284993]: 2025-12-02 09:49:06.797520345 +0000 UTC m=+0.147646785 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 09:49:06 np0005541913.localdomain systemd[1]: tmp-crun.zU9Ivg.mount: Deactivated successfully.
Dec 02 09:49:06 np0005541913.localdomain podman[284992]: 2025-12-02 09:49:06.807677952 +0000 UTC m=+0.157662628 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:49:06 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:49:06 np0005541913.localdomain podman[284992]: 2025-12-02 09:49:06.822917294 +0000 UTC m=+0.172901940 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:49:06 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:49:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48718 DF PROTO=TCP SPT=45224 DPT=9102 SEQ=2646252626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A6CFE40000000001030307) 
Dec 02 09:49:07 np0005541913.localdomain sshd[285074]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:49:07 np0005541913.localdomain sudo[285003]: pam_unix(sudo:session): session closed for user root
Dec 02 09:49:07 np0005541913.localdomain sshd[285074]: error: kex_exchange_identification: Connection closed by remote host
Dec 02 09:49:07 np0005541913.localdomain sshd[285074]: Connection closed by 103.42.181.150 port 45702
Dec 02 09:49:07 np0005541913.localdomain sudo[285089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:49:07 np0005541913.localdomain sudo[285089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:49:07 np0005541913.localdomain sudo[285089]: pam_unix(sudo:session): session closed for user root
Dec 02 09:49:08 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22746 DF PROTO=TCP SPT=49628 DPT=9102 SEQ=420584671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A6D3E40000000001030307) 
Dec 02 09:49:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:10.621 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48719 DF PROTO=TCP SPT=45224 DPT=9102 SEQ=2646252626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A6DFA40000000001030307) 
Dec 02 09:49:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:11.730 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:15 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:15Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:26:b2:03 192.168.0.102
Dec 02 09:49:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:15.623 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:16.775 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:49:17 np0005541913.localdomain podman[285108]: 2025-12-02 09:49:17.461595069 +0000 UTC m=+0.099402701 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 02 09:49:17 np0005541913.localdomain podman[285108]: 2025-12-02 09:49:17.470638218 +0000 UTC m=+0.108445840 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:49:17 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:49:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48720 DF PROTO=TCP SPT=45224 DPT=9102 SEQ=2646252626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A6FFE50000000001030307) 
Dec 02 09:49:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:20.190 281858 DEBUG nova.compute.manager [None req-25b1fd2e-b51e-4a48-93f6-5c9e237f556a cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 09:49:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:20.196 281858 INFO nova.compute.manager [None req-25b1fd2e-b51e-4a48-93f6-5c9e237f556a cb8b7d2a63b642aa999db12e17eeb9e4 e2d97696ab6749899bb8ba5ce29a3de2 - - default default] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Retrieving diagnostics
Dec 02 09:49:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:20.640 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:21.203 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:21.205 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:21 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:49:21 np0005541913.localdomain podman[285127]: 2025-12-02 09:49:21.451093881 +0000 UTC m=+0.088853644 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 09:49:21 np0005541913.localdomain podman[285127]: 2025-12-02 09:49:21.455735793 +0000 UTC m=+0.093495506 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 02 09:49:21 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:49:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:21.810 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.374 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.375 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.1700296
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35664 [02/Dec/2025:09:49:21.201] listener listener/metadata 0/0/0/1173/1173 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.392 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.393 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35670 [02/Dec/2025:09:49:22.391] listener listener/metadata 0/0/0/21/21 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.413 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404  len: 297 time: 0.0200615
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.423 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.423 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.437 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35676 [02/Dec/2025:09:49:22.422] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.438 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0144057
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.443 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.443 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.456 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.456 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0131860
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35690 [02/Dec/2025:09:49:22.442] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.462 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.462 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.476 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35702 [02/Dec/2025:09:49:22.461] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.476 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 143 time: 0.0137148
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.481 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.482 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.493 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.494 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 149 time: 0.0123684
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35714 [02/Dec/2025:09:49:22.480] listener listener/metadata 0/0/0/13/13 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.499 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.499 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.511 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35724 [02/Dec/2025:09:49:22.498] listener listener/metadata 0/0/0/13/13 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.512 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 150 time: 0.0127134
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.516 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.517 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.528 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35726 [02/Dec/2025:09:49:22.516] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.529 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.0117881
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.538 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.539 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.552 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35738 [02/Dec/2025:09:49:22.538] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.552 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.0132070
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.560 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.561 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35754 [02/Dec/2025:09:49:22.560] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.574 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0129514
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.590 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.591 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.606 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.607 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 155 time: 0.0158966
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35760 [02/Dec/2025:09:49:22.590] listener listener/metadata 0/0/0/16/16 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.614 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.615 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.629 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.630 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0144751
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35766 [02/Dec/2025:09:49:22.613] listener listener/metadata 0/0/0/17/17 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.636 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.637 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.651 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35774 [02/Dec/2025:09:49:22.636] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.651 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200  len: 143 time: 0.0138040
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.657 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.658 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.670 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35778 [02/Dec/2025:09:49:22.656] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.670 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0127642
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.677 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.678 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.691 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35790 [02/Dec/2025:09:49:22.677] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.692 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.0140796
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.699 160335 DEBUG eventlet.wsgi.server [-] (160335) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.700 160335 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Accept: */*
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Connection: close
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Content-Type: text/plain
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: Host: 169.254.169.254
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: User-Agent: curl/7.84.0
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Forwarded-For: 192.168.0.102
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: X-Ovn-Network-Id: 595e1c9b-709c-41d2-9212-0b18b13291a8 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.712 160335 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 02 09:49:22 np0005541913.localdomain haproxy-metadata-proxy-595e1c9b-709c-41d2-9212-0b18b13291a8[284946]: 192.168.0.102:35798 [02/Dec/2025:09:49:22.698] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Dec 02 09:49:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:49:22.713 160335 INFO eventlet.wsgi.server [-] 192.168.0.102,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0132546
Dec 02 09:49:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:25.673 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:49:26 np0005541913.localdomain podman[285146]: 2025-12-02 09:49:26.455128864 +0000 UTC m=+0.089613224 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git)
Dec 02 09:49:26 np0005541913.localdomain podman[285146]: 2025-12-02 09:49:26.472233085 +0000 UTC m=+0.106717425 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:49:26 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:49:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:26.847 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:49:28 np0005541913.localdomain systemd[1]: tmp-crun.D9gWTO.mount: Deactivated successfully.
Dec 02 09:49:28 np0005541913.localdomain podman[285166]: 2025-12-02 09:49:28.444523343 +0000 UTC m=+0.084357825 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:49:28 np0005541913.localdomain podman[285166]: 2025-12-02 09:49:28.45428545 +0000 UTC m=+0.094119932 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:49:28 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:49:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:30.718 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:30 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T09:49:30Z|00080|memory_trim|INFO|Detected inactivity (last active 30026 ms ago): trimming memory
Dec 02 09:49:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:31.850 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26550 DF PROTO=TCP SPT=39952 DPT=9102 SEQ=1568618815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A738EE0000000001030307) 
Dec 02 09:49:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:49:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:49:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:49:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:49:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:49:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:49:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:49:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:49:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26551 DF PROTO=TCP SPT=39952 DPT=9102 SEQ=1568618815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A73CE50000000001030307) 
Dec 02 09:49:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:49:35 np0005541913.localdomain podman[285189]: 2025-12-02 09:49:35.454590955 +0000 UTC m=+0.094085282 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 09:49:35 np0005541913.localdomain podman[285189]: 2025-12-02 09:49:35.471243964 +0000 UTC m=+0.110738221 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:49:35 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:49:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48721 DF PROTO=TCP SPT=45224 DPT=9102 SEQ=2646252626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A73FE40000000001030307) 
Dec 02 09:49:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:35.768 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:49:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:49:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:49:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147735 "" "Go-http-client/1.1"
Dec 02 09:49:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:49:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1"
Dec 02 09:49:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:36.854 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:36 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26552 DF PROTO=TCP SPT=39952 DPT=9102 SEQ=1568618815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A744E40000000001030307) 
Dec 02 09:49:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:49:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:49:37 np0005541913.localdomain systemd[1]: tmp-crun.s4UtcN.mount: Deactivated successfully.
Dec 02 09:49:37 np0005541913.localdomain podman[285209]: 2025-12-02 09:49:37.457790707 +0000 UTC m=+0.092681275 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:49:37 np0005541913.localdomain podman[285209]: 2025-12-02 09:49:37.463624591 +0000 UTC m=+0.098515159 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:49:37 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:49:37 np0005541913.localdomain podman[285210]: 2025-12-02 09:49:37.544564095 +0000 UTC m=+0.173867725 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125)
Dec 02 09:49:37 np0005541913.localdomain podman[285210]: 2025-12-02 09:49:37.583089322 +0000 UTC m=+0.212393022 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:49:37 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:49:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30378 DF PROTO=TCP SPT=40134 DPT=9102 SEQ=1121546869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A747E40000000001030307) 
Dec 02 09:49:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:40.771 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26553 DF PROTO=TCP SPT=39952 DPT=9102 SEQ=1568618815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A754A40000000001030307) 
Dec 02 09:49:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:41.893 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.523 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.524 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.552 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.553 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.554 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.773 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.787 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.788 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.788 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:49:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:45.788 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.136 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.159 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.160 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.161 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.161 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.161 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.162 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.162 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.163 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.163 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.164 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.182 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.183 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.183 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.184 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.184 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.622 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.695 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.695 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.947 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.986 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.988 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12288MB free_disk=41.8370246887207GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.989 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:49:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:46.990 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.139 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.139 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.140 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.177 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.665 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.670 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.685 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.706 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:49:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:47.706 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:49:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:49:48 np0005541913.localdomain podman[285300]: 2025-12-02 09:49:48.44258672 +0000 UTC m=+0.085824274 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 09:49:48 np0005541913.localdomain podman[285300]: 2025-12-02 09:49:48.456582139 +0000 UTC m=+0.099819703 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 09:49:48 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:49:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 09:49:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 09:49:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26554 DF PROTO=TCP SPT=39952 DPT=9102 SEQ=1568618815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A775E40000000001030307) 
Dec 02 09:49:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:50.778 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:51.993 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:49:52 np0005541913.localdomain systemd[1]: tmp-crun.74YMdT.mount: Deactivated successfully.
Dec 02 09:49:52 np0005541913.localdomain podman[285319]: 2025-12-02 09:49:52.420134715 +0000 UTC m=+0.066431912 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:49:52 np0005541913.localdomain podman[285319]: 2025-12-02 09:49:52.429061232 +0000 UTC m=+0.075358449 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 09:49:52 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:49:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:55.810 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:49:57.031 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:49:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:49:57 np0005541913.localdomain podman[285338]: 2025-12-02 09:49:57.561704107 +0000 UTC m=+0.195776284 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, version=9.6)
Dec 02 09:49:57 np0005541913.localdomain podman[285338]: 2025-12-02 09:49:57.579091885 +0000 UTC m=+0.213164032 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Dec 02 09:49:57 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:49:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:49:59 np0005541913.localdomain systemd[1]: tmp-crun.E7kisL.mount: Deactivated successfully.
Dec 02 09:49:59 np0005541913.localdomain podman[285358]: 2025-12-02 09:49:59.452079945 +0000 UTC m=+0.088214818 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:49:59 np0005541913.localdomain podman[285358]: 2025-12-02 09:49:59.467391288 +0000 UTC m=+0.103526151 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:49:59 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:50:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:00.851 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:02.074 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:50:03.034 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:50:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:50:03.034 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:50:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:50:03.035 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:50:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21187 DF PROTO=TCP SPT=46744 DPT=9102 SEQ=2091678735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A7AE1E0000000001030307) 
Dec 02 09:50:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:50:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:50:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:50:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:50:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:50:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:50:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:50:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21188 DF PROTO=TCP SPT=46744 DPT=9102 SEQ=2091678735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A7B2240000000001030307) 
Dec 02 09:50:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:05.873 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26555 DF PROTO=TCP SPT=39952 DPT=9102 SEQ=1568618815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A7B5E40000000001030307) 
Dec 02 09:50:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:50:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:50:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:50:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147735 "" "Go-http-client/1.1"
Dec 02 09:50:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:50:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Dec 02 09:50:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:50:06 np0005541913.localdomain systemd[1]: tmp-crun.vuUnHn.mount: Deactivated successfully.
Dec 02 09:50:06 np0005541913.localdomain podman[285382]: 2025-12-02 09:50:06.439856019 +0000 UTC m=+0.081891791 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 09:50:06 np0005541913.localdomain podman[285382]: 2025-12-02 09:50:06.450045288 +0000 UTC m=+0.092081020 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 02 09:50:06 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:50:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21189 DF PROTO=TCP SPT=46744 DPT=9102 SEQ=2091678735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A7BA250000000001030307) 
Dec 02 09:50:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:07.077 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48722 DF PROTO=TCP SPT=45224 DPT=9102 SEQ=2646252626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A7BDE40000000001030307) 
Dec 02 09:50:08 np0005541913.localdomain sudo[285401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:50:08 np0005541913.localdomain sudo[285401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:50:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:50:08 np0005541913.localdomain sudo[285401]: pam_unix(sudo:session): session closed for user root
Dec 02 09:50:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:50:08 np0005541913.localdomain sudo[285421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:50:08 np0005541913.localdomain sudo[285421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:50:08 np0005541913.localdomain podman[285420]: 2025-12-02 09:50:08.319856653 +0000 UTC m=+0.100135962 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec 02 09:50:08 np0005541913.localdomain podman[285420]: 2025-12-02 09:50:08.364046368 +0000 UTC m=+0.144325707 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 02 09:50:08 np0005541913.localdomain podman[285419]: 2025-12-02 09:50:08.373832896 +0000 UTC m=+0.155926872 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:50:08 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:50:08 np0005541913.localdomain podman[285419]: 2025-12-02 09:50:08.385100283 +0000 UTC m=+0.167194239 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:50:08 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:50:08 np0005541913.localdomain sudo[285421]: pam_unix(sudo:session): session closed for user root
Dec 02 09:50:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:10.918 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21190 DF PROTO=TCP SPT=46744 DPT=9102 SEQ=2091678735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A7C9E40000000001030307) 
Dec 02 09:50:11 np0005541913.localdomain sudo[285517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:50:11 np0005541913.localdomain sudo[285517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:50:11 np0005541913.localdomain sudo[285517]: pam_unix(sudo:session): session closed for user root
Dec 02 09:50:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:12.118 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:15.922 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.104 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.116 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.117 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0e54fa1-2585-4df7-8756-503f49e86e09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.105712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e05dfb0-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.324808052, 'message_signature': '554add937cec48679c5210ee00a88a02d99aa5e5c0c4bb0a2307300d99b1b1c1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.105712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e05ef28-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.324808052, 'message_signature': '0052c2ead7341547bb420e44b0a71f3b455237f1ade2ed02d86d2f46c8db2db2'}]}, 'timestamp': '2025-12-02 09:50:16.117832', '_unique_id': '545bc74cfc0f442d847722eb71b4ed24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.122 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9f7aee5-57a2-45a2-b96d-20fee887e2f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.120003', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e06c858-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': '3506b02b9e0b87b6a8fbf1beac3551bf5fa1196ba7b08bf5f571eec6ae37c546'}]}, 'timestamp': '2025-12-02 09:50:16.123368', '_unique_id': 'bb2d99ee08b041a1b1fe3eee0fd2ffb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.125 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38583897-022c-4437-8d4e-f9206870bae7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.125057', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e07175e-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': 'dfd61991f10811d564a0bf0827c3a60b28418b8952266e521f95c22d8b89a981'}]}, 'timestamp': '2025-12-02 09:50:16.125413', '_unique_id': '93672f6d38ba49c0ac3436c60285422c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.126 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0c3a600-5649-46cb-abd7-17132668d8c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.126857', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e075d5e-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': '82d5010a87b1ce48ecb672736ebcc8fda122b3358cdb4874f161a609a5a06c6b'}]}, 'timestamp': '2025-12-02 09:50:16.127163', '_unique_id': '7d04b3b5a8214b3295c08692127a24a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.128 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.150 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b553321b-994c-4d0b-b506-94084b79e17a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:50:16.128522', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4e0af450-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.369247024, 'message_signature': '6a482c5dbe451f6fbb43c554fe1eed163223c8af3db1c6d8291cde71812c7e4e'}]}, 'timestamp': '2025-12-02 09:50:16.150735', '_unique_id': '14f4765dc76846679ead70e5a5ed2b79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.152 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df94767f-1f7f-4bf1-96ce-4a5d7a84f5f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.152341', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e0b40f4-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': '2961f6f64439e80f50d6d8dc670d74e62862b5b546bdf06cc23d60c296b758b8'}]}, 'timestamp': '2025-12-02 09:50:16.152675', '_unique_id': '085d503e765948efb50fe07479388630'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.154 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.154 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '727ef592-a4d6-4b3a-8fc7-4a5981de15a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.154170', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e0b8820-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.324808052, 'message_signature': '5bef71a7c4a4af894ac6b499154c9971d991b55a12cec74b302d9d493add8f75'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.154170', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e0b92ac-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.324808052, 'message_signature': '0b3a7c96045d10a4e83302138583f3ebe3c0bd0a24558e7459d777ce7f260eb5'}]}, 'timestamp': '2025-12-02 09:50:16.154745', '_unique_id': 'b8c49f74db3748709e0fba383cd5367c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.156 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24586927-4b87-4496-be96-cae5b6984a2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.156106', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e0bd438-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': 'd72e6f296c29b5b9abb19ad8cd9ba208e18d2d2456cd7a8648abdc53ac0435da'}]}, 'timestamp': '2025-12-02 09:50:16.156415', '_unique_id': '2bb30df039e14f48beb72f9196d93e88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.180 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.180 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efc3b22b-dae9-46d6-bcc1-251500a91672', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.157791', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e0f8a38-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': 'a160e54033ec7a5f3a530a1a85705f84568a4cb6c15d1b2c058c0f630d1b800c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.157791', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e0f99c4-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': '29fb25f813bf8ae748c078033b9d84f158b3b5d1e182419ace792391831f73ab'}]}, 'timestamp': '2025-12-02 09:50:16.181135', '_unique_id': 'e08fbd9849af49e1bd953431ba597ec4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.182 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.183 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0a654c8-bf9c-4047-ae15-76dab10310ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.183310', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e0ffd74-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': '6c554ecd1d8d7b3d9774618e8d764c9c044d2503706c3bc55057318b00cf5a20'}]}, 'timestamp': '2025-12-02 09:50:16.183743', '_unique_id': '33dc5e51a53142b299dac81e5519b9e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.184 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.185 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.185 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90fd18ed-6e35-42b2-a5c4-ec4b09d1f894', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.185349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e104a2c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': 'b61f190199bec55615933002155256bc452b24645d08e590ea0d5de98889ebed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.185349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e10562a-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': '9eb7cfd0c498e6f7c4e49a54d293ea828fe5b665b99f040adf638121b4756d72'}]}, 'timestamp': '2025-12-02 09:50:16.186352', '_unique_id': '05ce8171ee084e4b9eed28035b5c4e76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83751ef5-5823-4120-b6d8-370a90048290', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.187988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e10b14c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.324808052, 'message_signature': 'e6804f548cbfb50295fe3852ade25577ee10428a4482fcd59b5aead4e1b5b5ef'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.187988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e10bbd8-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.324808052, 'message_signature': '766f5a8d1dd3b56bea829ec34cc5bf62cc80fee3d591a2db1bce8e2d56af8c10'}]}, 'timestamp': '2025-12-02 09:50:16.188543', '_unique_id': 'f67c4e9afee243e48aecf31358d393c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.190 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.190 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2d6c2a0-78f2-48e5-bd98-9da8240c9528', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.190145', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e11061a-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': 'f82704d6e0efe447354a1c833b9b88ccdb4b4c9dd8a4883bdc20d8b2f8df7e1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.190145', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e11136c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': 'f58c32fe3fd063776e44b74af80a5ec80e08e8d0a0134231770c276fe56b329a'}]}, 'timestamp': '2025-12-02 09:50:16.190811', '_unique_id': '3332c925e9fe45de89bcbfc92e1c438f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.192 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dc53f05-e361-45ce-8ad5-cc54bf1b24c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.192571', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e11668c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': '43d133f7d18c851f0f7a6aef8e1f487cb55cb4b06627655022ee63b4db76beee'}]}, 'timestamp': '2025-12-02 09:50:16.192970', '_unique_id': '8e47daaa6f5543c28e044f155a99ea14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a49c6580-4c4c-40bd-8377-1a559a5706c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.194441', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e11ad40-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': '9af88ed2dedf63546ec98d3ad2272385ffb3abce99c7e0f1743100e0e5d84c08'}]}, 'timestamp': '2025-12-02 09:50:16.194766', '_unique_id': '92d4e8a5b8444ddfa4347fce1000085a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.196 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '599a3987-ce19-4311-9219-f29dbcb95a81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.196319', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e11f818-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': '6e7837fc81d6b730675815f50bc05b87faa8f7eedf88ff2ce60ce3a3e3d3e847'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.196319', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e120696-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': '3a64092aa32df51b3cee55852bdf17f006732988c26aa4af30b07c78b9e2e805'}]}, 'timestamp': '2025-12-02 09:50:16.197021', '_unique_id': '5fd943736ff94de8b1ec26fc388a15fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.198 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.198 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 12100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3c427cb-66b0-4efd-aa9c-68e071b252dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12100000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:50:16.198602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4e125380-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.369247024, 'message_signature': '93c966c4125a60a6d923dc094bd9db27926118452c9668b34b94dbb27463075e'}]}, 'timestamp': '2025-12-02 09:50:16.199069', '_unique_id': '582e60739c6f4a47aca2873a38d9dc0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.200 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.201 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0be7df4-d4bf-4ca3-a2cb-1afdb5cb64e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.200823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e12a678-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': '0fa1d3652a942d89ade56b1cc8ccf486bae8e674526e356a60fe8b4fb8888672'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.200823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e12b2da-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': '2b2d6b132ce99ca0356570c4b8f9e76c25225fbcf14ccd58a06f4e73566cb194'}]}, 'timestamp': '2025-12-02 09:50:16.201472', '_unique_id': '59c208a47e874865b08c29cd26203a1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '110ad6c6-fd9c-4677-8dbb-67afe2881a23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:50:16.203293', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e13071c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': 'b0486c27e70dcc768611637ba8286d8521743131a16e58afa30dc6193ff2f6bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:50:16.203293', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e131392-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.376846415, 'message_signature': '36207a093461f7cbc2c991684a6f2a11c2beccc2880a9059af52f8d73e5cdc0d'}]}, 'timestamp': '2025-12-02 09:50:16.203900', '_unique_id': '683009b601ea4b7d8597485a16b691f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.205 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5458f3d4-225f-477e-a45a-65375a9d1af6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.205403', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e13599c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': 'e7b52cbbf6a2066d8e7161551c126641c606caa42b3af5d7191a5b4c8ec2eafb'}]}, 'timestamp': '2025-12-02 09:50:16.205745', '_unique_id': '95e588b7447a4c2facccc8d22567e94c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.207 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '786dc63b-ef01-4e96-a1bb-afd3339dc25f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:50:16.207221', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4e13a122-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11178.339074939, 'message_signature': 'a14f35e4c35dea2ec85b79fa205f4e9c04e3470086a7b8b54179d8d41f5cbd65'}]}, 'timestamp': '2025-12-02 09:50:16.207586', '_unique_id': '67459c88f3fa4e6a994a71ca34a283af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:50:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:50:16.209 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:50:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:17.164 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:19 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21191 DF PROTO=TCP SPT=46744 DPT=9102 SEQ=2091678735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A7E9E50000000001030307) 
Dec 02 09:50:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:50:19 np0005541913.localdomain podman[285535]: 2025-12-02 09:50:19.55469616 +0000 UTC m=+0.190096615 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 09:50:19 np0005541913.localdomain podman[285535]: 2025-12-02 09:50:19.565201366 +0000 UTC m=+0.200601871 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:50:19 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:50:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:20.926 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:22.195 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:50:23 np0005541913.localdomain podman[285554]: 2025-12-02 09:50:23.452731787 +0000 UTC m=+0.086332317 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 09:50:23 np0005541913.localdomain podman[285554]: 2025-12-02 09:50:23.46116107 +0000 UTC m=+0.094761520 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Dec 02 09:50:23 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:50:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:25.979 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:27.225 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:50:28 np0005541913.localdomain podman[285574]: 2025-12-02 09:50:28.439543836 +0000 UTC m=+0.078199883 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 09:50:28 np0005541913.localdomain podman[285574]: 2025-12-02 09:50:28.454600004 +0000 UTC m=+0.093256001 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal)
Dec 02 09:50:28 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:50:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:50:30 np0005541913.localdomain podman[285594]: 2025-12-02 09:50:30.437567583 +0000 UTC m=+0.081445928 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:50:30 np0005541913.localdomain podman[285594]: 2025-12-02 09:50:30.514146533 +0000 UTC m=+0.158024828 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:50:30 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:50:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:31.016 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:32.256 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:33 np0005541913.localdomain sshd[285617]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:50:33 np0005541913.localdomain sshd[285617]: Accepted publickey for zuul from 38.102.83.114 port 45662 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:50:33 np0005541913.localdomain systemd-logind[757]: New session 62 of user zuul.
Dec 02 09:50:33 np0005541913.localdomain systemd[1]: Started Session 62 of User zuul.
Dec 02 09:50:33 np0005541913.localdomain sshd[285617]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 09:50:33 np0005541913.localdomain sudo[285637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-louabuokmopstaoqwwgfowlwrztmkvgg ; /usr/bin/python3
Dec 02 09:50:33 np0005541913.localdomain sudo[285637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 09:50:33 np0005541913.localdomain python3[285639]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:50:33 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49892 DF PROTO=TCP SPT=50452 DPT=9102 SEQ=637600805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A8234E0000000001030307) 
Dec 02 09:50:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:50:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:50:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:50:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:50:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:50:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:50:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:50:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:50:34 np0005541913.localdomain subscription-manager[285640]: Unregistered machine with identity: d1b4d74d-2a0e-41d6-a299-a10b4d7396a9
Dec 02 09:50:34 np0005541913.localdomain sudo[285637]: pam_unix(sudo:session): session closed for user root
Dec 02 09:50:34 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49893 DF PROTO=TCP SPT=50452 DPT=9102 SEQ=637600805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A827640000000001030307) 
Dec 02 09:50:35 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21192 DF PROTO=TCP SPT=46744 DPT=9102 SEQ=2091678735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A829E50000000001030307) 
Dec 02 09:50:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:36.019 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:50:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:50:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:50:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147735 "" "Go-http-client/1.1"
Dec 02 09:50:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:50:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17230 "" "Go-http-client/1.1"
Dec 02 09:50:37 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49894 DF PROTO=TCP SPT=50452 DPT=9102 SEQ=637600805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A82F640000000001030307) 
Dec 02 09:50:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:37.296 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:50:37 np0005541913.localdomain podman[285642]: 2025-12-02 09:50:37.458220424 +0000 UTC m=+0.095994023 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Dec 02 09:50:37 np0005541913.localdomain podman[285642]: 2025-12-02 09:50:37.50129442 +0000 UTC m=+0.139068019 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:50:37 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:50:38 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26556 DF PROTO=TCP SPT=39952 DPT=9102 SEQ=1568618815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A833E50000000001030307) 
Dec 02 09:50:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:50:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:50:39 np0005541913.localdomain podman[285662]: 2025-12-02 09:50:39.189201219 +0000 UTC m=+0.099811943 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:50:39 np0005541913.localdomain podman[285661]: 2025-12-02 09:50:39.160994815 +0000 UTC m=+0.081268854 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:50:39 np0005541913.localdomain podman[285662]: 2025-12-02 09:50:39.235430678 +0000 UTC m=+0.146041372 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 09:50:39 np0005541913.localdomain podman[285661]: 2025-12-02 09:50:39.241850597 +0000 UTC m=+0.162124646 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:50:39 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:50:39 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:50:41 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49895 DF PROTO=TCP SPT=50452 DPT=9102 SEQ=637600805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A83F240000000001030307) 
Dec 02 09:50:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:41.066 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:42.339 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:46.113 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.344 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.708 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.709 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.709 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.709 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.770 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.770 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.771 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:50:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:47.771 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.528 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.546 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.546 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.547 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.547 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.548 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.548 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.549 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.549 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.550 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.550 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.572 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.572 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.573 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.573 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:50:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:48.574 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.011 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.072 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.073 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.274 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.278 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12293MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.278 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.279 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.346 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.346 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.347 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.387 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:50:49 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49896 DF PROTO=TCP SPT=50452 DPT=9102 SEQ=637600805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A85FE40000000001030307) 
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.791 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.797 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.820 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.822 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:50:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:49.822 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:50:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:50:50 np0005541913.localdomain podman[285751]: 2025-12-02 09:50:50.448116636 +0000 UTC m=+0.084130364 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 09:50:50 np0005541913.localdomain podman[285751]: 2025-12-02 09:50:50.461652867 +0000 UTC m=+0.097666575 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:50:50 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:50:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:51.169 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:52.371 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:50:54 np0005541913.localdomain podman[285771]: 2025-12-02 09:50:54.442493802 +0000 UTC m=+0.077841053 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 09:50:54 np0005541913.localdomain podman[285771]: 2025-12-02 09:50:54.449321705 +0000 UTC m=+0.084668946 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:50:54 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:50:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:56.217 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:50:57.403 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:50:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:50:59 np0005541913.localdomain podman[285789]: 2025-12-02 09:50:59.463210642 +0000 UTC m=+0.101882818 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Dec 02 09:50:59 np0005541913.localdomain podman[285789]: 2025-12-02 09:50:59.483454044 +0000 UTC m=+0.122126210 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, managed_by=edpm_ansible, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:50:59 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:51:00 np0005541913.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 02 09:51:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:51:00 np0005541913.localdomain systemd[1]: tmp-crun.m1WLTI.mount: Deactivated successfully.
Dec 02 09:51:00 np0005541913.localdomain podman[285811]: 2025-12-02 09:51:00.929727599 +0000 UTC m=+0.102319199 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:51:00 np0005541913.localdomain podman[285811]: 2025-12-02 09:51:00.940031105 +0000 UTC m=+0.112622515 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:51:00 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:51:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:01.247 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:02.439 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:51:03.035 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:51:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:51:03.036 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:51:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:51:03.037 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:51:03 np0005541913.localdomain sudo[285835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:51:03 np0005541913.localdomain sudo[285835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:03 np0005541913.localdomain sudo[285835]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:03 np0005541913.localdomain sudo[285853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 09:51:03 np0005541913.localdomain sudo[285853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:03 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3477 DF PROTO=TCP SPT=54006 DPT=9102 SEQ=2645994212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A8987E0000000001030307) 
Dec 02 09:51:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:51:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:51:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:51:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:51:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:51:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:51:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:51:04 np0005541913.localdomain sudo[285853]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:04 np0005541913.localdomain sudo[285893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:04 np0005541913.localdomain sudo[285893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:04 np0005541913.localdomain sudo[285893]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:04 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3478 DF PROTO=TCP SPT=54006 DPT=9102 SEQ=2645994212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A89CA40000000001030307) 
Dec 02 09:51:05 np0005541913.localdomain sudo[285911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:05 np0005541913.localdomain sudo[285911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:05 np0005541913.localdomain sudo[285911]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:05 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49897 DF PROTO=TCP SPT=50452 DPT=9102 SEQ=637600805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A89FE40000000001030307) 
Dec 02 09:51:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:51:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:51:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:51:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147735 "" "Go-http-client/1.1"
Dec 02 09:51:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:51:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17240 "" "Go-http-client/1.1"
Dec 02 09:51:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:06.277 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:06 np0005541913.localdomain sudo[285929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:06 np0005541913.localdomain sudo[285929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:06 np0005541913.localdomain sudo[285929]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3479 DF PROTO=TCP SPT=54006 DPT=9102 SEQ=2645994212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A8A4A40000000001030307) 
Dec 02 09:51:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:07.484 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:07 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21193 DF PROTO=TCP SPT=46744 DPT=9102 SEQ=2091678735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A8A7E40000000001030307) 
Dec 02 09:51:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:51:08 np0005541913.localdomain podman[285947]: 2025-12-02 09:51:08.443131338 +0000 UTC m=+0.082934940 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 09:51:08 np0005541913.localdomain podman[285947]: 2025-12-02 09:51:08.450530457 +0000 UTC m=+0.090334049 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec 02 09:51:08 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:51:09 np0005541913.localdomain podman[285965]: 2025-12-02 09:51:09.442183966 +0000 UTC m=+0.080373823 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: tmp-crun.S6i61e.mount: Deactivated successfully.
Dec 02 09:51:09 np0005541913.localdomain podman[285966]: 2025-12-02 09:51:09.505580812 +0000 UTC m=+0.138909738 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Dec 02 09:51:09 np0005541913.localdomain podman[285965]: 2025-12-02 09:51:09.53317349 +0000 UTC m=+0.171363337 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:51:09 np0005541913.localdomain podman[285966]: 2025-12-02 09:51:09.572956735 +0000 UTC m=+0.206285661 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:51:09 np0005541913.localdomain sshd[286013]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:51:09 np0005541913.localdomain sshd[286013]: Accepted publickey for tripleo-admin from 192.168.122.11 port 59070 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:51:09 np0005541913.localdomain systemd-logind[757]: New session 63 of user tripleo-admin.
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 02 09:51:09 np0005541913.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 02 09:51:09 np0005541913.localdomain systemd[286017]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Queued start job for default target Main User Target.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Created slice User Application Slice.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 09:51:10 np0005541913.localdomain systemd-journald[47611]: Field hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Dec 02 09:51:10 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 09:51:10 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Reached target Paths.
Dec 02 09:51:10 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Reached target Timers.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Starting D-Bus User Message Bus Socket...
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Starting Create User's Volatile Files and Directories...
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Listening on D-Bus User Message Bus Socket.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Reached target Sockets.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Finished Create User's Volatile Files and Directories.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Reached target Basic System.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Reached target Main User Target.
Dec 02 09:51:10 np0005541913.localdomain systemd[286017]: Startup finished in 156ms.
Dec 02 09:51:10 np0005541913.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 02 09:51:10 np0005541913.localdomain systemd[1]: Started Session 63 of User tripleo-admin.
Dec 02 09:51:10 np0005541913.localdomain sshd[286013]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 02 09:51:10 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 09:51:10 np0005541913.localdomain sudo[286158]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtjpdaqftrjuqbmjryjcqerurlpvefbc ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764669070.2569573-60004-122622583195724/AnsiballZ_blockinfile.py
Dec 02 09:51:10 np0005541913.localdomain sudo[286158]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 09:51:10 np0005541913.localdomain python3[286160]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:51:10 np0005541913.localdomain sudo[286158]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:11 np0005541913.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:58:a5:f0 MACDST=fa:16:3e:48:4f:22 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3480 DF PROTO=TCP SPT=54006 DPT=9102 SEQ=2645994212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A47A8B4640000000001030307) 
Dec 02 09:51:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:11.315 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:11 np0005541913.localdomain sudo[286302]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unouwyylytleqexjmjooivqpvjeglvbb ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764669071.0829005-60018-203287583968740/AnsiballZ_systemd.py
Dec 02 09:51:11 np0005541913.localdomain sudo[286302]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 09:51:11 np0005541913.localdomain python3[286304]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 09:51:11 np0005541913.localdomain systemd[1]: Stopping Netfilter Tables...
Dec 02 09:51:11 np0005541913.localdomain systemd[1]: nftables.service: Deactivated successfully.
Dec 02 09:51:11 np0005541913.localdomain systemd[1]: Stopped Netfilter Tables.
Dec 02 09:51:11 np0005541913.localdomain systemd[1]: Starting Netfilter Tables...
Dec 02 09:51:12 np0005541913.localdomain systemd[1]: Finished Netfilter Tables.
Dec 02 09:51:12 np0005541913.localdomain sudo[286302]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:12.541 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:16.347 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:16 np0005541913.localdomain sudo[286329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:51:16 np0005541913.localdomain sudo[286329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:16 np0005541913.localdomain sudo[286329]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:17 np0005541913.localdomain sudo[286347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:51:17 np0005541913.localdomain sudo[286347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:17.582 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:17 np0005541913.localdomain sudo[286347]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:18 np0005541913.localdomain sudo[286397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:18 np0005541913.localdomain sudo[286397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:18 np0005541913.localdomain sudo[286397]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:18 np0005541913.localdomain sudo[286415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:18 np0005541913.localdomain sudo[286415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:18 np0005541913.localdomain sudo[286415]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:20 np0005541913.localdomain sudo[286433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:20 np0005541913.localdomain sudo[286433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:20 np0005541913.localdomain sudo[286433]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:51:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:21.405 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:21 np0005541913.localdomain sudo[286452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:21 np0005541913.localdomain sudo[286452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:21 np0005541913.localdomain sudo[286452]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:21 np0005541913.localdomain podman[286451]: 2025-12-02 09:51:21.480985491 +0000 UTC m=+0.064712803 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 09:51:21 np0005541913.localdomain podman[286451]: 2025-12-02 09:51:21.489781906 +0000 UTC m=+0.073509208 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible)
Dec 02 09:51:21 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:51:22 np0005541913.localdomain sudo[286488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:22 np0005541913.localdomain sudo[286488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:22 np0005541913.localdomain sudo[286488]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:22.616 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:23 np0005541913.localdomain sudo[286506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:23 np0005541913.localdomain sudo[286506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:23 np0005541913.localdomain sudo[286506]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:51:25 np0005541913.localdomain podman[286524]: 2025-12-02 09:51:25.438815221 +0000 UTC m=+0.074948717 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:51:25 np0005541913.localdomain podman[286524]: 2025-12-02 09:51:25.472057301 +0000 UTC m=+0.108190807 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec 02 09:51:25 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:51:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:26.409 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:27 np0005541913.localdomain sudo[286542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:51:27 np0005541913.localdomain sudo[286542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:27 np0005541913.localdomain sudo[286542]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:27 np0005541913.localdomain sudo[286560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:51:27 np0005541913.localdomain sudo[286560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:27.637 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:28 np0005541913.localdomain podman[286622]: 
Dec 02 09:51:28 np0005541913.localdomain podman[286622]: 2025-12-02 09:51:28.149678722 +0000 UTC m=+0.084877064 container create a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_euclid, io.buildah.version=1.41.4, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1763362218, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: Started libpod-conmon-a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca.scope.
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:51:28 np0005541913.localdomain podman[286622]: 2025-12-02 09:51:28.115472505 +0000 UTC m=+0.050670837 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:51:28 np0005541913.localdomain podman[286622]: 2025-12-02 09:51:28.22884525 +0000 UTC m=+0.164043572 container init a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_euclid, io.openshift.tags=rhceph ceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph)
Dec 02 09:51:28 np0005541913.localdomain podman[286622]: 2025-12-02 09:51:28.243852651 +0000 UTC m=+0.179050983 container start a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_euclid, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, name=rhceph, architecture=x86_64, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_CLEAN=True, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:51:28 np0005541913.localdomain podman[286622]: 2025-12-02 09:51:28.244123218 +0000 UTC m=+0.179321550 container attach a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_euclid, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Dec 02 09:51:28 np0005541913.localdomain zen_euclid[286637]: 167 167
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: libpod-a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca.scope: Deactivated successfully.
Dec 02 09:51:28 np0005541913.localdomain podman[286622]: 2025-12-02 09:51:28.248102865 +0000 UTC m=+0.183301237 container died a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_euclid, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, distribution-scope=public, vcs-type=git, name=rhceph, GIT_BRANCH=main)
Dec 02 09:51:28 np0005541913.localdomain podman[286644]: 2025-12-02 09:51:28.354812551 +0000 UTC m=+0.092428324 container remove a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_euclid, version=7, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, name=rhceph, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64)
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: libpod-conmon-a5a41c85c45658c818e59584bdf4810c91db2325e22f77aff7095b6a364d80ca.scope: Deactivated successfully.
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:51:28 np0005541913.localdomain systemd-sysv-generator[286689]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:51:28 np0005541913.localdomain systemd-rc-local-generator[286686]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4995067a505adc822eda7872cf00cb503647bd3a98f04b7ef8ac1961b1ea9a55-merged.mount: Deactivated successfully.
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:51:28 np0005541913.localdomain systemd-rc-local-generator[286725]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:51:28 np0005541913.localdomain systemd-sysv-generator[286730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:28 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:51:29 np0005541913.localdomain systemd[1]: Starting Ceph mds.mds.np0005541913.maexpe for c7c8e171-a193-56fb-95fa-8879fcfa7074...
Dec 02 09:51:29 np0005541913.localdomain podman[286790]: 
Dec 02 09:51:29 np0005541913.localdomain podman[286790]: 2025-12-02 09:51:29.463259496 +0000 UTC m=+0.052829995 container create 588592dabdb0941a0740c1cf0ce8f9e94be3ea185621923b001e2ef4eee03fa9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mds-mds-np0005541913-maexpe, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1763362218, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-type=git, version=7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:51:29 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2e9ca40f5b00254365b2f578b608dfdd4e3b1686706b09cb19c7f704b9eba89/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 09:51:29 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2e9ca40f5b00254365b2f578b608dfdd4e3b1686706b09cb19c7f704b9eba89/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 09:51:29 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2e9ca40f5b00254365b2f578b608dfdd4e3b1686706b09cb19c7f704b9eba89/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 09:51:29 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2e9ca40f5b00254365b2f578b608dfdd4e3b1686706b09cb19c7f704b9eba89/merged/var/lib/ceph/mds/ceph-mds.np0005541913.maexpe supports timestamps until 2038 (0x7fffffff)
Dec 02 09:51:29 np0005541913.localdomain podman[286790]: 2025-12-02 09:51:29.520710823 +0000 UTC m=+0.110281302 container init 588592dabdb0941a0740c1cf0ce8f9e94be3ea185621923b001e2ef4eee03fa9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mds-mds-np0005541913-maexpe, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph)
Dec 02 09:51:29 np0005541913.localdomain podman[286790]: 2025-12-02 09:51:29.527812054 +0000 UTC m=+0.117382543 container start 588592dabdb0941a0740c1cf0ce8f9e94be3ea185621923b001e2ef4eee03fa9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mds-mds-np0005541913-maexpe, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:51:29 np0005541913.localdomain bash[286790]: 588592dabdb0941a0740c1cf0ce8f9e94be3ea185621923b001e2ef4eee03fa9
Dec 02 09:51:29 np0005541913.localdomain podman[286790]: 2025-12-02 09:51:29.438130974 +0000 UTC m=+0.027701463 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:51:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:51:29 np0005541913.localdomain systemd[1]: Started Ceph mds.mds.np0005541913.maexpe for c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 09:51:29 np0005541913.localdomain ceph-mds[286809]: set uid:gid to 167:167 (ceph:ceph)
Dec 02 09:51:29 np0005541913.localdomain ceph-mds[286809]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Dec 02 09:51:29 np0005541913.localdomain ceph-mds[286809]: main not setting numa affinity
Dec 02 09:51:29 np0005541913.localdomain ceph-mds[286809]: pidfile_write: ignore empty --pid-file
Dec 02 09:51:29 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mds-mds-np0005541913-maexpe[286805]: starting mds.mds.np0005541913.maexpe at 
Dec 02 09:51:29 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Updating MDS map to version 7 from mon.1
Dec 02 09:51:29 np0005541913.localdomain sudo[286560]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:29 np0005541913.localdomain podman[286810]: 2025-12-02 09:51:29.61701708 +0000 UTC m=+0.065355700 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:51:29 np0005541913.localdomain podman[286810]: 2025-12-02 09:51:29.632190407 +0000 UTC m=+0.080529007 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 09:51:29 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:51:29 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Updating MDS map to version 8 from mon.1
Dec 02 09:51:29 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Monitors have assigned me to become a standby.
Dec 02 09:51:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:51:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:31.450 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:31 np0005541913.localdomain podman[286848]: 2025-12-02 09:51:31.46096941 +0000 UTC m=+0.096395882 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:51:31 np0005541913.localdomain podman[286848]: 2025-12-02 09:51:31.473958857 +0000 UTC m=+0.109385249 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:51:31 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:51:32 np0005541913.localdomain sudo[286871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:32 np0005541913.localdomain sudo[286871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:32 np0005541913.localdomain sudo[286871]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:32.676 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:32 np0005541913.localdomain sudo[286889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:51:32 np0005541913.localdomain sudo[286889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:32 np0005541913.localdomain sudo[286889]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:32 np0005541913.localdomain sudo[286907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:51:32 np0005541913.localdomain sudo[286907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:33 np0005541913.localdomain podman[286998]: 2025-12-02 09:51:33.588377474 +0000 UTC m=+0.099067572 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:51:33 np0005541913.localdomain podman[286998]: 2025-12-02 09:51:33.675012912 +0000 UTC m=+0.185703000 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_BRANCH=main, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, RELEASE=main, ceph=True, version=7, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Dec 02 09:51:34 np0005541913.localdomain sudo[286907]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:51:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:51:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:51:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:51:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:51:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:51:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:51:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:51:34 np0005541913.localdomain sshd[285620]: Received disconnect from 38.102.83.114 port 45662:11: disconnected by user
Dec 02 09:51:34 np0005541913.localdomain sshd[285620]: Disconnected from user zuul 38.102.83.114 port 45662
Dec 02 09:51:34 np0005541913.localdomain sshd[285617]: pam_unix(sshd:session): session closed for user zuul
Dec 02 09:51:34 np0005541913.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Dec 02 09:51:34 np0005541913.localdomain systemd-logind[757]: Session 62 logged out. Waiting for processes to exit.
Dec 02 09:51:34 np0005541913.localdomain systemd-logind[757]: Removed session 62.
Dec 02 09:51:34 np0005541913.localdomain sudo[287084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:34 np0005541913.localdomain sudo[287084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:34 np0005541913.localdomain sudo[287084]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:35 np0005541913.localdomain sudo[287102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:51:35 np0005541913.localdomain sudo[287102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:51:35 np0005541913.localdomain sudo[287102]: pam_unix(sudo:session): session closed for user root
Dec 02 09:51:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:51:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:51:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:51:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149882 "" "Go-http-client/1.1"
Dec 02 09:51:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:51:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17717 "" "Go-http-client/1.1"
Dec 02 09:51:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:36.452 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:37.711 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:51:39 np0005541913.localdomain podman[287120]: 2025-12-02 09:51:39.155787448 +0000 UTC m=+0.073861378 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:51:39 np0005541913.localdomain podman[287120]: 2025-12-02 09:51:39.191098572 +0000 UTC m=+0.109172532 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 09:51:39 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:51:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:51:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:51:40 np0005541913.localdomain podman[287139]: 2025-12-02 09:51:40.429863324 +0000 UTC m=+0.074213907 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:51:40 np0005541913.localdomain podman[287139]: 2025-12-02 09:51:40.439149223 +0000 UTC m=+0.083499786 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:51:40 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:51:40 np0005541913.localdomain podman[287140]: 2025-12-02 09:51:40.519228076 +0000 UTC m=+0.161676588 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec 02 09:51:40 np0005541913.localdomain podman[287140]: 2025-12-02 09:51:40.547809561 +0000 UTC m=+0.190258133 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 02 09:51:40 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:51:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:41.454 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:42.750 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:42.935 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:42.959 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:42.959 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:43.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:43.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:43.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:51:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:44.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:44.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:44.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:51:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:44.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.218 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.218 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.219 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.219 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.637 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.657 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.657 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.658 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.674 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.675 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.675 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.676 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:51:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:45.676 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.107 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.262 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.263 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.456 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.462 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.464 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12277MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.465 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.465 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.550 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.551 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.551 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:51:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:46.594 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:51:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:47.047 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:51:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:47.056 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:51:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:47.211 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:51:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:47.213 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:51:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:47.214 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:51:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:47.384 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:47.385 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:51:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:47.779 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:51.458 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:51:52 np0005541913.localdomain podman[287229]: 2025-12-02 09:51:52.428914008 +0000 UTC m=+0.069487101 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 02 09:51:52 np0005541913.localdomain podman[287229]: 2025-12-02 09:51:52.438715209 +0000 UTC m=+0.079288362 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 02 09:51:52 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:51:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:52.814 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:51:56 np0005541913.localdomain podman[287248]: 2025-12-02 09:51:56.446829935 +0000 UTC m=+0.087228245 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:51:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:56.495 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:51:56 np0005541913.localdomain podman[287248]: 2025-12-02 09:51:56.499188697 +0000 UTC m=+0.139586967 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 02 09:51:56 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:51:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:51:57.817 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:52:00 np0005541913.localdomain podman[287266]: 2025-12-02 09:52:00.445548909 +0000 UTC m=+0.089768963 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, version=9.6, config_id=edpm, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Dec 02 09:52:00 np0005541913.localdomain podman[287266]: 2025-12-02 09:52:00.485094188 +0000 UTC m=+0.129314232 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Dec 02 09:52:00 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:52:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:01.498 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:52:02 np0005541913.localdomain podman[287287]: 2025-12-02 09:52:02.446249823 +0000 UTC m=+0.088037776 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:52:02 np0005541913.localdomain podman[287287]: 2025-12-02 09:52:02.45099081 +0000 UTC m=+0.092778783 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:52:02 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:52:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:02.821 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:52:03.036 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:52:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:52:03.036 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:52:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:52:03.037 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:52:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:52:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:52:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:52:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:52:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:52:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:52:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:52:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:52:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149882 "" "Go-http-client/1.1"
Dec 02 09:52:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:52:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17721 "" "Go-http-client/1.1"
Dec 02 09:52:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:06.502 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:07 np0005541913.localdomain sudo[287310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:52:07 np0005541913.localdomain sudo[287310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:07 np0005541913.localdomain sudo[287310]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:07.822 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:52:09 np0005541913.localdomain systemd[1]: tmp-crun.Bavsoi.mount: Deactivated successfully.
Dec 02 09:52:09 np0005541913.localdomain podman[287328]: 2025-12-02 09:52:09.393003644 +0000 UTC m=+0.083722482 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 02 09:52:09 np0005541913.localdomain podman[287328]: 2025-12-02 09:52:09.431538355 +0000 UTC m=+0.122257153 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Dec 02 09:52:09 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:52:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:52:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:52:11 np0005541913.localdomain podman[287350]: 2025-12-02 09:52:11.455902393 +0000 UTC m=+0.091049058 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:52:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:11.506 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:11 np0005541913.localdomain systemd[1]: tmp-crun.YN0ypF.mount: Deactivated successfully.
Dec 02 09:52:11 np0005541913.localdomain podman[287350]: 2025-12-02 09:52:11.522856574 +0000 UTC m=+0.158003229 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:52:11 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:52:11 np0005541913.localdomain podman[287349]: 2025-12-02 09:52:11.527639352 +0000 UTC m=+0.164762191 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:52:11 np0005541913.localdomain podman[287349]: 2025-12-02 09:52:11.612244126 +0000 UTC m=+0.249366955 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:52:11 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:52:11 np0005541913.localdomain sshd[286033]: Received disconnect from 192.168.122.11 port 59070:11: disconnected by user
Dec 02 09:52:11 np0005541913.localdomain sshd[286033]: Disconnected from user tripleo-admin 192.168.122.11 port 59070
Dec 02 09:52:11 np0005541913.localdomain sshd[286013]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 02 09:52:11 np0005541913.localdomain systemd[1]: session-63.scope: Deactivated successfully.
Dec 02 09:52:11 np0005541913.localdomain systemd[1]: session-63.scope: Consumed 1.322s CPU time.
Dec 02 09:52:11 np0005541913.localdomain systemd-logind[757]: Session 63 logged out. Waiting for processes to exit.
Dec 02 09:52:11 np0005541913.localdomain systemd-logind[757]: Removed session 63.
Dec 02 09:52:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:12.853 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.101 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.106 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f68abb7a-6d65-46b4-ae25-ba06ad2826c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.102512', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '958ae40c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': '96cc0523df0953e21713103908ad541f6d618aa9f201a876b55ef60d1da16dea'}]}, 'timestamp': '2025-12-02 09:52:16.107525', '_unique_id': '1eebfdd7c28c4348a82991a22094dce9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.109 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.122 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.122 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '957d602c-dab8-4f17-a4fe-d71d9d767324', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.110852', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '958d3234-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.32996604, 'message_signature': '7bee86e91c787717a78d06b25765919a507e826d949be49596c2eea407b4ea60'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.110852', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '958d47ce-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.32996604, 'message_signature': '31502fb1d8622b1e2f840cfc11dc43227d9a77e83370d4f423a63f27d00e38d5'}]}, 'timestamp': '2025-12-02 09:52:16.123096', '_unique_id': 'b612ddedbe8d410ab72f7ae4fb168123'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.124 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.125 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f7c3830-23cb-45c6-be9d-891e01ddf9bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.125689', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '958dbfc4-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': '97e02d78b1bbc3374973c9b9c41d9d3e14ac230b07919bdb519400041ef67495'}]}, 'timestamp': '2025-12-02 09:52:16.126193', '_unique_id': '0ce50e725c654695a016287fba175c7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.127 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.128 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c55467e-26fd-4f39-94bf-300425f2ba4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.128446', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '958e2d56-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': '406edc9addeaafa1300c7fc873da141031354eb67b89210926adad222563ee49'}]}, 'timestamp': '2025-12-02 09:52:16.128997', '_unique_id': '962d95a9bc404f99bd206acd2d546523'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.130 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.161 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.162 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f43817e-9587-431b-8321-eb96e79295ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.131650', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9593423c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': 'e48f89f65265954f13a3289cb4ef4d15dd09a75c415ccc4ed6eb31aa8278cb0c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.131650', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '959359ac-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': 'c5a3ba69bcde6537a8947b9c6808ce88c2eaf3edf5f36a1704af433c17928424'}]}, 'timestamp': '2025-12-02 09:52:16.162954', '_unique_id': 'b9b163d3f9a644b3926b89dc3949db76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.165 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63df1fd9-413f-4d6f-9a72-5b1e128c4178', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.165819', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '9593df80-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': 'd03bf24facf1a5a7f818e827edc7ca9aea0c174fcd4069c6cb3d4a226d397c0a'}]}, 'timestamp': '2025-12-02 09:52:16.166323', '_unique_id': '5e559a76283740cfaf0c4fc739eb719e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.168 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.169 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f85f63bf-2e46-40af-baa6-9fc44bdba892', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.168691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '95944eca-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': '89dae6af691f27d0f4add2971b91ca18ddc716051cfc16a9b5c794b3ff6e6b0e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.168691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '95946126-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': '3d8dbeb966b3a50b4de7b7469ff5c25b6f0daa0f64d7976ca5be1541f26c4228'}]}, 'timestamp': '2025-12-02 09:52:16.169644', '_unique_id': '5cee180819b0404f8a2680239b06a38b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.172 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.172 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97fac221-566c-4379-9207-7177dfd3997e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.171943', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9594cfee-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': '2bae40fc2f8f1861bd699eef94cdcab385ebdfe8cb20247efe4a0f6a70fefa8d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.171943', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9594e1f0-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': '24677901739101c8b11916c452189314a6a18123bba44546adda812233880aea'}]}, 'timestamp': '2025-12-02 09:52:16.172907', '_unique_id': '33cccf8512324cf99ec600b41805dac4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.175 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.175 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb89c13a-1cfd-4130-ae83-2d4d1ba6562f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.175381', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '95955586-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.32996604, 'message_signature': '987ba902dd2eeb10531f834142af5384520fc297e79b1342c90abcf7148d1ba4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.175381', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '959567a6-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.32996604, 'message_signature': 'a07287f66cc1ab3cbe23172332b0ac1e32e40247ab61d19155193dfed635f92d'}]}, 'timestamp': '2025-12-02 09:52:16.176321', '_unique_id': '11cca913134f42e2a07d43d906dfb404'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.178 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.178 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.179 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78f55d7c-9f7b-474e-bb7c-81d202b7330f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.178780', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9595d89e-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': 'ddf0aaf40afb16c9f5a740758ec310b058a5df5b5ea63f7c8744887962f1a530'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.178780', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9595e85c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': 'e44b3eba060aa07dd8688f09cd4d377503ebaebfa2abf14f3b6defcd059911d6'}]}, 'timestamp': '2025-12-02 09:52:16.179652', '_unique_id': '8fbce8031c5646dcb0249f1ba2faa9ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.182 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.182 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de3afa04-0fba-46bc-945d-15364ed5dbae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.182018', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '95965724-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': '165c7c9c974c9d20e94499b6e051dc47a76268ac45c2df5ab3b3776259d0d078'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.182018', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '95966868-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': '5557dc59330d7df857fd907fd75d24620caf9c5169cb5d0da030f1fbd37569fd'}]}, 'timestamp': '2025-12-02 09:52:16.182898', '_unique_id': '332f450cca0144889858d62eb36a10a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.184 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bff2ba62-0237-4b44-8e6c-f9fc4a9487a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:52:16.185099', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '959933d6-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.419716882, 'message_signature': '2c877ae6d793b8a73bde3ca77f9b057f09f25528420fb59d2210dbec93fb1591'}]}, 'timestamp': '2025-12-02 09:52:16.201228', '_unique_id': 'd410d1e50b064737bb0866101cb86b73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.204 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecdfa490-5949-4548-8879-3efe7f754933', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.203523', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9599a226-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': 'f91031e95e355f6eabb6065a261cf7f9b421f6af90e00d3baea95a786987004a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.203523', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9599b2b6-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.350742317, 'message_signature': '3891d3e3cca8a84887a38a78b354b73e8f4741a6d4f8d60f503313c17a65761b'}]}, 'timestamp': '2025-12-02 09:52:16.204457', '_unique_id': '7397defa58b54f5f841af961545b3f09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36e9c54d-a0ae-4c58-87a3-1b7eef4715dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.206822', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '959a1ff8-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': '01a73dc9f3388ee0111daef93d5d7ce7e9c747409711a1e9008a08e69000c7ec'}]}, 'timestamp': '2025-12-02 09:52:16.207282', '_unique_id': '7e9289fd34db40d9a9d55c71440905a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.209 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0261bdac-b1bf-43f1-a8af-725867a7335f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.209462', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '959a88a8-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': '930998623cf5e98c166f19676b0080244059b65c3451bdfeffe687ca9b2f00a7'}]}, 'timestamp': '2025-12-02 09:52:16.209967', '_unique_id': 'a03a263ef3b84d9c9e3329450e6ae87b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.212 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 12730000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a39441cf-3291-4b31-a1c1-058f4b0f42d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12730000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:52:16.212197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '959af1a8-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.419716882, 'message_signature': 'fcce274d67905f64394c5214e06710b64ffa199d9e1a91f19b344c9efc5db0ee'}]}, 'timestamp': '2025-12-02 09:52:16.212676', '_unique_id': 'f72d2abaf4404761ba4c32cdc4c45c8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f5c6705-1cce-4e5e-9217-25f3c9e92cc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.214976', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '959b5f44-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': 'b020445a9bee67893fa9fbcac5f86e63aa1a3a522aa5f2cb5cb0ddcd75fb13f0'}]}, 'timestamp': '2025-12-02 09:52:16.215504', '_unique_id': '00e11962e3254311bffd89c5cf4e89f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d08797b-a11e-4ac5-b30b-5883788a1589', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.217720', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '959bc9de-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': '026b27fd7ee3f97987d19e6c19746bef7b7bbb3bcd54d77a459818a6e1ebc4f1'}]}, 'timestamp': '2025-12-02 09:52:16.218195', '_unique_id': '7bc387e0c5614c55bae590784d737a17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '651519ca-19a0-4769-8d0b-7e65d9383b8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.220697', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '959c3ef0-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': '45c34360727ef1e6fc9624f4cfc3be00396d1f20cb1f31ed1eb134d94b6b2817'}]}, 'timestamp': '2025-12-02 09:52:16.221185', '_unique_id': '7fe343cbda4f4d539d4f47e25e4244b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9086512-f357-486d-a494-669bc2a840cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:52:16.223351', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '959ca836-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.321565376, 'message_signature': 'fd6b8bcce6fa97e9a3c359ff7300bba9a2ea4747bdb2cd8974a66fbc0e9d4aea'}]}, 'timestamp': '2025-12-02 09:52:16.223887', '_unique_id': '8fbd11e8f97e45329157bcbf4d70811c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.226 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.226 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '667c5d5b-701e-42bb-94a0-62de42dcd8cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:52:16.226189', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '959d14ec-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.32996604, 'message_signature': 'f956a5907a324d5e014b42ddd747677865209199d7dcc1f989854e8ae46a2a9e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:52:16.226189', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '959d26da-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11298.32996604, 'message_signature': '5d447f19dd97449d45663ea1fcf036a18528880acf9d78042caf5338baa25245'}]}, 'timestamp': '2025-12-02 09:52:16.227128', '_unique_id': 'abd5a8a4855b474da8a096a5f8489d97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:52:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:52:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:52:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:16.531 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:17.896 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:21.568 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Activating special unit Exit the Session...
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Stopped target Main User Target.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Stopped target Basic System.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Stopped target Paths.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Stopped target Sockets.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Stopped target Timers.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Closed D-Bus User Message Bus Socket.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Stopped Create User's Volatile Files and Directories.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Removed slice User Application Slice.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Reached target Shutdown.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Finished Exit the Session.
Dec 02 09:52:22 np0005541913.localdomain systemd[286017]: Reached target Exit the Session.
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: user-1003.slice: Consumed 1.672s CPU time.
Dec 02 09:52:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:22.919 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:52:23 np0005541913.localdomain podman[287396]: 2025-12-02 09:52:23.016365727 +0000 UTC m=+0.072364798 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:52:23 np0005541913.localdomain podman[287396]: 2025-12-02 09:52:23.029966501 +0000 UTC m=+0.085965562 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 09:52:23 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:52:25 np0005541913.localdomain sudo[287415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:52:25 np0005541913.localdomain sudo[287415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:25 np0005541913.localdomain sudo[287415]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:25 np0005541913.localdomain sudo[287433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:52:25 np0005541913.localdomain sudo[287433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:25 np0005541913.localdomain sudo[287433]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:25 np0005541913.localdomain sudo[287451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:52:25 np0005541913.localdomain sudo[287451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:26 np0005541913.localdomain sudo[287451]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:26.571 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:52:27 np0005541913.localdomain podman[287501]: 2025-12-02 09:52:27.441704808 +0000 UTC m=+0.079197670 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 09:52:27 np0005541913.localdomain podman[287501]: 2025-12-02 09:52:27.45298343 +0000 UTC m=+0.090476352 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 09:52:27 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:52:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:27.958 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:28 np0005541913.localdomain sudo[287519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:52:28 np0005541913.localdomain sudo[287519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:28 np0005541913.localdomain sudo[287519]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:28 np0005541913.localdomain sudo[287537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:52:28 np0005541913.localdomain sudo[287537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:28 np0005541913.localdomain sudo[287537]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:52:31 np0005541913.localdomain systemd[1]: tmp-crun.W410F3.mount: Deactivated successfully.
Dec 02 09:52:31 np0005541913.localdomain podman[287555]: 2025-12-02 09:52:31.448216341 +0000 UTC m=+0.079383795 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., 
version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm)
Dec 02 09:52:31 np0005541913.localdomain podman[287555]: 2025-12-02 09:52:31.465197426 +0000 UTC m=+0.096364860 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Dec 02 09:52:31 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:52:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:31.611 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:32.963 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:52:33 np0005541913.localdomain systemd[1]: tmp-crun.4lXzKy.mount: Deactivated successfully.
Dec 02 09:52:33 np0005541913.localdomain podman[287575]: 2025-12-02 09:52:33.461956894 +0000 UTC m=+0.094046408 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:52:33 np0005541913.localdomain podman[287575]: 2025-12-02 09:52:33.500789654 +0000 UTC m=+0.132879138 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:52:33 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:52:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:52:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:52:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:52:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:52:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:52:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:52:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:52:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:52:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:52:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:52:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:52:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149882 "" "Go-http-client/1.1"
Dec 02 09:52:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:52:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17721 "" "Go-http-client/1.1"
Dec 02 09:52:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:36.655 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:37.968 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:52:40 np0005541913.localdomain podman[287598]: 2025-12-02 09:52:40.429401668 +0000 UTC m=+0.073852387 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 02 09:52:40 np0005541913.localdomain podman[287598]: 2025-12-02 09:52:40.444098631 +0000 UTC m=+0.088549400 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 09:52:40 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:52:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:40.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:40.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 02 09:52:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:40.845 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 02 09:52:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:40.846 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:40.846 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 02 09:52:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:40.861 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:41.690 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:41.872 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:52:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:52:42 np0005541913.localdomain podman[287617]: 2025-12-02 09:52:42.423494704 +0000 UTC m=+0.067543668 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:52:42 np0005541913.localdomain podman[287617]: 2025-12-02 09:52:42.435893656 +0000 UTC m=+0.079942630 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:52:42 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:52:42 np0005541913.localdomain podman[287618]: 2025-12-02 09:52:42.512392694 +0000 UTC m=+0.153662424 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 09:52:42 np0005541913.localdomain podman[287618]: 2025-12-02 09:52:42.573244572 +0000 UTC m=+0.214514392 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 09:52:42 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:52:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:42.970 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:43.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:43.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:52:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:44.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:44.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:52:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:44.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.237 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.238 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.238 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.238 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.728 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.745 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.745 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.746 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.747 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.851 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.851 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.852 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.852 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:52:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:45.852 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.292 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.366 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.367 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.571 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.573 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=12275MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.573 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.573 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.685 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.685 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.686 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.693 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.748 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.823 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.824 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.844 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.864 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_ABM,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 09:52:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:46.902 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:52:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:47.349 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:52:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:47.354 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:52:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:47.367 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:52:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:47.368 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:52:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:47.369 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:52:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:47.973 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:51.733 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:52 np0005541913.localdomain sudo[287709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:52:52 np0005541913.localdomain sudo[287709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:52 np0005541913.localdomain sudo[287709]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:52.976 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:52:53 np0005541913.localdomain podman[287727]: 2025-12-02 09:52:53.446285392 +0000 UTC m=+0.088349016 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Dec 02 09:52:53 np0005541913.localdomain podman[287727]: 2025-12-02 09:52:53.455914929 +0000 UTC m=+0.097978503 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 09:52:53 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:52:53 np0005541913.localdomain sudo[287742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:52:53 np0005541913.localdomain sudo[287742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:53 np0005541913.localdomain sudo[287742]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:54 np0005541913.localdomain sudo[287763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:52:54 np0005541913.localdomain sudo[287763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:54 np0005541913.localdomain sudo[287763]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:56.774 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:52:58.000 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:52:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:52:58 np0005541913.localdomain systemd[1]: tmp-crun.Xy3x27.mount: Deactivated successfully.
Dec 02 09:52:58 np0005541913.localdomain podman[287781]: 2025-12-02 09:52:58.446429968 +0000 UTC m=+0.087155085 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:52:58 np0005541913.localdomain podman[287781]: 2025-12-02 09:52:58.481148087 +0000 UTC m=+0.121873134 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 02 09:52:58 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:52:58 np0005541913.localdomain sudo[287799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:52:58 np0005541913.localdomain sudo[287799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:58 np0005541913.localdomain sudo[287799]: pam_unix(sudo:session): session closed for user root
Dec 02 09:52:58 np0005541913.localdomain sudo[287817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:52:58 np0005541913.localdomain sudo[287817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:52:59 np0005541913.localdomain podman[287875]: 
Dec 02 09:52:59 np0005541913.localdomain podman[287875]: 2025-12-02 09:52:59.303145725 +0000 UTC m=+0.058948169 container create 26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lovelace, RELEASE=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.openshift.expose-services=, release=1763362218, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: Started libpod-conmon-26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48.scope.
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:52:59 np0005541913.localdomain podman[287875]: 2025-12-02 09:52:59.364368193 +0000 UTC m=+0.120170637 container init 26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lovelace, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, release=1763362218, GIT_CLEAN=True, distribution-scope=public, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, vcs-type=git)
Dec 02 09:52:59 np0005541913.localdomain podman[287875]: 2025-12-02 09:52:59.271809526 +0000 UTC m=+0.027612020 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:52:59 np0005541913.localdomain podman[287875]: 2025-12-02 09:52:59.374535636 +0000 UTC m=+0.130338050 container start 26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lovelace, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public)
Dec 02 09:52:59 np0005541913.localdomain podman[287875]: 2025-12-02 09:52:59.374750412 +0000 UTC m=+0.130552896 container attach 26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lovelace, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, release=1763362218, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True)
Dec 02 09:52:59 np0005541913.localdomain nostalgic_lovelace[287891]: 167 167
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: libpod-26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48.scope: Deactivated successfully.
Dec 02 09:52:59 np0005541913.localdomain podman[287875]: 2025-12-02 09:52:59.379050787 +0000 UTC m=+0.134853221 container died 26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lovelace, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, release=1763362218, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-07fe789c99b3bce0fc75b062fef4d36fa8bf82031a174529546fd1d578ba69ff-merged.mount: Deactivated successfully.
Dec 02 09:52:59 np0005541913.localdomain podman[287896]: 2025-12-02 09:52:59.461443182 +0000 UTC m=+0.069220204 container remove 26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lovelace, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64)
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: libpod-conmon-26ed9e67552f8a49a72c11884745c5fb0ecfcee2138bda262a54b323b510dd48.scope: Deactivated successfully.
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:52:59 np0005541913.localdomain systemd-rc-local-generator[287941]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:52:59 np0005541913.localdomain systemd-sysv-generator[287945]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:52:59 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:52:59 np0005541913.localdomain systemd-sysv-generator[287979]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:52:59 np0005541913.localdomain systemd-rc-local-generator[287976]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: Starting Ceph mgr.np0005541913.mfesdm for c7c8e171-a193-56fb-95fa-8879fcfa7074...
Dec 02 09:53:00 np0005541913.localdomain podman[288041]: 
Dec 02 09:53:00 np0005541913.localdomain podman[288041]: 2025-12-02 09:53:00.583258404 +0000 UTC m=+0.078517693 container create e85a1b86b1d305678ab89219478fd7faa55f027101f5d8de4368bedf666c21c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.openshift.expose-services=, version=7, GIT_BRANCH=main, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:53:00 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d0959075e92bb626c58513d9bc964dcc957e5b9ceee4c73af40e41da8eecae7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:00 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d0959075e92bb626c58513d9bc964dcc957e5b9ceee4c73af40e41da8eecae7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:00 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d0959075e92bb626c58513d9bc964dcc957e5b9ceee4c73af40e41da8eecae7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:00 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d0959075e92bb626c58513d9bc964dcc957e5b9ceee4c73af40e41da8eecae7/merged/var/lib/ceph/mgr/ceph-np0005541913.mfesdm supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:00 np0005541913.localdomain podman[288041]: 2025-12-02 09:53:00.638167353 +0000 UTC m=+0.133426632 container init e85a1b86b1d305678ab89219478fd7faa55f027101f5d8de4368bedf666c21c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat Ceph Storage 7, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, version=7, vendor=Red Hat, Inc., name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:53:00 np0005541913.localdomain podman[288041]: 2025-12-02 09:53:00.643704242 +0000 UTC m=+0.138963531 container start e85a1b86b1d305678ab89219478fd7faa55f027101f5d8de4368bedf666c21c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-type=git, name=rhceph, distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:53:00 np0005541913.localdomain bash[288041]: e85a1b86b1d305678ab89219478fd7faa55f027101f5d8de4368bedf666c21c8
Dec 02 09:53:00 np0005541913.localdomain podman[288041]: 2025-12-02 09:53:00.551208686 +0000 UTC m=+0.046467995 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:53:00 np0005541913.localdomain systemd[1]: Started Ceph mgr.np0005541913.mfesdm for c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 09:53:00 np0005541913.localdomain ceph-mgr[288059]: set uid:gid to 167:167 (ceph:ceph)
Dec 02 09:53:00 np0005541913.localdomain ceph-mgr[288059]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Dec 02 09:53:00 np0005541913.localdomain ceph-mgr[288059]: pidfile_write: ignore empty --pid-file
Dec 02 09:53:00 np0005541913.localdomain sudo[287817]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:00 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'alerts'
Dec 02 09:53:00 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 02 09:53:00 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'balancer'
Dec 02 09:53:00 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:00.772+0000 7f789d852140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 02 09:53:00 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 02 09:53:00 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'cephadm'
Dec 02 09:53:00 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:00.839+0000 7f789d852140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 02 09:53:01 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'crash'
Dec 02 09:53:01 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 02 09:53:01 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'dashboard'
Dec 02 09:53:01 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:01.468+0000 7f789d852140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 02 09:53:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:01.818 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:01 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'devicehealth'
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'diskprediction_local'
Dec 02 09:53:02 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:02.056+0000 7f789d852140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 02 09:53:02 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 02 09:53:02 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 02 09:53:02 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]:   from numpy import show_config as show_numpy_config
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'influx'
Dec 02 09:53:02 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:02.199+0000 7f789d852140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'insights'
Dec 02 09:53:02 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:02.260+0000 7f789d852140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'iostat'
Dec 02 09:53:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'k8sevents'
Dec 02 09:53:02 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:02.375+0000 7f789d852140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 02 09:53:02 np0005541913.localdomain podman[288089]: 2025-12-02 09:53:02.424841199 +0000 UTC m=+0.066616414 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, release=1755695350, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, version=9.6, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:53:02 np0005541913.localdomain podman[288089]: 2025-12-02 09:53:02.43905967 +0000 UTC m=+0.080834935 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 09:53:02 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'localpool'
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'mds_autoscaler'
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'mirroring'
Dec 02 09:53:02 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'nfs'
Dec 02 09:53:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:03.002 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:53:03.037 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:53:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:53:03.037 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:53:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:53:03.038 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'orchestrator'
Dec 02 09:53:03 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:03.128+0000 7f789d852140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'osd_perf_query'
Dec 02 09:53:03 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:03.272+0000 7f789d852140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'osd_support'
Dec 02 09:53:03 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:03.335+0000 7f789d852140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:03.390+0000 7f789d852140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'pg_autoscaler'
Dec 02 09:53:03 np0005541913.localdomain sudo[288109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:03 np0005541913.localdomain sudo[288109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:03 np0005541913.localdomain sudo[288109]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:03 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:03.458+0000 7f789d852140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'progress'
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'prometheus'
Dec 02 09:53:03 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:03.517+0000 7f789d852140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain sudo[288127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:53:03 np0005541913.localdomain sudo[288127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:03 np0005541913.localdomain sudo[288127]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:53:03 np0005541913.localdomain sudo[288151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:53:03 np0005541913.localdomain systemd[1]: tmp-crun.cnjDIn.mount: Deactivated successfully.
Dec 02 09:53:03 np0005541913.localdomain podman[288145]: 2025-12-02 09:53:03.69794011 +0000 UTC m=+0.063164071 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:53:03 np0005541913.localdomain sudo[288151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:03 np0005541913.localdomain podman[288145]: 2025-12-02 09:53:03.733129862 +0000 UTC m=+0.098353873 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:53:03 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:03.834+0000 7f789d852140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'rbd_support'
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 02 09:53:03 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'restful'
Dec 02 09:53:03 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:03.922+0000 7f789d852140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 02 09:53:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:53:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:53:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:53:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:53:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:53:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:53:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:53:04 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'rgw'
Dec 02 09:53:04 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 02 09:53:04 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'rook'
Dec 02 09:53:04 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:04.319+0000 7f789d852140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 02 09:53:04 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 02 09:53:04 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'selftest'
Dec 02 09:53:04 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:04.853+0000 7f789d852140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 02 09:53:04 np0005541913.localdomain podman[288252]: 2025-12-02 09:53:04.890815064 +0000 UTC m=+0.173040262 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, io.buildah.version=1.41.4, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 09:53:04 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 02 09:53:04 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'snap_schedule'
Dec 02 09:53:04 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:04.957+0000 7f789d852140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain podman[288252]: 2025-12-02 09:53:05.035943199 +0000 UTC m=+0.318168327 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vendor=Red Hat, Inc.)
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'stats'
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'status'
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'telegraf'
Dec 02 09:53:05 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:05.196+0000 7f789d852140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'telemetry'
Dec 02 09:53:05 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:05.258+0000 7f789d852140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'test_orchestrator'
Dec 02 09:53:05 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:05.398+0000 7f789d852140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'volumes'
Dec 02 09:53:05 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:05.566+0000 7f789d852140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain sudo[288151]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'zabbix'
Dec 02 09:53:05 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:05.771+0000 7f789d852140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:53:05.830+0000 7f789d852140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x564503731600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Dec 02 09:53:05 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.103:6800/3096645673
Dec 02 09:53:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:53:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:53:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:53:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152018 "" "Go-http-client/1.1"
Dec 02 09:53:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:53:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18201 "" "Go-http-client/1.1"
Dec 02 09:53:06 np0005541913.localdomain sudo[288355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:06 np0005541913.localdomain sudo[288355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:06 np0005541913.localdomain sudo[288355]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:06 np0005541913.localdomain sudo[288373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:06 np0005541913.localdomain sudo[288373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:06 np0005541913.localdomain sudo[288373]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:06.869 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:07 np0005541913.localdomain sudo[288391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:07 np0005541913.localdomain sudo[288391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:07 np0005541913.localdomain sudo[288391]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:08.044 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:11 np0005541913.localdomain sudo[288409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:11 np0005541913.localdomain sudo[288409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:53:11 np0005541913.localdomain sudo[288409]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain podman[288427]: 2025-12-02 09:53:11.163389895 +0000 UTC m=+0.072365788 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 02 09:53:11 np0005541913.localdomain podman[288427]: 2025-12-02 09:53:11.173931508 +0000 UTC m=+0.082907401 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS)
Dec 02 09:53:11 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:53:11 np0005541913.localdomain sudo[288446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:53:11 np0005541913.localdomain sudo[288446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288446]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:53:11 np0005541913.localdomain sudo[288464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288464]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:11 np0005541913.localdomain sudo[288482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288482]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:11 np0005541913.localdomain sudo[288500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288500]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:11 np0005541913.localdomain sudo[288518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288518]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:11 np0005541913.localdomain sudo[288552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288552]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:11 np0005541913.localdomain sudo[288570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288570]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:53:11 np0005541913.localdomain sudo[288588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288588]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:11 np0005541913.localdomain sudo[288606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288606]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:11 np0005541913.localdomain sudo[288624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288624]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:11.872 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:11 np0005541913.localdomain sudo[288642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:11 np0005541913.localdomain sudo[288642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288642]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:11 np0005541913.localdomain sudo[288660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:11 np0005541913.localdomain sudo[288660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:11 np0005541913.localdomain sudo[288660]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain sudo[288678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:12 np0005541913.localdomain sudo[288678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain sudo[288678]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain sudo[288712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:12 np0005541913.localdomain sudo[288712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain sudo[288712]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain sudo[288730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:12 np0005541913.localdomain sudo[288730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain sudo[288730]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain sudo[288748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:12 np0005541913.localdomain sudo[288748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain sudo[288748]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain sudo[288766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:53:12 np0005541913.localdomain sudo[288766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:53:12 np0005541913.localdomain sudo[288766]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain sudo[288790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:53:12 np0005541913.localdomain sudo[288790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain sudo[288790]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain podman[288784]: 2025-12-02 09:53:12.577365687 +0000 UTC m=+0.093229076 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:53:12 np0005541913.localdomain podman[288784]: 2025-12-02 09:53:12.614198692 +0000 UTC m=+0.130062061 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:53:12 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:53:12 np0005541913.localdomain sudo[288818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:53:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:53:12 np0005541913.localdomain sudo[288818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain sudo[288818]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain systemd[1]: tmp-crun.JkDkmD.mount: Deactivated successfully.
Dec 02 09:53:12 np0005541913.localdomain podman[288842]: 2025-12-02 09:53:12.74265627 +0000 UTC m=+0.083667220 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 09:53:12 np0005541913.localdomain sudo[288849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:12 np0005541913.localdomain sudo[288849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain sudo[288849]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain podman[288842]: 2025-12-02 09:53:12.829397932 +0000 UTC m=+0.170408902 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Dec 02 09:53:12 np0005541913.localdomain sudo[288883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:53:12 np0005541913.localdomain sudo[288883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:53:12 np0005541913.localdomain sudo[288883]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:12 np0005541913.localdomain sudo[288920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:53:12 np0005541913.localdomain sudo[288920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:12 np0005541913.localdomain sudo[288920]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:13.044 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:13 np0005541913.localdomain sudo[288938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:53:13 np0005541913.localdomain sudo[288938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[288938]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[288956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:13 np0005541913.localdomain sudo[288956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[288956]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[288974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:13 np0005541913.localdomain sudo[288974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[288974]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[288992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:13 np0005541913.localdomain sudo[288992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[288992]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[289010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:53:13 np0005541913.localdomain sudo[289010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[289010]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[289028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:13 np0005541913.localdomain sudo[289028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[289028]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[289046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:53:13 np0005541913.localdomain sudo[289046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[289046]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[289080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:53:13 np0005541913.localdomain sudo[289080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[289080]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[289098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:53:13 np0005541913.localdomain sudo[289098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[289098]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:13 np0005541913.localdomain sudo[289116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:13 np0005541913.localdomain sudo[289116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:13 np0005541913.localdomain sudo[289116]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:14 np0005541913.localdomain sudo[289134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:14 np0005541913.localdomain sudo[289134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:14 np0005541913.localdomain sudo[289134]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:16.913 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:18.085 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:19 np0005541913.localdomain sudo[289152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:53:19 np0005541913.localdomain sudo[289152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:19 np0005541913.localdomain sudo[289152]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:19 np0005541913.localdomain sudo[289170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:19 np0005541913.localdomain sudo[289170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:19 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x564503730f20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Dec 02 09:53:20 np0005541913.localdomain podman[289232]: 
Dec 02 09:53:20 np0005541913.localdomain podman[289232]: 2025-12-02 09:53:20.147007318 +0000 UTC m=+0.119971742 container create 2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_carver, version=7, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:53:20 np0005541913.localdomain podman[289232]: 2025-12-02 09:53:20.06223851 +0000 UTC m=+0.035202874 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: Started libpod-conmon-2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5.scope.
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:53:20 np0005541913.localdomain podman[289232]: 2025-12-02 09:53:20.224143903 +0000 UTC m=+0.197108277 container init 2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_carver, release=1763362218, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.component=rhceph-container)
Dec 02 09:53:20 np0005541913.localdomain podman[289232]: 2025-12-02 09:53:20.234162211 +0000 UTC m=+0.207126595 container start 2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_carver, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, name=rhceph, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 02 09:53:20 np0005541913.localdomain podman[289232]: 2025-12-02 09:53:20.23450408 +0000 UTC m=+0.207468444 container attach 2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_carver, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z)
Dec 02 09:53:20 np0005541913.localdomain blissful_carver[289247]: 167 167
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: libpod-2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5.scope: Deactivated successfully.
Dec 02 09:53:20 np0005541913.localdomain podman[289232]: 2025-12-02 09:53:20.238420225 +0000 UTC m=+0.211384599 container died 2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_carver, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-type=git, GIT_BRANCH=main, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:53:20 np0005541913.localdomain podman[289252]: 2025-12-02 09:53:20.3376447 +0000 UTC m=+0.087633296 container remove 2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_carver, name=rhceph, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1763362218)
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: libpod-conmon-2af7e1a5ec274f4dd7fe026a3e1fa8d050bbe96f9e795163bd1b95ccfe414ba5.scope: Deactivated successfully.
Dec 02 09:53:20 np0005541913.localdomain podman[289269]: 
Dec 02 09:53:20 np0005541913.localdomain podman[289269]: 2025-12-02 09:53:20.438813657 +0000 UTC m=+0.069633464 container create 9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wiles, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: Started libpod-conmon-9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689.scope.
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:53:20 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f84d1774558842f4b09f84ede41a5cf8226db5b4ec68ad4fe3cec4e8b8ff7ad0/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:20 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f84d1774558842f4b09f84ede41a5cf8226db5b4ec68ad4fe3cec4e8b8ff7ad0/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:20 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f84d1774558842f4b09f84ede41a5cf8226db5b4ec68ad4fe3cec4e8b8ff7ad0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:20 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f84d1774558842f4b09f84ede41a5cf8226db5b4ec68ad4fe3cec4e8b8ff7ad0/merged/var/lib/ceph/mon/ceph-np0005541913 supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:20 np0005541913.localdomain podman[289269]: 2025-12-02 09:53:20.4960495 +0000 UTC m=+0.126869307 container init 9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wiles, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:53:20 np0005541913.localdomain podman[289269]: 2025-12-02 09:53:20.511104792 +0000 UTC m=+0.141924589 container start 9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wiles, release=1763362218, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc.)
Dec 02 09:53:20 np0005541913.localdomain podman[289269]: 2025-12-02 09:53:20.5113776 +0000 UTC m=+0.142197387 container attach 9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wiles, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218)
Dec 02 09:53:20 np0005541913.localdomain podman[289269]: 2025-12-02 09:53:20.415520264 +0000 UTC m=+0.046340101 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: libpod-9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689.scope: Deactivated successfully.
Dec 02 09:53:20 np0005541913.localdomain podman[289269]: 2025-12-02 09:53:20.593866907 +0000 UTC m=+0.224686714 container died 9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wiles, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Dec 02 09:53:20 np0005541913.localdomain podman[289310]: 2025-12-02 09:53:20.67579413 +0000 UTC m=+0.071772812 container remove 9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wiles, release=1763362218, io.openshift.expose-services=, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: libpod-conmon-9e453dc76cbbf1996bcf91ec021b1a1078d8ed75d49edf498e46666457530689.scope: Deactivated successfully.
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:53:20 np0005541913.localdomain systemd-sysv-generator[289354]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:53:20 np0005541913.localdomain systemd-rc-local-generator[289349]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:20 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-1687bcfc1bb24bb3e2eb5968a8a8366954a907b3b6f50d31836b44880826458d-merged.mount: Deactivated successfully.
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: tmp-crun.wB9kTf.mount: Deactivated successfully.
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:53:21 np0005541913.localdomain systemd-rc-local-generator[289389]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:53:21 np0005541913.localdomain systemd-sysv-generator[289395]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: Starting Ceph mon.np0005541913 for c7c8e171-a193-56fb-95fa-8879fcfa7074...
Dec 02 09:53:21 np0005541913.localdomain podman[289455]: 
Dec 02 09:53:21 np0005541913.localdomain podman[289455]: 2025-12-02 09:53:21.865688394 +0000 UTC m=+0.051778517 container create 36af0ed2ef00d05ae4aad6f924c8b496242c4f2361918e4bd57717905928e70b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mon-np0005541913, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, build-date=2025-11-26T19:44:28Z, architecture=x86_64, release=1763362218, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 02 09:53:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:21.917 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:21 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/393e1e54e92e7ce105bdb9ae967dcd71a5af0f60b460340c2a56d8deb0a84a42/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:21 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/393e1e54e92e7ce105bdb9ae967dcd71a5af0f60b460340c2a56d8deb0a84a42/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:21 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/393e1e54e92e7ce105bdb9ae967dcd71a5af0f60b460340c2a56d8deb0a84a42/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:21 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/393e1e54e92e7ce105bdb9ae967dcd71a5af0f60b460340c2a56d8deb0a84a42/merged/var/lib/ceph/mon/ceph-np0005541913 supports timestamps until 2038 (0x7fffffff)
Dec 02 09:53:21 np0005541913.localdomain podman[289455]: 2025-12-02 09:53:21.934062294 +0000 UTC m=+0.120152427 container init 36af0ed2ef00d05ae4aad6f924c8b496242c4f2361918e4bd57717905928e70b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mon-np0005541913, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, name=rhceph, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4)
Dec 02 09:53:21 np0005541913.localdomain podman[289455]: 2025-12-02 09:53:21.943484486 +0000 UTC m=+0.129574629 container start 36af0ed2ef00d05ae4aad6f924c8b496242c4f2361918e4bd57717905928e70b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mon-np0005541913, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, RELEASE=main, name=rhceph, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=)
Dec 02 09:53:21 np0005541913.localdomain bash[289455]: 36af0ed2ef00d05ae4aad6f924c8b496242c4f2361918e4bd57717905928e70b
Dec 02 09:53:21 np0005541913.localdomain podman[289455]: 2025-12-02 09:53:21.844643131 +0000 UTC m=+0.030733284 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:53:21 np0005541913.localdomain systemd[1]: Started Ceph mon.np0005541913 for c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 09:53:21 np0005541913.localdomain sudo[289170]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: set uid:gid to 167:167 (ceph:ceph)
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: pidfile_write: ignore empty --pid-file
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: load: jerasure load: lrc 
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: RocksDB version: 7.9.2
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: Git sha 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: DB SUMMARY
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: DB Session ID:  OW4D0W92HOAH7R2F6LZX
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: CURRENT file:  CURRENT
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: IDENTITY file:  IDENTITY
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005541913/store.db dir, Total Num: 0, files: 
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005541913/store.db: 000004.log size: 761 ; 
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                         Options.error_if_exists: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                       Options.create_if_missing: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                         Options.paranoid_checks: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                                     Options.env: 0x561a19c049e0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                                Options.info_log: 0x561a1ab74d20
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.max_file_opening_threads: 16
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                              Options.statistics: (nil)
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                               Options.use_fsync: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                       Options.max_log_file_size: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                         Options.allow_fallocate: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                        Options.use_direct_reads: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:          Options.create_missing_column_families: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                              Options.db_log_dir: 
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                                 Options.wal_dir: 
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                   Options.advise_random_on_open: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                    Options.write_buffer_manager: 0x561a1ab85540
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                            Options.rate_limiter: (nil)
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.unordered_write: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                               Options.row_cache: None
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                              Options.wal_filter: None
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.allow_ingest_behind: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.two_write_queues: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.manual_wal_flush: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.wal_compression: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.atomic_flush: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                 Options.log_readahead_size: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.allow_data_in_errors: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.db_host_id: __hostname__
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.max_background_jobs: 2
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.max_background_compactions: -1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.max_subcompactions: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.max_total_wal_size: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                          Options.max_open_files: -1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                          Options.bytes_per_sync: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:       Options.compaction_readahead_size: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.max_background_flushes: -1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: Compression algorithms supported:
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         kZSTD supported: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         kXpressCompression supported: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         kBZip2Compression supported: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         kLZ4Compression supported: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         kZlibCompression supported: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         kLZ4HCCompression supported: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         kSnappyCompression supported: 1
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005541913/store.db/MANIFEST-000005
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:           Options.merge_operator: 
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:        Options.compaction_filter: None
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561a1ab74980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x561a1ab71350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:        Options.write_buffer_size: 33554432
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:  Options.max_write_buffer_number: 2
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:          Options.compression: NoCompression
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 09:53:21 np0005541913.localdomain ceph-mon[289473]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.num_levels: 7
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                   Options.table_properties_collectors: 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                           Options.bloom_locality: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                               Options.ttl: 2592000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                       Options.enable_blob_files: false
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                           Options.min_blob_size: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005541913/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669202000538, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669202003028, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669201, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669202003158, "job": 1, "event": "recovery_finished"}
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x561a1ab98e00
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: DB pointer 0x561a1ac8e000
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913 does not exist in monmap, will attempt to join an existing cluster
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x561a1ab71350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.8e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: using public_addr v2:172.18.0.107:0/0 -> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0]
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: starting mon.np0005541913 rank -1 at public addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] at bind addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005541913 fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(???) e0 preinit fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing) e4 sync_obtain_latest_monmap
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing) e4 sync_obtain_latest_monmap obtained monmap e4
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing).mds e16 new map
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        15
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-12-02T08:05:53.424954+0000
                                                           modified        2025-12-02T09:52:13.505190+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        84
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26573}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26573 members: 26573
                                                           [mds.mds.np0005541912.ghcwcm{0:26573} state up:active seq 13 addr [v2:172.18.0.106:6808/955707462,v1:172.18.0.106:6809/955707462] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005541914.sqgqkj{-1:16923} state up:standby seq 1 addr [v2:172.18.0.108:6808/2216063099,v1:172.18.0.108:6809/2216063099] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005541913.maexpe{-1:26386} state up:standby seq 1 addr [v2:172.18.0.107:6808/3746047079,v1:172.18.0.107:6809/3746047079] compat {c=[1],r=[1],i=[17ff]}]
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing).osd e85 crush map has features 3314933000852226048, adjusting msgr requires
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3921: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.106:0/3860598798' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.106:0/2713840862' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3922: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17058 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541912.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mgr to host np0005541912.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3923: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17064 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541913.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mgr to host np0005541913.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17070 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541914.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mgr to host np0005541914.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3924: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17076 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Saving service mgr spec with placement label:mgr
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Deploying daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3925: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17082 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Deploying daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3926: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17094 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541909.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mon to host np0005541909.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17100 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541909.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label _admin to host np0005541909.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Deploying daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3927: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17109 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541910.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mon to host np0005541910.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17115 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541910.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label _admin to host np0005541910.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Standby manager daemon np0005541912.qwddia started
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3928: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mgrmap e12: np0005541909.kfesnk(active, since 2h), standbys: np0005541911.adcgiw, np0005541910.kzipdo, np0005541912.qwddia
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mgr metadata", "who": "np0005541912.qwddia", "id": "np0005541912.qwddia"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.32:0/1830186127' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.32:0/1830186127' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17127 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541911.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mon to host np0005541911.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Standby manager daemon np0005541913.mfesdm started
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3929: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17142 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541911.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label _admin to host np0005541911.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17148 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541912.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mgrmap e13: np0005541909.kfesnk(active, since 2h), standbys: np0005541911.adcgiw, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mgr metadata", "who": "np0005541913.mfesdm", "id": "np0005541913.mfesdm"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mon to host np0005541912.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3930: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Standby manager daemon np0005541914.lljzmk started
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17154 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541912.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label _admin to host np0005541912.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mgrmap e14: np0005541909.kfesnk(active, since 2h), standbys: np0005541911.adcgiw, np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mgr metadata", "who": "np0005541914.lljzmk", "id": "np0005541914.lljzmk"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3931: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17160 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541913.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mon to host np0005541913.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17166 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541913.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label _admin to host np0005541913.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3932: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17172 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541914.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label mon to host np0005541914.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3933: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17178 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005541914.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Added label _admin to host np0005541914.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17184 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Saving service mon spec with placement label:mon
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3934: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='client.17190 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541912", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: Deploying daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: pgmap v3935: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Dec 02 09:53:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:23.088 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:53:24 np0005541913.localdomain podman[289512]: 2025-12-02 09:53:24.453900549 +0000 UTC m=+0.095869467 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Dec 02 09:53:24 np0005541913.localdomain podman[289512]: 2025-12-02 09:53:24.465817418 +0000 UTC m=+0.107786396 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 02 09:53:24 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:53:26 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x5645037311e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Dec 02 09:53:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:26.962 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:27 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(probing) e4  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Dec 02 09:53:27 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(probing) e4  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Dec 02 09:53:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:28.132 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:28 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@-1(probing) e5  my rank is now 4 (was -1)
Dec 02 09:53:28 np0005541913.localdomain ceph-mon[289473]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:53:28 np0005541913.localdomain ceph-mon[289473]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 02 09:53:28 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:53:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:53:29 np0005541913.localdomain podman[289531]: 2025-12-02 09:53:29.445778133 +0000 UTC m=+0.085299214 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 09:53:29 np0005541913.localdomain podman[289531]: 2025-12-02 09:53:29.480073101 +0000 UTC m=+0.119594182 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 09:53:29 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:53:29 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e5 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e5 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: pgmap v3936: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: Deploying daemon mon.np0005541913 on np0005541913.localdomain
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541909"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541910"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541909 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: pgmap v3937: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: pgmap v3938: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541909 is new leader, mons np0005541909,np0005541911,np0005541910,np0005541914 in quorum (ranks 0,1,2,3)
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: monmap epoch 4
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:53:19.558333+0000
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541909
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: osdmap e85: 6 total, 6 up, 6 in
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mgrmap e14: np0005541909.kfesnk(active, since 2h), standbys: np0005541911.adcgiw, np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: overall HEALTH_OK
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: Deploying daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mgrc update_daemon_metadata mon.np0005541913 metadata {addrs=[v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005541913.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005541913.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541909"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541910"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541909 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: pgmap v3940: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: pgmap v3941: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541909 is new leader, mons np0005541909,np0005541911,np0005541910,np0005541914,np0005541913 in quorum (ranks 0,1,2,3,4)
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: monmap epoch 5
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:53:26.303070+0000
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541909
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: osdmap e85: 6 total, 6 up, 6 in
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mgrmap e14: np0005541909.kfesnk(active, since 2h), standbys: np0005541911.adcgiw, np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: overall HEALTH_OK
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Dec 02 09:53:31 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x564503730f20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: paxos.4).electionLogic(22) init, last seen epoch 22
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:53:31 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:53:31 np0005541913.localdomain sudo[289550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:31 np0005541913.localdomain sudo[289550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:31 np0005541913.localdomain sudo[289550]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:32.007 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:32 np0005541913.localdomain sudo[289568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:53:32 np0005541913.localdomain sudo[289568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:32 np0005541913.localdomain sudo[289568]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:32 np0005541913.localdomain sudo[289586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:53:32 np0005541913.localdomain sudo[289586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:53:32 np0005541913.localdomain podman[289648]: 2025-12-02 09:53:32.875639706 +0000 UTC m=+0.062375241 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:53:32 np0005541913.localdomain podman[289648]: 2025-12-02 09:53:32.888484329 +0000 UTC m=+0.075219854 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Dec 02 09:53:32 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:53:33 np0005541913.localdomain systemd[1]: tmp-crun.Ur2mDe.mount: Deactivated successfully.
Dec 02 09:53:33 np0005541913.localdomain podman[289699]: 2025-12-02 09:53:33.075560286 +0000 UTC m=+0.077965328 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:53:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:33.172 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:33 np0005541913.localdomain podman[289699]: 2025-12-02 09:53:33.236083251 +0000 UTC m=+0.238488303 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1763362218, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, version=7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.)
Dec 02 09:53:33 np0005541913.localdomain sudo[289586]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:53:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:53:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:53:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:53:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:53:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:53:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:53:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:53:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:53:34 np0005541913.localdomain systemd[1]: tmp-crun.GRMT26.mount: Deactivated successfully.
Dec 02 09:53:34 np0005541913.localdomain podman[289819]: 2025-12-02 09:53:34.468279817 +0000 UTC m=+0.105483864 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:53:34 np0005541913.localdomain podman[289819]: 2025-12-02 09:53:34.475821159 +0000 UTC m=+0.113025206 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:53:34 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:53:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:53:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:53:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:53:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:53:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:53:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18689 "" "Go-http-client/1.1"
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: pgmap v3942: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541909"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541910"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541909 calling monitor election
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913 calling monitor election
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='client.17200 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541912", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: pgmap v3943: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541912 calling monitor election
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: pgmap v3944: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mon.np0005541909 is new leader, mons np0005541909,np0005541911,np0005541910,np0005541914,np0005541913,np0005541912 in quorum (ranks 0,1,2,3,4,5)
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: monmap epoch 6
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:53:31.525725+0000
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541909
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005541912
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: osdmap e85: 6 total, 6 up, 6 in
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: mgrmap e14: np0005541909.kfesnk(active, since 2h), standbys: np0005541911.adcgiw, np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: overall HEALTH_OK
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:36 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:36 np0005541913.localdomain sudo[289843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:53:36 np0005541913.localdomain sudo[289843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:36 np0005541913.localdomain sudo[289843]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:36 np0005541913.localdomain sudo[289861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:53:36 np0005541913.localdomain sudo[289861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:37.011 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:37 np0005541913.localdomain sudo[289861]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:37 np0005541913.localdomain sudo[289909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:53:37 np0005541913.localdomain sudo[289909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:37 np0005541913.localdomain sudo[289909]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:37 np0005541913.localdomain sudo[289927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:53:37 np0005541913.localdomain sudo[289927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:37 np0005541913.localdomain sudo[289927]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:37 np0005541913.localdomain sudo[289945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:37 np0005541913.localdomain sudo[289945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:37 np0005541913.localdomain sudo[289945]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:37 np0005541913.localdomain sudo[289963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:37 np0005541913.localdomain sudo[289963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:37 np0005541913.localdomain sudo[289963]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[289981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:38 np0005541913.localdomain sudo[289981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[289981]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:38 np0005541913.localdomain sudo[290015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290015]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:38.220 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:38 np0005541913.localdomain sudo[290033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:38 np0005541913.localdomain sudo[290033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290033]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:53:38 np0005541913.localdomain sudo[290051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290051]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:38 np0005541913.localdomain sudo[290069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290069]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:38 np0005541913.localdomain sudo[290087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290087]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:38 np0005541913.localdomain sudo[290105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290105]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:38 np0005541913.localdomain sudo[290123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290123]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:38 np0005541913.localdomain sudo[290141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290141]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:38 np0005541913.localdomain sudo[290175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: pgmap v3945: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: Updating np0005541909.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: from='client.17208 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541912", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:53:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:38 np0005541913.localdomain sudo[290175]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:38 np0005541913.localdomain sudo[290193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290193]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:38 np0005541913.localdomain sudo[290211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:38 np0005541913.localdomain sudo[290211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:38 np0005541913.localdomain sudo[290211]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:39 np0005541913.localdomain sudo[290229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:39 np0005541913.localdomain sudo[290229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:39 np0005541913.localdomain sudo[290229]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: Updating np0005541909.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.843556) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669219843696, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 10010, "num_deletes": 255, "total_data_size": 10633068, "memory_usage": 10933432, "flush_reason": "Manual Compaction"}
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669219905488, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 9090960, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 10015, "table_properties": {"data_size": 9037841, "index_size": 28245, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23813, "raw_key_size": 250141, "raw_average_key_size": 26, "raw_value_size": 8876134, "raw_average_value_size": 934, "num_data_blocks": 1084, "num_entries": 9501, "num_filter_entries": 9501, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669202, "oldest_key_time": 1764669202, "file_creation_time": 1764669219, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 62028 microseconds, and 21923 cpu microseconds.
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.905581) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 9090960 bytes OK
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.905643) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.907785) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.907812) EVENT_LOG_v1 {"time_micros": 1764669219907804, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.907835) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 10563943, prev total WAL file size 10563943, number of live WAL files 2.
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.909858) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(8877KB) 8(1887B)]
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669219909998, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 9092847, "oldest_snapshot_seqno": -1}
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9249 keys, 9087051 bytes, temperature: kUnknown
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669219979600, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 9087051, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9034600, "index_size": 28222, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23173, "raw_key_size": 245334, "raw_average_key_size": 26, "raw_value_size": 8876192, "raw_average_value_size": 959, "num_data_blocks": 1083, "num_entries": 9249, "num_filter_entries": 9249, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669201, "oldest_key_time": 0, "file_creation_time": 1764669219, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.979961) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 9087051 bytes
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.981680) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 130.7 rd, 130.7 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(8.7, 0.0 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 9506, records dropped: 257 output_compression: NoCompression
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.981703) EVENT_LOG_v1 {"time_micros": 1764669219981692, "job": 4, "event": "compaction_finished", "compaction_time_micros": 69550, "compaction_time_cpu_micros": 30272, "output_level": 6, "num_output_files": 1, "total_output_size": 9087051, "num_input_records": 9506, "num_output_records": 9249, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669219982884, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669219982931, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 02 09:53:39 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:53:39.909679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: pgmap v3946: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: from='client.17214 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541913", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541909 (monmap changed)...
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541909 on np0005541909.localdomain
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541909.kfesnk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:53:41 np0005541913.localdomain podman[290248]: 2025-12-02 09:53:41.450022965 +0000 UTC m=+0.081875911 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 02 09:53:41 np0005541913.localdomain podman[290248]: 2025-12-02 09:53:41.487252223 +0000 UTC m=+0.119105169 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 02 09:53:41 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:53:41 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541909.kfesnk (monmap changed)...
Dec 02 09:53:41 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541909.kfesnk on np0005541909.localdomain
Dec 02 09:53:41 np0005541913.localdomain ceph-mon[289473]: from='client.34107 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541914", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:53:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541909.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:53:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:42.065 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='client.? 172.18.0.103:0/3005476938' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: pgmap v3947: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541909 (monmap changed)...
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541909 on np0005541909.localdomain
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541910.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:42 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.103:0/3005476938' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:53:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:43.261 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:53:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:53:43 np0005541913.localdomain podman[290266]: 2025-12-02 09:53:43.457554742 +0000 UTC m=+0.089566338 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec 02 09:53:43 np0005541913.localdomain podman[290266]: 2025-12-02 09:53:43.50308805 +0000 UTC m=+0.135099666 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:53:43 np0005541913.localdomain systemd[1]: tmp-crun.qJ6DeV.mount: Deactivated successfully.
Dec 02 09:53:43 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:53:43 np0005541913.localdomain podman[290265]: 2025-12-02 09:53:43.51092567 +0000 UTC m=+0.146075290 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:53:43 np0005541913.localdomain podman[290265]: 2025-12-02 09:53:43.594207569 +0000 UTC m=+0.229357159 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:53:43 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:53:43 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541910 (monmap changed)...
Dec 02 09:53:43 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541910 on np0005541910.localdomain
Dec 02 09:53:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:53:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:53:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:43 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.103:0/3224647752' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 02 09:53:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:44.369 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:44.388 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:44.388 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:44.388 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:53:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:44.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:44.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:53:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:44.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: pgmap v3948: 177 pgs: 177 active+clean; 104 MiB data, 544 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541910 (monmap changed)...
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541910 on np0005541910.localdomain
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541910.kzipdo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' 
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:53:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.239 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.239 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.240 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.240 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).osd e85 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).osd e85 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).osd e86 e86: 6 total, 6 up, 6 in
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541909"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541909"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541910"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541910"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541911"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541912"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541913"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005541914.sqgqkj"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mds metadata", "who": "mds.np0005541914.sqgqkj"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).mds e16 all = 0
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005541913.maexpe"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mds metadata", "who": "mds.np0005541913.maexpe"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).mds e16 all = 0
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005541912.ghcwcm"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mds metadata", "who": "mds.np0005541912.ghcwcm"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).mds e16 all = 0
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005541911.adcgiw", "id": "np0005541911.adcgiw"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541911.adcgiw", "id": "np0005541911.adcgiw"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005541914.lljzmk", "id": "np0005541914.lljzmk"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541914.lljzmk", "id": "np0005541914.lljzmk"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005541910.kzipdo", "id": "np0005541910.kzipdo"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541910.kzipdo", "id": "np0005541910.kzipdo"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005541913.mfesdm", "id": "np0005541913.mfesdm"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541913.mfesdm", "id": "np0005541913.mfesdm"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005541912.qwddia", "id": "np0005541912.qwddia"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541912.qwddia", "id": "np0005541912.qwddia"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mds metadata"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mds metadata"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).mds e16 all = 1
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mon metadata"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain sshd[25930]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain sshd[26085]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain sshd[26121]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-27.scope: Consumed 3min 25.944s CPU time.
Dec 02 09:53:45 np0005541913.localdomain sshd[26066]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain sshd[26102]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain sshd[26028]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain sshd[25971]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain sshd[26047]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain sshd[25912]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain sshd[25990]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain sshd[25952]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain sshd[26009]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-15.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 17 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 18 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 25 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 27 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 26 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 24 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 22 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 23 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 20 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 19 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 21 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Session 15 logged out. Waiting for processes to exit.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 25.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 27.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 26.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 24.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 17.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 21.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 20.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 15.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 23.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 19.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 18.
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: Removed session 22.
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541911.adcgiw/mirror_snapshot_schedule"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541911.adcgiw/mirror_snapshot_schedule"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541911.adcgiw/trash_purge_schedule"} v 0)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541911.adcgiw/trash_purge_schedule"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.590 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.605 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.605 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:53:45 np0005541913.localdomain sshd[290313]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:53:45 np0005541913.localdomain sshd[290313]: Accepted publickey for ceph-admin from 192.168.122.105 port 51822 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 09:53:45 np0005541913.localdomain systemd-logind[757]: New session 65 of user ceph-admin.
Dec 02 09:53:45 np0005541913.localdomain systemd[1]: Started Session 65 of User ceph-admin.
Dec 02 09:53:45 np0005541913.localdomain sshd[290313]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 09:53:45 np0005541913.localdomain sudo[290317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:53:45 np0005541913.localdomain sudo[290317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:45 np0005541913.localdomain sudo[290317]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:45.829 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:45 np0005541913.localdomain sudo[290335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:53:45 np0005541913.localdomain sudo[290335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541910.kzipdo (monmap changed)...
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541910.kzipdo on np0005541910.localdomain
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541911 (monmap changed)...
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14120 172.18.0.103:0/408290768' entity='mgr.np0005541909.kfesnk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.103:0/1327578721' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: Activating manager daemon np0005541911.adcgiw
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: osdmap e86: 6 total, 6 up, 6 in
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.103:0/1327578721' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: mgrmap e15: np0005541911.adcgiw(active, starting, since 0.0533978s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541909"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541910"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mds metadata", "who": "mds.np0005541914.sqgqkj"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mds metadata", "who": "mds.np0005541913.maexpe"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mds metadata", "who": "mds.np0005541912.ghcwcm"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541911.adcgiw", "id": "np0005541911.adcgiw"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541914.lljzmk", "id": "np0005541914.lljzmk"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541910.kzipdo", "id": "np0005541910.kzipdo"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541913.mfesdm", "id": "np0005541913.mfesdm"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541912.qwddia", "id": "np0005541912.qwddia"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mds metadata"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: Manager daemon np0005541911.adcgiw is now available
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541911.adcgiw/mirror_snapshot_schedule"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541911.adcgiw/mirror_snapshot_schedule"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541911.adcgiw/trash_purge_schedule"} : dispatch
Dec 02 09:53:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541911.adcgiw/trash_purge_schedule"} : dispatch
Dec 02 09:53:46 np0005541913.localdomain systemd[1]: tmp-crun.iYHfDB.mount: Deactivated successfully.
Dec 02 09:53:46 np0005541913.localdomain podman[290424]: 2025-12-02 09:53:46.584800283 +0000 UTC m=+0.069811169 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Dec 02 09:53:46 np0005541913.localdomain podman[290424]: 2025-12-02 09:53:46.662512023 +0000 UTC m=+0.147522939 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.41.4, ceph=True, release=1763362218, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:53:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:46.825 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:46.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:46.849 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:53:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:46.850 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:53:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:46.851 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:53:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:46.851 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:53:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:46.851 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).osd e86 _set_new_cache_sizes cache_size:1019817753 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain.devices.0}] v 0)
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain}] v 0)
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.105 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:47 np0005541913.localdomain sudo[290335]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541909.localdomain.devices.0}] v 0)
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.326 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mgrmap e16: np0005541911.adcgiw(active, since 1.07358s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:53:46] ENGINE Bus STARTING
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.107:0/969090503' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541910.localdomain.devices.0}] v 0)
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.382 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.383 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541909.localdomain}] v 0)
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541910.localdomain}] v 0)
Dec 02 09:53:47 np0005541913.localdomain sudo[290567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:53:47 np0005541913.localdomain sudo[290567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:47 np0005541913.localdomain sudo[290567]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.556 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.557 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11837MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.557 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.558 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:53:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:53:47 np0005541913.localdomain sudo[290585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:53:47 np0005541913.localdomain sudo[290585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.689 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.690 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.690 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:53:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:47.760 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:53:48 np0005541913.localdomain sudo[290585]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:48.266 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:53:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:48.308 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:53:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:48.310 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:48.326 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:53:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:48.328 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:53:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:48.328 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:53:46] ENGINE Serving on https://172.18.0.105:7150
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:53:46] ENGINE Client ('172.18.0.105', 60410) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:53:47] ENGINE Serving on http://172.18.0.105:8765
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:53:47] ENGINE Bus STARTED
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.107:0/22063984' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain.devices.0}] v 0)
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain}] v 0)
Dec 02 09:53:48 np0005541913.localdomain sudo[290658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005541911", "name": "osd_memory_target"} v 0)
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541911", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:48 np0005541913.localdomain sudo[290658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:48 np0005541913.localdomain sudo[290658]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:48 np0005541913.localdomain sudo[290676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 09:53:48 np0005541913.localdomain sudo[290676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541910.localdomain.devices.0}] v 0)
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541910.localdomain}] v 0)
Dec 02 09:53:48 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541909.localdomain.devices.0}] v 0)
Dec 02 09:53:48 np0005541913.localdomain sudo[290676]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005541910", "name": "osd_memory_target"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541910", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541909.localdomain}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005541909", "name": "osd_memory_target"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541909", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain sudo[290713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:53:49 np0005541913.localdomain sudo[290713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290713]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:49.329 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:53:49 np0005541913.localdomain sudo[290731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:53:49 np0005541913.localdomain sudo[290731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290731]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain sudo[290749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:49 np0005541913.localdomain sudo[290749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290749]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mgrmap e17: np0005541911.adcgiw(active, since 3s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541911", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541911", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.108:0/1424767511' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541910", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541910", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541909", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd/host:np0005541909", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain sudo[290767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:49 np0005541913.localdomain sudo[290767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290767]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain sudo[290785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:49 np0005541913.localdomain sudo[290785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290785]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:53:49 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3338051788' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:49 np0005541913.localdomain sudo[290819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:49 np0005541913.localdomain sudo[290819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290819]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain sudo[290837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:53:49 np0005541913.localdomain sudo[290837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290837]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain sudo[290855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:53:49 np0005541913.localdomain sudo[290855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290855]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain sudo[290873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:49 np0005541913.localdomain sudo[290873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290873]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:49 np0005541913.localdomain sudo[290891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:49 np0005541913.localdomain sudo[290891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:49 np0005541913.localdomain sudo[290891]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[290909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:50 np0005541913.localdomain sudo[290909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[290909]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[290927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:50 np0005541913.localdomain sudo[290927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[290927]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[290945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:50 np0005541913.localdomain sudo[290945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[290945]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[290979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:50 np0005541913.localdomain sudo[290979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[290979]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[290997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:53:50 np0005541913.localdomain sudo[290997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[290997]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:50 np0005541913.localdomain sudo[291015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[291015]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:53:50 np0005541913.localdomain sudo[291033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Updating np0005541909.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.108:0/3338051788' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: mgrmap e18: np0005541911.adcgiw(active, since 5s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia
Dec 02 09:53:50 np0005541913.localdomain ceph-mon[289473]: Standby manager daemon np0005541909.kfesnk started
Dec 02 09:53:50 np0005541913.localdomain sudo[291033]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:53:50 np0005541913.localdomain sudo[291051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[291051]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:53:50 np0005541913.localdomain sudo[291069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[291069]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:50 np0005541913.localdomain sudo[291087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[291087]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:53:50 np0005541913.localdomain sudo[291105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[291105]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:53:50 np0005541913.localdomain sudo[291139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[291139]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:53:50 np0005541913.localdomain sudo[291158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[291158]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:50 np0005541913.localdomain sudo[291176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:50 np0005541913.localdomain sudo[291176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:50 np0005541913.localdomain sudo[291176]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain sudo[291194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:51 np0005541913.localdomain sudo[291194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:51 np0005541913.localdomain sudo[291194]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain sudo[291212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:53:51 np0005541913.localdomain sudo[291212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:51 np0005541913.localdomain sudo[291212]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain sudo[291230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:53:51 np0005541913.localdomain sudo[291230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:51 np0005541913.localdomain sudo[291230]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain sudo[291248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:53:51 np0005541913.localdomain sudo[291248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:51 np0005541913.localdomain sudo[291248]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain sudo[291266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:53:51 np0005541913.localdomain sudo[291266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:51 np0005541913.localdomain sudo[291266]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005541909.kfesnk", "id": "np0005541909.kfesnk"} v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541909.kfesnk", "id": "np0005541909.kfesnk"} : dispatch
Dec 02 09:53:51 np0005541913.localdomain sudo[291300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:53:51 np0005541913.localdomain sudo[291300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:51 np0005541913.localdomain sudo[291300]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain sudo[291318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:53:51 np0005541913.localdomain sudo[291318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:51 np0005541913.localdomain sudo[291318]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain.devices.0}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain}] v 0)
Dec 02 09:53:51 np0005541913.localdomain sudo[291336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:51 np0005541913.localdomain sudo[291336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:51 np0005541913.localdomain sudo[291336]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541909.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541909.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mgrmap e19: np0005541911.adcgiw(active, since 6s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr metadata", "who": "np0005541909.kfesnk", "id": "np0005541909.kfesnk"} : dispatch
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.106:0/3009783839' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541909.localdomain.devices.0}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541909.localdomain}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541910.localdomain.devices.0}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541910.localdomain}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 02 09:53:51 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).osd e86 _set_new_cache_sizes cache_size:1020050624 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:53:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:52.153 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:52 np0005541913.localdomain sudo[291354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:53:52 np0005541913.localdomain sudo[291354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:53:52 np0005541913.localdomain sudo[291354]: pam_unix(sudo:session): session closed for user root
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: Updating np0005541909.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 23 op/s
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.106:0/4070629179' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:53:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain.devices.0}] v 0)
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain}] v 0)
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005541911.adcgiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541911.adcgiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:53.310 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541911 (monmap changed)...
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541911.adcgiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541911.adcgiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain.devices.0}] v 0)
Dec 02 09:53:53 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain}] v 0)
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541911.adcgiw (monmap changed)...
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541911.adcgiw on np0005541911.localdomain
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain.devices.0}] v 0)
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541911.localdomain}] v 0)
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:54 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 02 09:53:55 np0005541913.localdomain podman[291372]: 2025-12-02 09:53:55.419164263 +0000 UTC m=+0.061659881 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:53:55 np0005541913.localdomain podman[291372]: 2025-12-02 09:53:55.434169794 +0000 UTC m=+0.076665392 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 09:53:55 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541911 (monmap changed)...
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541911 on np0005541911.localdomain
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:55 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.2 (monmap changed)...
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:56 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:57 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054660 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:53:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:57.155 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:57 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:53:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:53:58.338 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:58 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.5 (monmap changed)...
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='client.34161 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:53:59 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:54:00 np0005541913.localdomain podman[291392]: 2025-12-02 09:54:00.418931049 +0000 UTC m=+0.061839466 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 02 09:54:00 np0005541913.localdomain podman[291392]: 2025-12-02 09:54:00.452034315 +0000 UTC m=+0.094942642 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:54:00 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 ' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:54:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:01 np0005541913.localdomain sudo[291411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:01 np0005541913.localdomain sudo[291411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:01 np0005541913.localdomain sudo[291411]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:01 np0005541913.localdomain sudo[291429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:01 np0005541913.localdomain sudo[291429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "quorum_status"} : dispatch
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e6 handle_command mon_command({"prefix": "mon rm", "name": "np0005541909"} v 0)
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [INF] : from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon rm", "name": "np0005541909"} : dispatch
Dec 02 09:54:01 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x564503730f20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Dec 02 09:54:01 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0
Dec 02 09:54:01 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@4(peon) e7  my rank is now 3 (was 4)
Dec 02 09:54:01 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Dec 02 09:54:01 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Dec 02 09:54:01 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x564503731600 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: paxos.3).electionLogic(26) init, last seen epoch 26
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:01 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:02 np0005541913.localdomain podman[291463]: 
Dec 02 09:54:02 np0005541913.localdomain podman[291463]: 2025-12-02 09:54:02.018814296 +0000 UTC m=+0.063700436 container create 39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_shamir, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, version=7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:54:02 np0005541913.localdomain systemd[1]: Started libpod-conmon-39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1.scope.
Dec 02 09:54:02 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:02 np0005541913.localdomain podman[291463]: 2025-12-02 09:54:02.0892096 +0000 UTC m=+0.134095780 container init 39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_shamir, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_BRANCH=main)
Dec 02 09:54:02 np0005541913.localdomain podman[291463]: 2025-12-02 09:54:02.000422203 +0000 UTC m=+0.045308393 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:02 np0005541913.localdomain podman[291463]: 2025-12-02 09:54:02.102876206 +0000 UTC m=+0.147762386 container start 39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_shamir, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=1763362218, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:54:02 np0005541913.localdomain podman[291463]: 2025-12-02 09:54:02.103356738 +0000 UTC m=+0.148242948 container attach 39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_shamir, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:54:02 np0005541913.localdomain systemd[1]: libpod-39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1.scope: Deactivated successfully.
Dec 02 09:54:02 np0005541913.localdomain wonderful_shamir[291478]: 167 167
Dec 02 09:54:02 np0005541913.localdomain podman[291463]: 2025-12-02 09:54:02.110939941 +0000 UTC m=+0.155826111 container died 39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_shamir, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:54:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:02.158 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:02 np0005541913.localdomain podman[291483]: 2025-12-02 09:54:02.218450118 +0000 UTC m=+0.092729162 container remove 39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_shamir, name=rhceph, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, release=1763362218, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True)
Dec 02 09:54:02 np0005541913.localdomain systemd[1]: libpod-conmon-39ee22f6ba3bffc7b09c003968c022dcd3c328395e2c35d25692411ecd6f64a1.scope: Deactivated successfully.
Dec 02 09:54:02 np0005541913.localdomain sudo[291429]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3e7778c046b51d74afe22d0b31a3bcdc570c202d83282664786bbf2e80d7bb91-merged.mount: Deactivated successfully.
Dec 02 09:54:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:54:03.038 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:54:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:54:03.039 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:54:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:54:03.041 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:54:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:54:03 np0005541913.localdomain podman[291500]: 2025-12-02 09:54:03.105673393 +0000 UTC m=+0.055830945 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 09:54:03 np0005541913.localdomain podman[291500]: 2025-12-02 09:54:03.12314575 +0000 UTC m=+0.073303382 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, version=9.6, config_id=edpm)
Dec 02 09:54:03 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:54:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:03.343 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:54:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:54:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:54:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:54:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:54:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:54:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:54:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:54:05 np0005541913.localdomain podman[291520]: 2025-12-02 09:54:05.446327324 +0000 UTC m=+0.078280546 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:54:05 np0005541913.localdomain podman[291520]: 2025-12-02 09:54:05.483076927 +0000 UTC m=+0.115030139 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:54:05 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:54:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:54:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:54:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:54:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:54:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:54:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18702 "" "Go-http-client/1.1"
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='client.34179 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005541909"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Remove daemons mon.np0005541909
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "quorum_status"} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Safe to remove mon.np0005541909: new quorum should be ['np0005541911', 'np0005541910', 'np0005541914', 'np0005541913', 'np0005541912'] (from ['np0005541911', 'np0005541910', 'np0005541914', 'np0005541913', 'np0005541912'])
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Removing monitor np0005541909 from monmap...
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon rm", "name": "np0005541909"} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Removing daemon mon.np0005541909 from np0005541909.localdomain -- ports []
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: mon.np0005541912 calling monitor election
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913 calling monitor election
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541910"} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 is new leader, mons np0005541911,np0005541910,np0005541913,np0005541912 in quorum (ranks 0,1,3,4)
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: monmap epoch 7
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:54:01.619349+0000
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005541912
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: osdmap e86: 6 total, 6 up, 6 in
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: mgrmap e19: np0005541911.adcgiw(active, since 21s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Health check failed: 1/5 mons down, quorum np0005541911,np0005541910,np0005541913,np0005541912 (MON_DOWN)
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005541911,np0005541910,np0005541913,np0005541912
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005541911,np0005541910,np0005541913,np0005541912
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]:     mon.np0005541914 (rank 2) addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] is down (out of quorum)
Dec 02 09:54:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:06 np0005541913.localdomain sudo[291543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:06 np0005541913.localdomain sudo[291543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:06 np0005541913.localdomain sudo[291543]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:06 np0005541913.localdomain sudo[291561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:06 np0005541913.localdomain sudo[291561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:07 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054730 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:07.207 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:07 np0005541913.localdomain podman[291596]: 
Dec 02 09:54:07 np0005541913.localdomain podman[291596]: 2025-12-02 09:54:07.35296415 +0000 UTC m=+0.038225655 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:08 np0005541913.localdomain podman[291596]: 2025-12-02 09:54:08.085156405 +0000 UTC m=+0.770417830 container create 32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_jennings, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.41.4, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, version=7, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True)
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.0 (monmap changed)...
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.32:0/3328361141' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.32:0/3328361141' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 09:54:08 np0005541913.localdomain systemd[1]: Started libpod-conmon-32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340.scope.
Dec 02 09:54:08 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:08 np0005541913.localdomain podman[291596]: 2025-12-02 09:54:08.170097639 +0000 UTC m=+0.855359064 container init 32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_jennings, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, version=7, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Dec 02 09:54:08 np0005541913.localdomain podman[291596]: 2025-12-02 09:54:08.182574702 +0000 UTC m=+0.867836127 container start 32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_jennings, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, name=rhceph, ceph=True, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:54:08 np0005541913.localdomain podman[291596]: 2025-12-02 09:54:08.182952852 +0000 UTC m=+0.868214277 container attach 32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_jennings, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, release=1763362218)
Dec 02 09:54:08 np0005541913.localdomain silly_jennings[291612]: 167 167
Dec 02 09:54:08 np0005541913.localdomain podman[291596]: 2025-12-02 09:54:08.187884184 +0000 UTC m=+0.873145659 container died 32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_jennings, io.openshift.tags=rhceph ceph, distribution-scope=public, release=1763362218, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:54:08 np0005541913.localdomain systemd[1]: libpod-32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340.scope: Deactivated successfully.
Dec 02 09:54:08 np0005541913.localdomain podman[291617]: 2025-12-02 09:54:08.301868595 +0000 UTC m=+0.099681649 container remove 32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_jennings, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True)
Dec 02 09:54:08 np0005541913.localdomain systemd[1]: libpod-conmon-32b8df09b8068be6d015ad8d703309ff874c3c369a6c4084cf6ec5bf8e438340.scope: Deactivated successfully.
Dec 02 09:54:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:08.374 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:08 np0005541913.localdomain sudo[291561]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:08 np0005541913.localdomain sudo[291641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:08 np0005541913.localdomain sudo[291641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:08 np0005541913.localdomain sudo[291641]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:08 np0005541913.localdomain sudo[291659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:08 np0005541913.localdomain sudo[291659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:08 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ae2a51311215695b268b5158db2a55777facf085935d077086282611735a4371-merged.mount: Deactivated successfully.
Dec 02 09:54:09 np0005541913.localdomain podman[291695]: 
Dec 02 09:54:09 np0005541913.localdomain podman[291695]: 2025-12-02 09:54:09.167998394 +0000 UTC m=+0.070196789 container create 1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_curie, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, architecture=x86_64, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.)
Dec 02 09:54:09 np0005541913.localdomain systemd[1]: Started libpod-conmon-1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d.scope.
Dec 02 09:54:09 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:09 np0005541913.localdomain podman[291695]: 2025-12-02 09:54:09.226947692 +0000 UTC m=+0.129146087 container init 1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_curie, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.41.4, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, com.redhat.component=rhceph-container)
Dec 02 09:54:09 np0005541913.localdomain compassionate_curie[291712]: 167 167
Dec 02 09:54:09 np0005541913.localdomain systemd[1]: libpod-1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d.scope: Deactivated successfully.
Dec 02 09:54:09 np0005541913.localdomain podman[291695]: 2025-12-02 09:54:09.241711347 +0000 UTC m=+0.143909762 container start 1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_curie, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main)
Dec 02 09:54:09 np0005541913.localdomain podman[291695]: 2025-12-02 09:54:09.242679132 +0000 UTC m=+0.144877527 container attach 1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_curie, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, release=1763362218, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True)
Dec 02 09:54:09 np0005541913.localdomain podman[291695]: 2025-12-02 09:54:09.144703181 +0000 UTC m=+0.046901596 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:09 np0005541913.localdomain podman[291695]: 2025-12-02 09:54:09.245640482 +0000 UTC m=+0.147838947 container died 1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_curie, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 09:54:09 np0005541913.localdomain podman[291717]: 2025-12-02 09:54:09.334372326 +0000 UTC m=+0.084518272 container remove 1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_curie, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, architecture=x86_64, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph)
Dec 02 09:54:09 np0005541913.localdomain systemd[1]: libpod-conmon-1794b9e1e66269ba361d09df5796d617cf6d5ca4ca7bbb4cb1bb71c7ce90724d.scope: Deactivated successfully.
Dec 02 09:54:09 np0005541913.localdomain sudo[291659]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: from='client.34169 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541909.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: Removed label mon from host np0005541909.localdomain
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.3 (monmap changed)...
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 is new leader, mons np0005541911,np0005541910,np0005541914,np0005541913,np0005541912 in quorum (ranks 0,1,2,3,4)
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: monmap epoch 7
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:54:01.619349+0000
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005541912
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: osdmap e86: 6 total, 6 up, 6 in
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: mgrmap e19: np0005541911.adcgiw(active, since 23s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005541911,np0005541910,np0005541913,np0005541912)
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: Cluster is now healthy
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: overall HEALTH_OK
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:09 np0005541913.localdomain sudo[291740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:09 np0005541913.localdomain sudo[291740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:09 np0005541913.localdomain sudo[291740]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:09 np0005541913.localdomain sudo[291758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:09 np0005541913.localdomain sudo[291758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:10 np0005541913.localdomain podman[291793]: 
Dec 02 09:54:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-85ab00339e723e85db99cab163831b14e6f4325c881d2bdb3aff735755c76dc1-merged.mount: Deactivated successfully.
Dec 02 09:54:10 np0005541913.localdomain podman[291793]: 2025-12-02 09:54:10.103256484 +0000 UTC m=+0.078809270 container create f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_meitner, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, vcs-type=git, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container)
Dec 02 09:54:10 np0005541913.localdomain systemd[1]: Started libpod-conmon-f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632.scope.
Dec 02 09:54:10 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:10 np0005541913.localdomain podman[291793]: 2025-12-02 09:54:10.069422309 +0000 UTC m=+0.044975085 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:10 np0005541913.localdomain podman[291793]: 2025-12-02 09:54:10.179521765 +0000 UTC m=+0.155074521 container init f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_meitner, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, RELEASE=main, vcs-type=git, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7)
Dec 02 09:54:10 np0005541913.localdomain busy_meitner[291808]: 167 167
Dec 02 09:54:10 np0005541913.localdomain podman[291793]: 2025-12-02 09:54:10.190088098 +0000 UTC m=+0.165640874 container start f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_meitner, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, release=1763362218, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=)
Dec 02 09:54:10 np0005541913.localdomain podman[291793]: 2025-12-02 09:54:10.193089408 +0000 UTC m=+0.168642184 container attach f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_meitner, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:54:10 np0005541913.localdomain systemd[1]: libpod-f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632.scope: Deactivated successfully.
Dec 02 09:54:10 np0005541913.localdomain podman[291793]: 2025-12-02 09:54:10.196274554 +0000 UTC m=+0.171827360 container died f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_meitner, com.redhat.component=rhceph-container, GIT_CLEAN=True, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4)
Dec 02 09:54:10 np0005541913.localdomain podman[291813]: 2025-12-02 09:54:10.297105102 +0000 UTC m=+0.092628950 container remove f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_meitner, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container)
Dec 02 09:54:10 np0005541913.localdomain systemd[1]: libpod-conmon-f49d98988209681cea4bdd37bb15c8498fc901847d26c824f12751965d18a632.scope: Deactivated successfully.
Dec 02 09:54:10 np0005541913.localdomain sudo[291758]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:10 np0005541913.localdomain sudo[291829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:10 np0005541913.localdomain sudo[291829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:10 np0005541913.localdomain sudo[291829]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:10 np0005541913.localdomain sudo[291847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:10 np0005541913.localdomain sudo[291847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='client.34203 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541909.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: Removed label mgr from host np0005541909.localdomain
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:54:10 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:10 np0005541913.localdomain podman[291881]: 
Dec 02 09:54:10 np0005541913.localdomain podman[291881]: 2025-12-02 09:54:10.97970071 +0000 UTC m=+0.052130946 container create b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bhabha, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: Started libpod-conmon-b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6.scope.
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:11 np0005541913.localdomain podman[291881]: 2025-12-02 09:54:11.020279736 +0000 UTC m=+0.092709972 container init b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bhabha, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4)
Dec 02 09:54:11 np0005541913.localdomain interesting_bhabha[291896]: 167 167
Dec 02 09:54:11 np0005541913.localdomain podman[291881]: 2025-12-02 09:54:11.027997533 +0000 UTC m=+0.100427769 container start b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bhabha, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, name=rhceph, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container)
Dec 02 09:54:11 np0005541913.localdomain podman[291881]: 2025-12-02 09:54:11.028212878 +0000 UTC m=+0.100643114 container attach b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bhabha, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, release=1763362218, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: libpod-b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6.scope: Deactivated successfully.
Dec 02 09:54:11 np0005541913.localdomain podman[291881]: 2025-12-02 09:54:11.030297534 +0000 UTC m=+0.102727750 container died b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bhabha, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:54:11 np0005541913.localdomain podman[291881]: 2025-12-02 09:54:10.956983071 +0000 UTC m=+0.029413347 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6489660dc58f8ccd2802868b9ab7e4f2cd3205b7827f6e9dfd3c157917e2ee6e-merged.mount: Deactivated successfully.
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3fe33b4ffe6dcbd3e9c97c3cc95d8163b0e9aaed797836668f4857574d4acff2-merged.mount: Deactivated successfully.
Dec 02 09:54:11 np0005541913.localdomain podman[291901]: 2025-12-02 09:54:11.174441272 +0000 UTC m=+0.133754041 container remove b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bhabha, ceph=True, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, release=1763362218)
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: libpod-conmon-b659bba46845dbb9bd3b97097830fccaf43c81fbbca8dea2cc9fb4be15ea5de6.scope: Deactivated successfully.
Dec 02 09:54:11 np0005541913.localdomain sudo[291847]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:11 np0005541913.localdomain sudo[291919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:11 np0005541913.localdomain sudo[291919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:11 np0005541913.localdomain sudo[291919]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:11 np0005541913.localdomain sudo[291937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:11 np0005541913.localdomain sudo[291937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: from='client.34184 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541909.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: Removed label _admin from host np0005541909.localdomain
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:54:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:54:11 np0005541913.localdomain podman[291971]: 2025-12-02 09:54:11.861465008 +0000 UTC m=+0.070625111 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec 02 09:54:11 np0005541913.localdomain podman[291979]: 
Dec 02 09:54:11 np0005541913.localdomain podman[291971]: 2025-12-02 09:54:11.902126796 +0000 UTC m=+0.111286869 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125)
Dec 02 09:54:11 np0005541913.localdomain podman[291979]: 2025-12-02 09:54:11.909781831 +0000 UTC m=+0.108081324 container create 7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_chebyshev, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, name=rhceph, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, release=1763362218)
Dec 02 09:54:11 np0005541913.localdomain podman[291979]: 2025-12-02 09:54:11.826592795 +0000 UTC m=+0.024892268 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: Started libpod-conmon-7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66.scope.
Dec 02 09:54:11 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:12 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:12.210 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:12 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:54:12 np0005541913.localdomain podman[291979]: 2025-12-02 09:54:12.236578076 +0000 UTC m=+0.434877569 container init 7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_chebyshev, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., release=1763362218, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, architecture=x86_64, version=7, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:12 np0005541913.localdomain podman[291979]: 2025-12-02 09:54:12.244761935 +0000 UTC m=+0.443061418 container start 7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_chebyshev, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main)
Dec 02 09:54:12 np0005541913.localdomain podman[291979]: 2025-12-02 09:54:12.245043903 +0000 UTC m=+0.443343436 container attach 7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_chebyshev, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph)
Dec 02 09:54:12 np0005541913.localdomain practical_chebyshev[292005]: 167 167
Dec 02 09:54:12 np0005541913.localdomain systemd[1]: libpod-7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66.scope: Deactivated successfully.
Dec 02 09:54:12 np0005541913.localdomain podman[291979]: 2025-12-02 09:54:12.248181567 +0000 UTC m=+0.446481050 container died 7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_chebyshev, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4)
Dec 02 09:54:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-da8646c33ff3f47a07a3651a2f7b8a5cc736c7c37b7ef76d1db0b144f92da39b-merged.mount: Deactivated successfully.
Dec 02 09:54:12 np0005541913.localdomain podman[292010]: 2025-12-02 09:54:12.346166219 +0000 UTC m=+0.093790770 container remove 7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_chebyshev, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, release=1763362218, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:12 np0005541913.localdomain systemd[1]: libpod-conmon-7a03194a013b93c0b68ab81fe12de1d3ac89f930cf7e521ae0f021917b054c66.scope: Deactivated successfully.
Dec 02 09:54:12 np0005541913.localdomain sudo[291937]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:12 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541913 (monmap changed)...
Dec 02 09:54:12 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541913 on np0005541913.localdomain
Dec 02 09:54:12 np0005541913.localdomain ceph-mon[289473]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:12 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:12 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:12 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:12 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:13.377 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:13 np0005541913.localdomain sshd[292027]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:54:13 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:54:13 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:54:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:54:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:13 np0005541913.localdomain sshd[292027]: error: kex_exchange_identification: Connection closed by remote host
Dec 02 09:54:13 np0005541913.localdomain sshd[292027]: Connection closed by 43.251.161.76 port 50958
Dec 02 09:54:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:54:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:54:14 np0005541913.localdomain podman[292028]: 2025-12-02 09:54:14.457104904 +0000 UTC m=+0.085377507 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:54:14 np0005541913.localdomain podman[292028]: 2025-12-02 09:54:14.473140583 +0000 UTC m=+0.101413146 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:54:14 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:54:14 np0005541913.localdomain podman[292029]: 2025-12-02 09:54:14.552826425 +0000 UTC m=+0.181006995 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:54:14 np0005541913.localdomain podman[292029]: 2025-12-02 09:54:14.656918191 +0000 UTC m=+0.285098821 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 09:54:14 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:54:14 np0005541913.localdomain ceph-mon[289473]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:14 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.1 (monmap changed)...
Dec 02 09:54:14 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:54:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:54:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:15 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.4 (monmap changed)...
Dec 02 09:54:15 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:54:15 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:15 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:15 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:54:15 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.103 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.134 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.134 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d541d84-cbb6-4005-b733-d36ade130ce3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.104368', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd15a046-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': 'fbee31c758f4ee472b69df937fb94df26f5c5d16220cd4aca3b3a6d0bc11f596'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.104368', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd15b2ca-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': 'f848ba4ec9efc6e4ac4c07e9fe702ab2261dccf1589cfa52b1121f10916b0656'}]}, 'timestamp': '2025-12-02 09:54:16.135353', '_unique_id': 'efb326e67b7246e79d7407980908e0c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.138 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.138 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0fe9216-5639-4367-be0f-8bf92edef5c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.138200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd163326-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': '5ea084b6f9c90767c4b953f15dddf77a7546bc2b2634678f4e51701440422047'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.138200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd1644ec-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': '534dcacb174fd53e7c19e5845797794de5720e562b515222d2834d848976fda3'}]}, 'timestamp': '2025-12-02 09:54:16.139084', '_unique_id': '7dd55e0ea6ae406aabae1b1a4c397881'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.140 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.141 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.141 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1495ef8-ca7a-47a6-be44-f78f68d65d92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.141298', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd16abe4-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': 'f74fbea2a8b208c3ab8170046dc380aa32aae667c120b41d48a0f7b1df4914b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.141298', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd16bd0a-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': '02efb35f7f0f51a57c348a7bd7737dcf398845240a8ce9529b6541a8630d38c1'}]}, 'timestamp': '2025-12-02 09:54:16.142153', '_unique_id': 'aaf7496edd6e4d77ada19e843715dac6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.143 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.144 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.162 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 13350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '729b53e8-29cc-488b-bba6-d50cf1ecbc90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13350000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:54:16.144262', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'dd19d972-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.381069609, 'message_signature': '0f2fa06e9c30b2bfe4a4d16ec85e5aa4c9a757b44dc2daca39470ab4eea0eab1'}]}, 'timestamp': '2025-12-02 09:54:16.162558', '_unique_id': 'aacf3e2b931e4c1b8a32e3eee041ede1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.176 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.177 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15f79ac5-d995-4cb1-8279-03310f8b3a5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.165058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd1c116a-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.384150372, 'message_signature': '4325d9be482660e60d75eec8c6dd740e660f6fc7cae60486e85fc9a60e112192'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.165058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd1c227c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.384150372, 'message_signature': 'd9892791bf75011fa1762fa8e98043854afb9a903d28732ef838ffcdb9e96e67'}]}, 'timestamp': '2025-12-02 09:54:16.177524', '_unique_id': '0a5d3d2ea8fe4e52863dc2574c4d4e37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.183 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3390e43d-797a-4d5e-98ee-7bf9db9f9c71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.180001', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd1d12ae-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': '076942a880ef583bc3c24661aa5c7341ab5e3e7a96eedf880beeb273433481da'}]}, 'timestamp': '2025-12-02 09:54:16.183872', '_unique_id': 'cd934a25a7924d29bf35adbb974313a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.186 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.187 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13626cc6-c422-45dc-970a-44334c0554fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.186581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd1d96d4-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.384150372, 'message_signature': '4165fc617518c096c2c4629bf32131ffab171b41d8e559f6aa7c4aa1db7ca629'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.186581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd1da962-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.384150372, 'message_signature': 'add70e4876cdb53a9c83939bd4a70ea6585ebb7105e161b66e159a7aedf9d8ed'}]}, 'timestamp': '2025-12-02 09:54:16.187529', '_unique_id': '88152f438db84adb8df4cd8e37667336'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.190 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.190 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ee60d88-6422-4498-8ab1-1eb7cd76bd2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.190058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd1e1d5c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': 'ea8cb696c0d6abbea6ddba4bad5d7c937b955f117e944baef2befcf65ba4d3b5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.190058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd1e2f18-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': '94814e785e04164fd96cad3d0b0d7df20c8d60f0b7570c699e092d89dacda2a3'}]}, 'timestamp': '2025-12-02 09:54:16.190957', '_unique_id': 'f0a2a7d886d844b78f63c8687dd59330'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.193 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb6ef30f-f748-48e4-9981-73d6e93b8847', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.193648', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd1eaa42-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': 'f303330f5755819618feb4fbdb87e5b6fa5eec6544ee1f5034f340a1a74d6cc1'}]}, 'timestamp': '2025-12-02 09:54:16.194155', '_unique_id': '8d8c9253369f4f5e9d21cf05a5edbb1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d067ea4-5574-4223-9c17-51ac15e87650', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.196426', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd1f1838-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': '8c2c17917163790c91a86fcba300a3250fdd5361dc8558ba857f6e365e69ed99'}]}, 'timestamp': '2025-12-02 09:54:16.197091', '_unique_id': '5b525bf103aa41e787972e68e044fcb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.199 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f43e17d-659e-4ed3-864b-f36d20dd6134', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.199979', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd1fa56e-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': 'b280428baa3873b246e429f363dc0815c2640a90d1c743da694b0e3b1a7260dd'}]}, 'timestamp': '2025-12-02 09:54:16.200803', '_unique_id': '78e8a6fc6446490dbfa1c3ddc11c875a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9afa8b90-2c50-424d-9fce-536850cdf53a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.203357', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd20270a-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': 'ed2ce932eef102ae9de67cd8bdfecec1135e53089f09b8c3c12bf611581d649f'}]}, 'timestamp': '2025-12-02 09:54:16.203895', '_unique_id': 'bef7bdc7448d4e28951f392885f8ff7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '237a5e49-8b50-4347-8b87-c7923e77814e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:54:16.206518', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'dd20a2b6-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.381069609, 'message_signature': '033be59db4cd5b4a7b91593f716b3cf26753d9e7b5c5203b35e1c73cbd04ee3d'}]}, 'timestamp': '2025-12-02 09:54:16.207069', '_unique_id': '994fae2f1b1c4317b0dca9ba00f7d5a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.209 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a49b0195-c36d-4945-a11f-dcb00040a208', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.209686', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd211c96-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': 'a11d699c67e7a4ce2a0a15376adc91481f1473ab61c672585e22558f04163a3b'}]}, 'timestamp': '2025-12-02 09:54:16.210188', '_unique_id': '9f6d1b351c894858ae2f0caf76a2e9b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '158432e9-67ec-4f48-89f6-846d4edc11c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.212384', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd218802-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': '71bf494cea669b465acf07eaa57f06468a017e7f0f4f8d352abfa9af972f6ae8'}]}, 'timestamp': '2025-12-02 09:54:16.212919', '_unique_id': '412ca99a79864c7f8f9207fd0287f12f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c6bbe2f-4d50-409e-9598-2167cc8d5b68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.215077', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd21ee78-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': 'd3f969ec8045cae1e9ef82447dddb45031d504b38f4db05009cf7ccc4fa93c2d'}]}, 'timestamp': '2025-12-02 09:54:16.215538', '_unique_id': '5ea9633e18f748cc92c24ec2e8c224c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.218 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12ea32dc-3232-40d5-8d08-1766ffbd3f20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.217694', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd2251a6-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.384150372, 'message_signature': '4faa72146f75585fa1629b0cc04b97826efbb2af27f36fdda859fca0bece26b8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.217694', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd225e76-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.384150372, 'message_signature': '7a218dcf9565ba7ecc5d3642347d170aabc910c86e2f3f294365cc83d02d0326'}]}, 'timestamp': '2025-12-02 09:54:16.218343', '_unique_id': 'a1f784cdbd9b4845a9b6b8e685be3003'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a1a2fb1-d1b1-41f6-b21f-ae3180107ff9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.220141', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd22b268-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': '09912e7c0583b8f93cca994853ab4c407891fa4dfa842e30e4e9caf07798af0b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.220141', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd22c1a4-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': '67da88bf3525d9833eaefa82939780f20ad0700feaf82a8d802a4b3e2943eeb4'}]}, 'timestamp': '2025-12-02 09:54:16.220900', '_unique_id': '244dec082223405091908ab8e6cd3f4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.222 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f08a63e1-90eb-443a-9313-06966e7ccfce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.222892', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd231dde-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': '9304f5cff1efaf722b471f51fca18bbdf75a64dc93ce27079f5453fa16572b05'}]}, 'timestamp': '2025-12-02 09:54:16.223268', '_unique_id': '229dfa199bdd4538b6add847140c20cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.224 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1318373b-ef4c-4a9e-854a-dc247ad34dd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:54:16.224861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd236906-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': '8f501cc8a627da50cc7953b174e439b7d1b5bd00765b75c00cbd4972e20bc0ba'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:54:16.224861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd237298-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.323441607, 'message_signature': '5b4b814705237155fb16973f471856db1691c33737dd4eb490f5f975e68c784e'}]}, 'timestamp': '2025-12-02 09:54:16.225367', '_unique_id': 'b520f126090340a5a84aac083b539e38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.226 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b1e88bd-68b9-4965-b8b4-ea90da1ccb1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:54:16.226729', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'dd23b21c-cf64-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11418.399084592, 'message_signature': '7f849aa259040d83fe5c47dcbae7fb61ae47e189b41df2e70e1f68e2b8094ddd'}]}, 'timestamp': '2025-12-02 09:54:16.227011', '_unique_id': '080921cc203c4f01bf303f6cf6bb1506'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:54:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:54:16.228 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:54:16 np0005541913.localdomain ceph-mon[289473]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:16 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:54:16 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:54:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:54:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:54:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:17 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:17.250 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:17 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:54:17 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:54:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:54:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:54:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:18.428 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:18 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541914 (monmap changed)...
Dec 02 09:54:18 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:54:18 np0005541913.localdomain ceph-mon[289473]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:18 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:18 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:19 np0005541913.localdomain sudo[292074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:54:19 np0005541913.localdomain sudo[292074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:19 np0005541913.localdomain sudo[292074]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:19 np0005541913.localdomain sudo[292092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:54:19 np0005541913.localdomain sudo[292092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:19 np0005541913.localdomain sudo[292092]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:19 np0005541913.localdomain sudo[292110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:54:19 np0005541913.localdomain sudo[292110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:19 np0005541913.localdomain sudo[292110]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:19 np0005541913.localdomain sudo[292128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:19 np0005541913.localdomain sudo[292128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:19 np0005541913.localdomain sudo[292128]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:19 np0005541913.localdomain sudo[292146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:54:19 np0005541913.localdomain sudo[292146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:19 np0005541913.localdomain sudo[292146]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:19 np0005541913.localdomain sudo[292180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:54:19 np0005541913.localdomain sudo[292180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:19 np0005541913.localdomain sudo[292180]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:19 np0005541913.localdomain sudo[292198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:54:19 np0005541913.localdomain sudo[292198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:19 np0005541913.localdomain sudo[292198]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain sudo[292216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:54:20 np0005541913.localdomain sudo[292216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292216]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain sudo[292234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:54:20 np0005541913.localdomain sudo[292234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292234]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain sudo[292252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:54:20 np0005541913.localdomain sudo[292252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292252]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain sudo[292270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:54:20 np0005541913.localdomain sudo[292270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292270]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain sudo[292288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:20 np0005541913.localdomain sudo[292288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292288]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: Removing np0005541909.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: Removing np0005541909.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: Removing np0005541909.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:20 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:20 np0005541913.localdomain sudo[292306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:54:20 np0005541913.localdomain sudo[292306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292306]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain sudo[292340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:54:20 np0005541913.localdomain sudo[292340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292340]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain sudo[292358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:54:20 np0005541913.localdomain sudo[292358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292358]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:20 np0005541913.localdomain sudo[292376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:20 np0005541913.localdomain sudo[292376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:20 np0005541913.localdomain sudo[292376]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:21 np0005541913.localdomain ceph-mon[289473]: Removing daemon mgr.np0005541909.kfesnk from np0005541909.localdomain -- ports [9283, 8765]
Dec 02 09:54:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:22.273 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:22 np0005541913.localdomain ceph-mon[289473]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:22 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:23 np0005541913.localdomain sudo[292394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:54:23 np0005541913.localdomain sudo[292394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:23 np0005541913.localdomain sudo[292394]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:23.472 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:23 np0005541913.localdomain ceph-mon[289473]: from='client.26645 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005541909.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:54:23 np0005541913.localdomain ceph-mon[289473]: Added label _no_schedule to host np0005541909.localdomain
Dec 02 09:54:23 np0005541913.localdomain ceph-mon[289473]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005541909.localdomain
Dec 02 09:54:23 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth rm", "entity": "mgr.np0005541909.kfesnk"} : dispatch
Dec 02 09:54:23 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005541909.kfesnk"}]': finished
Dec 02 09:54:23 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:23 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:23 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:54:24 np0005541913.localdomain ceph-mon[289473]: Removing key for mgr.np0005541909.kfesnk
Dec 02 09:54:24 np0005541913.localdomain ceph-mon[289473]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:24 np0005541913.localdomain ceph-mon[289473]: from='client.26801 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005541909.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:54:24 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:24 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:24 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:24 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:54:24 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:25 np0005541913.localdomain sudo[292412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:54:25 np0005541913.localdomain sudo[292412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:54:25 np0005541913.localdomain sudo[292412]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:25 np0005541913.localdomain podman[292429]: 2025-12-02 09:54:25.557084375 +0000 UTC m=+0.078781199 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 02 09:54:25 np0005541913.localdomain podman[292429]: 2025-12-02 09:54:25.568946763 +0000 UTC m=+0.090643587 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 09:54:25 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: Removing daemon crash.np0005541909 from np0005541909.localdomain -- ports []
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541909.localdomain"} : dispatch
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541909.localdomain"}]': finished
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth rm", "entity": "client.crash.np0005541909.localdomain"} : dispatch
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005541909.localdomain"}]': finished
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:54:25 np0005541913.localdomain sudo[292450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:54:25 np0005541913.localdomain sudo[292450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:25 np0005541913.localdomain sudo[292450]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: from='client.34182 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005541909.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: Removed host np0005541909.localdomain
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: Removing key for client.crash.np0005541909.localdomain
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541910.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:54:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:27.275 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541910 (monmap changed)...
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541910 on np0005541910.localdomain
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541910 (monmap changed)...
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541910 on np0005541910.localdomain
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541910.kzipdo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:54:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:28.474 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:28 np0005541913.localdomain ceph-mon[289473]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:28 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541910.kzipdo (monmap changed)...
Dec 02 09:54:28 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541910.kzipdo on np0005541910.localdomain
Dec 02 09:54:28 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541911 (monmap changed)...
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541911.adcgiw (monmap changed)...
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541911.adcgiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:29 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541911.adcgiw on np0005541911.localdomain
Dec 02 09:54:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:54:31 np0005541913.localdomain systemd[1]: tmp-crun.lbXHEm.mount: Deactivated successfully.
Dec 02 09:54:31 np0005541913.localdomain podman[292468]: 2025-12-02 09:54:31.455310118 +0000 UTC m=+0.095209359 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 02 09:54:31 np0005541913.localdomain podman[292468]: 2025-12-02 09:54:31.484970861 +0000 UTC m=+0.124870142 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:54:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:31 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541911 (monmap changed)...
Dec 02 09:54:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:31 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541911 on np0005541911.localdomain
Dec 02 09:54:31 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:32.279 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:54:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:54:33 np0005541913.localdomain podman[292486]: 2025-12-02 09:54:33.427646102 +0000 UTC m=+0.070421196 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, config_id=edpm, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 09:54:33 np0005541913.localdomain podman[292486]: 2025-12-02 09:54:33.441729208 +0000 UTC m=+0.084504252 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Dec 02 09:54:33 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:54:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:33.477 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:33 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.2 (monmap changed)...
Dec 02 09:54:33 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:54:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:54:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:54:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:54:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:54:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:54:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:54:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: from='client.34233 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: Saving service mon spec with placement label:mon
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.5 (monmap changed)...
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:54:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: from='client.34187 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541912", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:35 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x5645037311e0 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: paxos.3).electionLogic(32) init, last seen epoch 32
Dec 02 09:54:35 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:54:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:54:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:54:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:54:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:54:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18704 "" "Go-http-client/1.1"
Dec 02 09:54:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:54:36 np0005541913.localdomain podman[292506]: 2025-12-02 09:54:36.451882486 +0000 UTC m=+0.086593368 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:54:36 np0005541913.localdomain podman[292506]: 2025-12-02 09:54:36.464153255 +0000 UTC m=+0.098864087 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:54:36 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:54:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:37.281 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:38.483 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:40 np0005541913.localdomain ceph-mon[289473]: paxos.3).electionLogic(33) init, last seen epoch 33, mid-election, bumping
Dec 02 09:54:40 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:40 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:40 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:40 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:40 np0005541913.localdomain sudo[292529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:40 np0005541913.localdomain sudo[292529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:40 np0005541913.localdomain sudo[292529]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:40 np0005541913.localdomain sudo[292547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:40 np0005541913.localdomain sudo[292547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:41 np0005541913.localdomain podman[292583]: 
Dec 02 09:54:41 np0005541913.localdomain podman[292583]: 2025-12-02 09:54:41.308498201 +0000 UTC m=+0.056097862 container create af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_dirac, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f.scope.
Dec 02 09:54:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:41 np0005541913.localdomain podman[292583]: 2025-12-02 09:54:41.283237665 +0000 UTC m=+0.030837306 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:41 np0005541913.localdomain podman[292583]: 2025-12-02 09:54:41.383706974 +0000 UTC m=+0.131306595 container init af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_dirac, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, version=7, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, GIT_BRANCH=main, architecture=x86_64, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container)
Dec 02 09:54:41 np0005541913.localdomain podman[292583]: 2025-12-02 09:54:41.396175247 +0000 UTC m=+0.143774878 container start af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_dirac, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:54:41 np0005541913.localdomain podman[292583]: 2025-12-02 09:54:41.396534037 +0000 UTC m=+0.144133678 container attach af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_dirac, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:41 np0005541913.localdomain suspicious_dirac[292598]: 167 167
Dec 02 09:54:41 np0005541913.localdomain systemd[1]: libpod-af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f.scope: Deactivated successfully.
Dec 02 09:54:41 np0005541913.localdomain podman[292583]: 2025-12-02 09:54:41.4015134 +0000 UTC m=+0.149113141 container died af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_dirac, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, name=rhceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, release=1763362218, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:54:41 np0005541913.localdomain podman[292603]: 2025-12-02 09:54:41.50576787 +0000 UTC m=+0.091826428 container remove af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_dirac, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=1763362218, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, com.redhat.component=rhceph-container, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main)
Dec 02 09:54:41 np0005541913.localdomain systemd[1]: libpod-conmon-af67c11c156a0e3bae599c7f9492e1928a7b153afce1d33cf9f60cc4be751b3f.scope: Deactivated successfully.
Dec 02 09:54:41 np0005541913.localdomain sudo[292547]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:41 np0005541913.localdomain sudo[292621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:41 np0005541913.localdomain sudo[292621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:41 np0005541913.localdomain sudo[292621]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913 calling monitor election
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: monmap epoch 8
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:54:35.609159+0000
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: osdmap e86: 6 total, 6 up, 6 in
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mgrmap e19: np0005541911.adcgiw(active, since 55s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: Health check failed: 1/4 mons down, quorum np0005541911,np0005541910,np0005541914 (MON_DOWN)
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: overall HEALTH_OK
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 is new leader, mons np0005541911,np0005541910,np0005541914,np0005541913 in quorum (ranks 0,1,2,3)
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: monmap epoch 8
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:54:35.609159+0000
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: osdmap e86: 6 total, 6 up, 6 in
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: mgrmap e19: np0005541911.adcgiw(active, since 55s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: Health check cleared: MON_DOWN (was: 1/4 mons down, quorum np0005541911,np0005541910,np0005541914)
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: Cluster is now healthy
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: overall HEALTH_OK
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:54:41 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:41 np0005541913.localdomain sudo[292639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:41 np0005541913.localdomain sudo[292639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:42 np0005541913.localdomain podman[292674]: 
Dec 02 09:54:42 np0005541913.localdomain podman[292674]: 2025-12-02 09:54:42.213815499 +0000 UTC m=+0.083440674 container create 51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_allen, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, distribution-scope=public, io.buildah.version=1.41.4, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: Started libpod-conmon-51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f.scope.
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:42 np0005541913.localdomain podman[292674]: 2025-12-02 09:54:42.178754961 +0000 UTC m=+0.048380176 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:42 np0005541913.localdomain podman[292674]: 2025-12-02 09:54:42.289821333 +0000 UTC m=+0.159446478 container init 51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_allen, RELEASE=main, CEPH_POINT_RELEASE=, name=rhceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, version=7, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:54:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:42.316 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: tmp-crun.O35XT5.mount: Deactivated successfully.
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cc0638aa22702c8e0b8ba9ee737d08ad71547f53218d7fc3a677eb6fe86b53a2-merged.mount: Deactivated successfully.
Dec 02 09:54:42 np0005541913.localdomain podman[292674]: 2025-12-02 09:54:42.328722074 +0000 UTC m=+0.198347209 container start 51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_allen, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7)
Dec 02 09:54:42 np0005541913.localdomain podman[292674]: 2025-12-02 09:54:42.329256028 +0000 UTC m=+0.198881163 container attach 51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_allen, architecture=x86_64, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, RELEASE=main, name=rhceph, version=7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=1763362218, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git)
Dec 02 09:54:42 np0005541913.localdomain goofy_allen[292689]: 167 167
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: libpod-51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f.scope: Deactivated successfully.
Dec 02 09:54:42 np0005541913.localdomain podman[292674]: 2025-12-02 09:54:42.331648393 +0000 UTC m=+0.201273558 container died 51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_allen, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 02 09:54:42 np0005541913.localdomain podman[292690]: 2025-12-02 09:54:42.343827498 +0000 UTC m=+0.070954609 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:54:42 np0005541913.localdomain podman[292690]: 2025-12-02 09:54:42.364272386 +0000 UTC m=+0.091399527 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-49966b7a1e1c663a518f169632b4de751094d41633e12358cfad7afd9a638765-merged.mount: Deactivated successfully.
Dec 02 09:54:42 np0005541913.localdomain podman[292708]: 2025-12-02 09:54:42.457851 +0000 UTC m=+0.116809287 container remove 51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_allen, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, release=1763362218, io.openshift.tags=rhceph ceph)
Dec 02 09:54:42 np0005541913.localdomain systemd[1]: libpod-conmon-51ffb19b36e8ea39570ffcc2746ea74da6b8a158a6a62d09a7110fd228df694f.scope: Deactivated successfully.
Dec 02 09:54:42 np0005541913.localdomain sudo[292639]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.638466) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669282638571, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 3170, "num_deletes": 517, "total_data_size": 9133097, "memory_usage": 9700784, "flush_reason": "Manual Compaction"}
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669282678722, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5540365, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10020, "largest_seqno": 13185, "table_properties": {"data_size": 5527476, "index_size": 7666, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4165, "raw_key_size": 35681, "raw_average_key_size": 21, "raw_value_size": 5497233, "raw_average_value_size": 3327, "num_data_blocks": 331, "num_entries": 1652, "num_filter_entries": 1652, "num_deletions": 516, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669219, "oldest_key_time": 1764669219, "file_creation_time": 1764669282, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 40420 microseconds, and 9035 cpu microseconds.
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.678877) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5540365 bytes OK
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.678945) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.681063) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.681099) EVENT_LOG_v1 {"time_micros": 1764669282681091, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.681124) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 9116858, prev total WAL file size 9165647, number of live WAL files 2.
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.682790) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(5410KB)], [15(8874KB)]
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669282682836, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 14627416, "oldest_snapshot_seqno": -1}
Dec 02 09:54:42 np0005541913.localdomain sudo[292734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:42 np0005541913.localdomain sudo[292734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:42 np0005541913.localdomain sudo[292734]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9819 keys, 12520209 bytes, temperature: kUnknown
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669282773472, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12520209, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12463448, "index_size": 31124, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24581, "raw_key_size": 261997, "raw_average_key_size": 26, "raw_value_size": 12294357, "raw_average_value_size": 1252, "num_data_blocks": 1189, "num_entries": 9819, "num_filter_entries": 9819, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669201, "oldest_key_time": 0, "file_creation_time": 1764669282, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.773881) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12520209 bytes
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.775552) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.2 rd, 138.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.3, 8.7 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(4.9) write-amplify(2.3) OK, records in: 10901, records dropped: 1082 output_compression: NoCompression
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.775588) EVENT_LOG_v1 {"time_micros": 1764669282775572, "job": 6, "event": "compaction_finished", "compaction_time_micros": 90748, "compaction_time_cpu_micros": 27991, "output_level": 6, "num_output_files": 1, "total_output_size": 12520209, "num_input_records": 10901, "num_output_records": 9819, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669282776702, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669282778281, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.682692) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.778316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.778323) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.778326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.778329) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:42.778331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.0 (monmap changed)...
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:54:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:42 np0005541913.localdomain sudo[292752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:42 np0005541913.localdomain sudo[292752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:42.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:43 np0005541913.localdomain podman[292786]: 
Dec 02 09:54:43 np0005541913.localdomain podman[292786]: 2025-12-02 09:54:43.248621833 +0000 UTC m=+0.052140116 container create fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_raman, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, name=rhceph, version=7)
Dec 02 09:54:43 np0005541913.localdomain systemd[1]: Started libpod-conmon-fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b.scope.
Dec 02 09:54:43 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:43 np0005541913.localdomain podman[292786]: 2025-12-02 09:54:43.299930796 +0000 UTC m=+0.103449089 container init fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_raman, distribution-scope=public, io.buildah.version=1.41.4, release=1763362218, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:54:43 np0005541913.localdomain podman[292786]: 2025-12-02 09:54:43.307285333 +0000 UTC m=+0.110803616 container start fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_raman, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, RELEASE=main, release=1763362218, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:54:43 np0005541913.localdomain podman[292786]: 2025-12-02 09:54:43.307469997 +0000 UTC m=+0.110988310 container attach fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_raman, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git)
Dec 02 09:54:43 np0005541913.localdomain sad_raman[292801]: 167 167
Dec 02 09:54:43 np0005541913.localdomain systemd[1]: libpod-fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b.scope: Deactivated successfully.
Dec 02 09:54:43 np0005541913.localdomain podman[292786]: 2025-12-02 09:54:43.311794414 +0000 UTC m=+0.115312717 container died fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_raman, io.buildah.version=1.41.4, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, name=rhceph, GIT_CLEAN=True)
Dec 02 09:54:43 np0005541913.localdomain podman[292786]: 2025-12-02 09:54:43.223464199 +0000 UTC m=+0.026982532 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-975f9be749d384e67e5da09162bbe797c10c50b15982d4c8ea024a86b856ed2d-merged.mount: Deactivated successfully.
Dec 02 09:54:43 np0005541913.localdomain podman[292806]: 2025-12-02 09:54:43.40431353 +0000 UTC m=+0.078583434 container remove fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_raman, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, RELEASE=main, build-date=2025-11-26T19:44:28Z, version=7, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, release=1763362218, ceph=True, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:43 np0005541913.localdomain systemd[1]: libpod-conmon-fe97f2c7084d0c4fed940777dcf6cdec965ccb622a532637fb15996a7af8d63b.scope: Deactivated successfully.
Dec 02 09:54:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:43.520 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:43 np0005541913.localdomain sudo[292752]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:43 np0005541913.localdomain sudo[292830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:43 np0005541913.localdomain sudo[292830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:43 np0005541913.localdomain sudo[292830]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:43 np0005541913.localdomain sudo[292848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:43 np0005541913.localdomain sudo[292848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:43 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.3 (monmap changed)...
Dec 02 09:54:43 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:54:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:54:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:44 np0005541913.localdomain podman[292883]: 
Dec 02 09:54:44 np0005541913.localdomain podman[292883]: 2025-12-02 09:54:44.03991807 +0000 UTC m=+0.045328544 container create 6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_fermat, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, io.openshift.expose-services=, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.buildah.version=1.41.4, vendor=Red Hat, Inc., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: Started libpod-conmon-6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25.scope.
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:44 np0005541913.localdomain podman[292883]: 2025-12-02 09:54:44.103773589 +0000 UTC m=+0.109184033 container init 6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_fermat, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, release=1763362218)
Dec 02 09:54:44 np0005541913.localdomain podman[292883]: 2025-12-02 09:54:44.113217921 +0000 UTC m=+0.118628395 container start 6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_fermat, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, version=7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git)
Dec 02 09:54:44 np0005541913.localdomain podman[292883]: 2025-12-02 09:54:44.113443707 +0000 UTC m=+0.118854171 container attach 6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_fermat, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, release=1763362218, io.buildah.version=1.41.4, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 02 09:54:44 np0005541913.localdomain naughty_fermat[292898]: 167 167
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: libpod-6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25.scope: Deactivated successfully.
Dec 02 09:54:44 np0005541913.localdomain podman[292883]: 2025-12-02 09:54:44.116957791 +0000 UTC m=+0.122368265 container died 6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_fermat, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, distribution-scope=public, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, CEPH_POINT_RELEASE=, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4)
Dec 02 09:54:44 np0005541913.localdomain podman[292883]: 2025-12-02 09:54:44.020357306 +0000 UTC m=+0.025767740 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:44 np0005541913.localdomain podman[292903]: 2025-12-02 09:54:44.18939713 +0000 UTC m=+0.060737466 container remove 6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_fermat, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4)
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: libpod-conmon-6a338a69d54bc7cddaff20e9af58ca43cafd8727d73431bdcf10a5b996e5ea25.scope: Deactivated successfully.
Dec 02 09:54:44 np0005541913.localdomain sudo[292848]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-33c47d5d8abfd6bcc148330ef95dfbec3ad7883f8d9c96a56ac7fc6b10227ee0-merged.mount: Deactivated successfully.
Dec 02 09:54:44 np0005541913.localdomain sudo[292920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:44 np0005541913.localdomain sudo[292920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:44 np0005541913.localdomain sudo[292920]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:44 np0005541913.localdomain sudo[292938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:44 np0005541913.localdomain sudo[292938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:54:44 np0005541913.localdomain ceph-mon[289473]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:44 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:54:44 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:54:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:54:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:54:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:44 np0005541913.localdomain podman[292972]: 2025-12-02 09:54:44.816719209 +0000 UTC m=+0.060067339 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true)
Dec 02 09:54:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:44.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:44.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:54:44 np0005541913.localdomain podman[292987]: 
Dec 02 09:54:44 np0005541913.localdomain podman[292987]: 2025-12-02 09:54:44.83620884 +0000 UTC m=+0.060968833 container create 30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_elion, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True)
Dec 02 09:54:44 np0005541913.localdomain podman[292987]: 2025-12-02 09:54:44.812467485 +0000 UTC m=+0.037227488 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: Started libpod-conmon-30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676.scope.
Dec 02 09:54:44 np0005541913.localdomain podman[292971]: 2025-12-02 09:54:44.927304229 +0000 UTC m=+0.168051139 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:54:44 np0005541913.localdomain podman[292972]: 2025-12-02 09:54:44.934808129 +0000 UTC m=+0.178156239 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:54:44 np0005541913.localdomain podman[292987]: 2025-12-02 09:54:44.946040139 +0000 UTC m=+0.170800172 container init 30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_elion, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:54:44 np0005541913.localdomain podman[292987]: 2025-12-02 09:54:44.954617809 +0000 UTC m=+0.179377802 container start 30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_elion, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, ceph=True, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:54:44 np0005541913.localdomain podman[292987]: 2025-12-02 09:54:44.955657967 +0000 UTC m=+0.180417970 container attach 30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_elion, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container)
Dec 02 09:54:44 np0005541913.localdomain fervent_elion[293033]: 167 167
Dec 02 09:54:44 np0005541913.localdomain systemd[1]: libpod-30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676.scope: Deactivated successfully.
Dec 02 09:54:44 np0005541913.localdomain podman[292987]: 2025-12-02 09:54:44.957875537 +0000 UTC m=+0.182635530 container died 30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_elion, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, RELEASE=main, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:54:44 np0005541913.localdomain podman[292971]: 2025-12-02 09:54:44.991053854 +0000 UTC m=+0.231800814 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:54:45 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:54:45 np0005541913.localdomain podman[293041]: 2025-12-02 09:54:45.077357214 +0000 UTC m=+0.110549899 container remove 30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_elion, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, ceph=True, release=1763362218, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph)
Dec 02 09:54:45 np0005541913.localdomain systemd[1]: libpod-conmon-30bcf210668a25bf48a22e0a56bf0d970964d1b53c572fc4019a1808cab36676.scope: Deactivated successfully.
Dec 02 09:54:45 np0005541913.localdomain sudo[292938]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-43529ba9346b7711928e0610390f5fa37417762ac54993ba1b99fc5a1c13ea7f-merged.mount: Deactivated successfully.
Dec 02 09:54:45 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:54:45 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:54:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:45.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:45.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:54:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:45.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:54:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:46.585 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:54:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:46.585 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:54:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:46.585 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:54:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:46.585 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:54:46 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:54:46 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:54:46 np0005541913.localdomain ceph-mon[289473]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:54:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:46.938 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:54:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.140 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.141 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.141 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.142 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.376 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:54:47 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.1 (monmap changed)...
Dec 02 09:54:47 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:54:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:54:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.936 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.937 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.937 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.937 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:54:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:47.938 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2742484200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.324 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.387s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.403 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.403 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.582 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.691 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.693 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11812MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.693 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.693 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.4 (monmap changed)...
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.107:0/2742484200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:54:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.914 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.915 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.916 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:54:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:48.949 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.315458) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669289315526, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 520, "num_deletes": 256, "total_data_size": 621723, "memory_usage": 632808, "flush_reason": "Manual Compaction"}
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669289320353, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 358219, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13190, "largest_seqno": 13705, "table_properties": {"data_size": 355267, "index_size": 935, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7391, "raw_average_key_size": 19, "raw_value_size": 349101, "raw_average_value_size": 921, "num_data_blocks": 39, "num_entries": 379, "num_filter_entries": 379, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669282, "oldest_key_time": 1764669282, "file_creation_time": 1764669289, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 4939 microseconds, and 1866 cpu microseconds.
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.320405) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 358219 bytes OK
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.320432) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.322094) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.322119) EVENT_LOG_v1 {"time_micros": 1764669289322112, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.322144) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 618499, prev total WAL file size 618499, number of live WAL files 2.
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.322813) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353136' seq:72057594037927935, type:22 .. '6C6F676D0033373638' seq:0, type:0; will stop at (end)
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(349KB)], [18(11MB)]
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669289322922, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 12878428, "oldest_snapshot_seqno": -1}
Dec 02 09:54:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:49.387 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:54:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:49.395 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 9668 keys, 12768934 bytes, temperature: kUnknown
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669289424760, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 12768934, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12712157, "index_size": 31524, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24197, "raw_key_size": 260004, "raw_average_key_size": 26, "raw_value_size": 12544692, "raw_average_value_size": 1297, "num_data_blocks": 1204, "num_entries": 9668, "num_filter_entries": 9668, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669201, "oldest_key_time": 0, "file_creation_time": 1764669289, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.425197) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 12768934 bytes
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.427190) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.3 rd, 125.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 11.9 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(71.6) write-amplify(35.6) OK, records in: 10198, records dropped: 530 output_compression: NoCompression
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.427243) EVENT_LOG_v1 {"time_micros": 1764669289427220, "job": 8, "event": "compaction_finished", "compaction_time_micros": 101949, "compaction_time_cpu_micros": 39892, "output_level": 6, "num_output_files": 1, "total_output_size": 12768934, "num_input_records": 10198, "num_output_records": 9668, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669289427665, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669289431116, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.322723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.431239) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.431248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.431252) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.431256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:54:49.431259) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:54:49 np0005541913.localdomain sudo[293103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:54:49 np0005541913.localdomain sudo[293103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:49 np0005541913.localdomain sudo[293103]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:49 np0005541913.localdomain sudo[293121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:54:49 np0005541913.localdomain sudo[293121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.107:0/312733303' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:50.035 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:54:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:50.038 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:54:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:50.039 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:54:50 np0005541913.localdomain sudo[293121]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:51 np0005541913.localdomain sudo[293171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:54:51 np0005541913.localdomain sudo[293171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:51 np0005541913.localdomain sudo[293171]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:51 np0005541913.localdomain sudo[293189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:54:51 np0005541913.localdomain sudo[293189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:51 np0005541913.localdomain sudo[293189]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:51 np0005541913.localdomain sudo[293207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:54:51 np0005541913.localdomain sudo[293207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:51 np0005541913.localdomain sudo[293207]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.108:0/955695829' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:54:51 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.108:0/3012433288' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:54:51 np0005541913.localdomain sudo[293225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:51 np0005541913.localdomain sudo[293225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:51 np0005541913.localdomain sudo[293225]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:54:52 np0005541913.localdomain sudo[293243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:54:52 np0005541913.localdomain sudo[293243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293243]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:54:52 np0005541913.localdomain sudo[293277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293277]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:54:52 np0005541913.localdomain sudo[293295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293295]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:54:52 np0005541913.localdomain sudo[293313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293313]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:54:52 np0005541913.localdomain sudo[293331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293331]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:52.380 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:52 np0005541913.localdomain sudo[293349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:54:52 np0005541913.localdomain sudo[293349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293349]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:54:52 np0005541913.localdomain sudo[293367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293367]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:54:52 np0005541913.localdomain sudo[293385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293385]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:54:52 np0005541913.localdomain sudo[293403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293403]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:54:52 np0005541913.localdomain sudo[293437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293437]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:54:52 np0005541913.localdomain sudo[293455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293455]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:52 np0005541913.localdomain sudo[293473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:52 np0005541913.localdomain sudo[293473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:52 np0005541913.localdomain sudo[293473]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='client.26696 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005541912.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: Deploying daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:54:53 np0005541913.localdomain sudo[293491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:54:53 np0005541913.localdomain sudo[293491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:54:53 np0005541913.localdomain sudo[293491]: pam_unix(sudo:session): session closed for user root
Dec 02 09:54:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:53.585 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:54 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541910 (monmap changed)...
Dec 02 09:54:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541910.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:54:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:54 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541910 on np0005541910.localdomain
Dec 02 09:54:54 np0005541913.localdomain ceph-mon[289473]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:54:54 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.106:0/2775911751' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:54:54 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Dec 02 09:54:54 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.106:0/2644624687' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541910.kzipdo (monmap changed)...
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541910.kzipdo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541910.kzipdo on np0005541910.localdomain
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Dec 02 09:54:55 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x564503731600 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: paxos.3).electionLogic(38) init, last seen epoch 38
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:55 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:54:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:54:56 np0005541913.localdomain podman[293509]: 2025-12-02 09:54:56.433485802 +0000 UTC m=+0.073004514 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 09:54:56 np0005541913.localdomain podman[293509]: 2025-12-02 09:54:56.445022931 +0000 UTC m=+0.084541683 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 09:54:56 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:54:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:57.382 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:54:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:54:58.588 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541911.adcgiw (monmap changed)...
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541911.adcgiw on np0005541911.localdomain
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913 calling monitor election
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541910"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 is new leader, mons np0005541911,np0005541910,np0005541914,np0005541913 in quorum (ranks 0,1,2,3)
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: monmap epoch 9
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:54:55.352641+0000
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: osdmap e86: 6 total, 6 up, 6 in
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: mgrmap e19: np0005541911.adcgiw(active, since 75s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: Health check failed: 1/5 mons down, quorum np0005541911,np0005541910,np0005541914,np0005541913 (MON_DOWN)
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005541911,np0005541910,np0005541914,np0005541913
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005541911,np0005541910,np0005541914,np0005541913
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]:     mon.np0005541912 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541911 (monmap changed)...
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541911 on np0005541911.localdomain
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:02 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:55:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:02.385 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:02 np0005541913.localdomain ceph-mon[289473]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:55:02 np0005541913.localdomain ceph-mon[289473]: paxos.3).electionLogic(40) init, last seen epoch 40
Dec 02 09:55:02 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:55:02 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:55:02 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:55:02 np0005541913.localdomain podman[293528]: 2025-12-02 09:55:02.44521396 +0000 UTC m=+0.083316201 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 09:55:02 np0005541913.localdomain podman[293528]: 2025-12-02 09:55:02.480037692 +0000 UTC m=+0.118139873 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:55:02 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:55:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:55:03.040 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:55:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:55:03.040 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:55:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:55:03.042 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: mon.np0005541912 calling monitor election
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.2 (monmap changed)...
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: mon.np0005541910 calling monitor election
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: mon.np0005541912 calling monitor election
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913 calling monitor election
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 is new leader, mons np0005541911,np0005541910,np0005541914,np0005541913,np0005541912 in quorum (ranks 0,1,2,3,4)
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: monmap epoch 9
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:54:55.352641+0000
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541910
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: osdmap e86: 6 total, 6 up, 6 in
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: mgrmap e19: np0005541911.adcgiw(active, since 77s), standbys: np0005541914.lljzmk, np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005541911,np0005541910,np0005541914,np0005541913)
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: Cluster is now healthy
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: overall HEALTH_OK
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:03 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:03.590 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:55:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:55:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:55:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:55:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:55:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:55:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:55:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.5 (monmap changed)...
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.32:0/371118405' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 09:55:04 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.32:0/371118405' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 09:55:04 np0005541913.localdomain podman[293546]: 2025-12-02 09:55:04.429814503 +0000 UTC m=+0.068470263 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 09:55:04 np0005541913.localdomain podman[293546]: 2025-12-02 09:55:04.444946938 +0000 UTC m=+0.083602718 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.buildah.version=1.33.7, version=9.6)
Dec 02 09:55:04 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:05 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:05 np0005541913.localdomain sudo[293566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:05 np0005541913.localdomain sudo[293566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:05 np0005541913.localdomain sudo[293566]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:05 np0005541913.localdomain sudo[293584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:05 np0005541913.localdomain sudo[293584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:55:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:55:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:55:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:55:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:55:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18706 "" "Go-http-client/1.1"
Dec 02 09:55:06 np0005541913.localdomain podman[293618]: 
Dec 02 09:55:06 np0005541913.localdomain podman[293618]: 2025-12-02 09:55:06.420685313 +0000 UTC m=+0.079476317 container create 61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bose, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:55:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61.scope.
Dec 02 09:55:06 np0005541913.localdomain podman[293618]: 2025-12-02 09:55:06.388133492 +0000 UTC m=+0.046924516 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:55:06 np0005541913.localdomain podman[293618]: 2025-12-02 09:55:06.519242171 +0000 UTC m=+0.178033145 container init 61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bose, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=1763362218, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph)
Dec 02 09:55:06 np0005541913.localdomain systemd[1]: tmp-crun.kbVurg.mount: Deactivated successfully.
Dec 02 09:55:06 np0005541913.localdomain podman[293618]: 2025-12-02 09:55:06.53751419 +0000 UTC m=+0.196305164 container start 61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bose, release=1763362218, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.41.4, GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:55:06 np0005541913.localdomain podman[293618]: 2025-12-02 09:55:06.538552948 +0000 UTC m=+0.197343922 container attach 61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bose, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, ceph=True, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:55:06 np0005541913.localdomain elegant_bose[293634]: 167 167
Dec 02 09:55:06 np0005541913.localdomain systemd[1]: libpod-61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61.scope: Deactivated successfully.
Dec 02 09:55:06 np0005541913.localdomain podman[293618]: 2025-12-02 09:55:06.543657155 +0000 UTC m=+0.202448159 container died 61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bose, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, io.openshift.expose-services=, name=rhceph, vcs-type=git, build-date=2025-11-26T19:44:28Z)
Dec 02 09:55:06 np0005541913.localdomain podman[293637]: 2025-12-02 09:55:06.601425011 +0000 UTC m=+0.086759793 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:55:06 np0005541913.localdomain podman[293637]: 2025-12-02 09:55:06.614941092 +0000 UTC m=+0.100275844 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:55:06 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:55:06 np0005541913.localdomain podman[293650]: 2025-12-02 09:55:06.704226991 +0000 UTC m=+0.143315966 container remove 61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bose, name=rhceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, ceph=True, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:55:06 np0005541913.localdomain systemd[1]: libpod-conmon-61feebfcfc6f867124b867e58fe5b8f804ab18b8e9b4f2489f6e029af09a0b61.scope: Deactivated successfully.
Dec 02 09:55:06 np0005541913.localdomain sudo[293584]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:06 np0005541913.localdomain sudo[293678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.200:0/432160104' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:55:06 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:06 np0005541913.localdomain sudo[293678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:06 np0005541913.localdomain sudo[293678]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:06 np0005541913.localdomain sudo[293696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:06 np0005541913.localdomain sudo[293696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:07.387 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-9f955240051d03777a505466749a19601f44f417af121f0dcc452e65d8616616-merged.mount: Deactivated successfully.
Dec 02 09:55:07 np0005541913.localdomain podman[293732]: 
Dec 02 09:55:07 np0005541913.localdomain podman[293732]: 2025-12-02 09:55:07.472550154 +0000 UTC m=+0.089681631 container create 678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_swanson, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, release=1763362218, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:55:07 np0005541913.localdomain systemd[1]: Started libpod-conmon-678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646.scope.
Dec 02 09:55:07 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:07 np0005541913.localdomain podman[293732]: 2025-12-02 09:55:07.43913228 +0000 UTC m=+0.056263827 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:07 np0005541913.localdomain podman[293732]: 2025-12-02 09:55:07.542600809 +0000 UTC m=+0.159732306 container init 678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_swanson, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, version=7)
Dec 02 09:55:07 np0005541913.localdomain podman[293732]: 2025-12-02 09:55:07.555083873 +0000 UTC m=+0.172215400 container start 678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_swanson, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, version=7, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:55:07 np0005541913.localdomain podman[293732]: 2025-12-02 09:55:07.555686788 +0000 UTC m=+0.172818295 container attach 678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_swanson, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, release=1763362218, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 09:55:07 np0005541913.localdomain affectionate_swanson[293746]: 167 167
Dec 02 09:55:07 np0005541913.localdomain podman[293732]: 2025-12-02 09:55:07.575841688 +0000 UTC m=+0.192973205 container died 678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_swanson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, RELEASE=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:55:07 np0005541913.localdomain systemd[1]: libpod-678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646.scope: Deactivated successfully.
Dec 02 09:55:07 np0005541913.localdomain podman[293751]: 2025-12-02 09:55:07.689257353 +0000 UTC m=+0.098587099 container remove 678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_swanson, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, RELEASE=main, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:55:07 np0005541913.localdomain systemd[1]: libpod-conmon-678d54a6eca58f9b08c69d52db1d073ca5850d93bc2f570eae232b4cba262646.scope: Deactivated successfully.
Dec 02 09:55:07 np0005541913.localdomain sudo[293696]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.0 (monmap changed)...
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: from='client.26891 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: Reconfig service osd.default_drive_group
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:07 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain sudo[293774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:08 np0005541913.localdomain sudo[293774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:08 np0005541913.localdomain sudo[293774]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:08 np0005541913.localdomain sudo[293792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:08 np0005541913.localdomain sudo[293792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-849e1f0b64969c4c08c1f26ff847021db1cd8ae7309d369f35cd8de39c3701ce-merged.mount: Deactivated successfully.
Dec 02 09:55:08 np0005541913.localdomain podman[293827]: 
Dec 02 09:55:08 np0005541913.localdomain podman[293827]: 2025-12-02 09:55:08.595125916 +0000 UTC m=+0.070063796 container create d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_zhukovsky, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., version=7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.buildah.version=1.41.4, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Dec 02 09:55:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:08.593 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:08 np0005541913.localdomain systemd[1]: Started libpod-conmon-d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6.scope.
Dec 02 09:55:08 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:08 np0005541913.localdomain podman[293827]: 2025-12-02 09:55:08.56495831 +0000 UTC m=+0.039896270 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:08 np0005541913.localdomain podman[293827]: 2025-12-02 09:55:08.663663401 +0000 UTC m=+0.138601321 container init d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_zhukovsky, io.buildah.version=1.41.4, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_BRANCH=main, architecture=x86_64)
Dec 02 09:55:08 np0005541913.localdomain podman[293827]: 2025-12-02 09:55:08.677100881 +0000 UTC m=+0.152038791 container start d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_zhukovsky, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Dec 02 09:55:08 np0005541913.localdomain podman[293827]: 2025-12-02 09:55:08.677368248 +0000 UTC m=+0.152306198 container attach d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_zhukovsky, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=1763362218, build-date=2025-11-26T19:44:28Z, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:55:08 np0005541913.localdomain stoic_zhukovsky[293842]: 167 167
Dec 02 09:55:08 np0005541913.localdomain systemd[1]: libpod-d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6.scope: Deactivated successfully.
Dec 02 09:55:08 np0005541913.localdomain podman[293827]: 2025-12-02 09:55:08.682938457 +0000 UTC m=+0.157876367 container died d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_zhukovsky, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, release=1763362218, vcs-type=git, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Dec 02 09:55:08 np0005541913.localdomain podman[293847]: 2025-12-02 09:55:08.774795814 +0000 UTC m=+0.083251328 container remove d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_zhukovsky, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:55:08 np0005541913.localdomain systemd[1]: libpod-conmon-d784826d7cab897b10e1d7325f678ac5523e09a4bcae33d290e7c0e44a390eb6.scope: Deactivated successfully.
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.3 (monmap changed)...
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:08 np0005541913.localdomain sudo[293792]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:09 np0005541913.localdomain sudo[293870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:09 np0005541913.localdomain sudo[293870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:09 np0005541913.localdomain sudo[293870]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:09 np0005541913.localdomain sudo[293888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:09 np0005541913.localdomain sudo[293888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c4f2212c0e5df612c2a81411783f5e19957d870cc9113eadc8fd50dfa0cd686b-merged.mount: Deactivated successfully.
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e87 e87: 6 total, 6 up, 6 in
Dec 02 09:55:09 np0005541913.localdomain podman[293923]: 
Dec 02 09:55:09 np0005541913.localdomain podman[293923]: 2025-12-02 09:55:09.622746187 +0000 UTC m=+0.078547643 container create 76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_gould, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_CLEAN=True)
Dec 02 09:55:09 np0005541913.localdomain sshd[290313]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:55:09 np0005541913.localdomain systemd[1]: Started libpod-conmon-76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43.scope.
Dec 02 09:55:09 np0005541913.localdomain systemd-logind[757]: Session 65 logged out. Waiting for processes to exit.
Dec 02 09:55:09 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:09 np0005541913.localdomain podman[293923]: 2025-12-02 09:55:09.594310346 +0000 UTC m=+0.050111832 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:09 np0005541913.localdomain podman[293923]: 2025-12-02 09:55:09.704992487 +0000 UTC m=+0.160793923 container init 76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_gould, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, version=7, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:55:09 np0005541913.localdomain podman[293923]: 2025-12-02 09:55:09.71593841 +0000 UTC m=+0.171739866 container start 76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_gould, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, vcs-type=git, vendor=Red Hat, Inc., release=1763362218, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public)
Dec 02 09:55:09 np0005541913.localdomain podman[293923]: 2025-12-02 09:55:09.716215678 +0000 UTC m=+0.172017114 container attach 76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_gould, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, description=Red Hat Ceph Storage 7, version=7)
Dec 02 09:55:09 np0005541913.localdomain happy_gould[293938]: 167 167
Dec 02 09:55:09 np0005541913.localdomain systemd[1]: libpod-76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43.scope: Deactivated successfully.
Dec 02 09:55:09 np0005541913.localdomain podman[293923]: 2025-12-02 09:55:09.718948911 +0000 UTC m=+0.174750347 container died 76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_gould, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, RELEASE=main)
Dec 02 09:55:09 np0005541913.localdomain podman[293943]: 2025-12-02 09:55:09.830230939 +0000 UTC m=+0.096524314 container remove 76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_gould, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, name=rhceph, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, version=7, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:55:09 np0005541913.localdomain systemd[1]: libpod-conmon-76b0abb565f877404435e1c7109f89385b9711965607944334c280e2dca01d43.scope: Deactivated successfully.
Dec 02 09:55:09 np0005541913.localdomain sudo[293888]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:09 np0005541913.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Dec 02 09:55:09 np0005541913.localdomain systemd[1]: session-65.scope: Consumed 19.772s CPU time.
Dec 02 09:55:09 np0005541913.localdomain systemd-logind[757]: Removed session 65.
Dec 02 09:55:09 np0005541913.localdomain sshd[293959]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' 
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.14184 172.18.0.105:0/1560580735' entity='mgr.np0005541911.adcgiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.200:0/2202206912' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: Activating manager daemon np0005541914.lljzmk
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: osdmap e87: 6 total, 6 up, 6 in
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: mgrmap e20: np0005541914.lljzmk(active, starting, since 0.198391s), standbys: np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541910"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mds metadata", "who": "mds.np0005541914.sqgqkj"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mds metadata", "who": "mds.np0005541913.maexpe"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mds metadata", "who": "mds.np0005541912.ghcwcm"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541914.lljzmk", "id": "np0005541914.lljzmk"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541910.kzipdo", "id": "np0005541910.kzipdo"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541913.mfesdm", "id": "np0005541913.mfesdm"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541912.qwddia", "id": "np0005541912.qwddia"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541909.kfesnk", "id": "np0005541909.kfesnk"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mds metadata"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: Manager daemon np0005541914.lljzmk is now available
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541909.localdomain.devices.0"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541909.localdomain.devices.0"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541909.localdomain.devices.0"}]': finished
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541909.localdomain.devices.0"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541909.localdomain.devices.0"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541909.localdomain.devices.0"}]': finished
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541914.lljzmk/mirror_snapshot_schedule"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541914.lljzmk/mirror_snapshot_schedule"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541914.lljzmk/trash_purge_schedule"} : dispatch
Dec 02 09:55:09 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541914.lljzmk/trash_purge_schedule"} : dispatch
Dec 02 09:55:10 np0005541913.localdomain sshd[293959]: Accepted publickey for ceph-admin from 192.168.122.108 port 44994 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 09:55:10 np0005541913.localdomain systemd-logind[757]: New session 66 of user ceph-admin.
Dec 02 09:55:10 np0005541913.localdomain systemd[1]: Started Session 66 of User ceph-admin.
Dec 02 09:55:10 np0005541913.localdomain sshd[293959]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 09:55:10 np0005541913.localdomain sudo[293963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:10 np0005541913.localdomain sudo[293963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:10 np0005541913.localdomain sudo[293963]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:10 np0005541913.localdomain sudo[293981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:55:10 np0005541913.localdomain sudo[293981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8401a8d1babdb6458ecb02658dbc12bf4dd9c1ed37ac5282f3d398e501423c8a-merged.mount: Deactivated successfully.
Dec 02 09:55:10 np0005541913.localdomain ceph-mon[289473]: removing stray HostCache host record np0005541909.localdomain.devices.0
Dec 02 09:55:10 np0005541913.localdomain ceph-mon[289473]: mgrmap e21: np0005541914.lljzmk(active, since 1.22617s), standbys: np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:55:11 np0005541913.localdomain podman[294069]: 2025-12-02 09:55:11.042881948 +0000 UTC m=+0.076453707 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, release=1763362218)
Dec 02 09:55:11 np0005541913.localdomain podman[294069]: 2025-12-02 09:55:11.128981642 +0000 UTC m=+0.162553371 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, RELEASE=main, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 02 09:55:11 np0005541913.localdomain sudo[293981]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:11 np0005541913.localdomain sudo[294191]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:11 np0005541913.localdomain sudo[294191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:11 np0005541913.localdomain sudo[294191]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:11 np0005541913.localdomain sudo[294209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:55:11 np0005541913.localdomain sudo[294209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:55:10] ENGINE Bus STARTING
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:55:10] ENGINE Serving on http://172.18.0.108:8765
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:55:10] ENGINE Serving on https://172.18.0.108:7150
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:55:10] ENGINE Client ('172.18.0.108', 34066) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: [02/Dec/2025:09:55:10] ENGINE Bus STARTED
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:11 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:12 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:12.391 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:12 np0005541913.localdomain sudo[294209]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:12 np0005541913.localdomain sudo[294260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:12 np0005541913.localdomain sudo[294260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:55:12 np0005541913.localdomain sudo[294260]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:12 np0005541913.localdomain sudo[294279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 09:55:12 np0005541913.localdomain sudo[294279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:12 np0005541913.localdomain systemd[1]: tmp-crun.sIodTz.mount: Deactivated successfully.
Dec 02 09:55:12 np0005541913.localdomain podman[294278]: 2025-12-02 09:55:12.79045736 +0000 UTC m=+0.112464260 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Dec 02 09:55:12 np0005541913.localdomain podman[294278]: 2025-12-02 09:55:12.828462937 +0000 UTC m=+0.150469847 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true)
Dec 02 09:55:12 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd/host:np0005541911", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd/host:np0005541911", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd/host:np0005541910", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd/host:np0005541910", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:13 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:13 np0005541913.localdomain sudo[294279]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:55:13 np0005541913.localdomain sudo[294334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294334]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:55:13 np0005541913.localdomain sudo[294352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294352]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:55:13 np0005541913.localdomain sudo[294370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294370]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:13 np0005541913.localdomain sudo[294388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294388]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:55:13 np0005541913.localdomain sudo[294406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294406]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:13.596 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:13 np0005541913.localdomain sudo[294440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:55:13 np0005541913.localdomain sudo[294440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294440]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:55:13 np0005541913.localdomain sudo[294458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294458]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:55:13 np0005541913.localdomain sudo[294476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294476]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:55:13 np0005541913.localdomain sudo[294494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294494]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:55:13 np0005541913.localdomain sudo[294512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294512]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:13 np0005541913.localdomain sudo[294530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:55:13 np0005541913.localdomain sudo[294530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:13 np0005541913.localdomain sudo[294530]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:14 np0005541913.localdomain sudo[294548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294548]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: mgrmap e22: np0005541914.lljzmk(active, since 3s), standbys: np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/etc/ceph/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain sudo[294566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:55:14 np0005541913.localdomain sudo[294566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294566]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:55:14 np0005541913.localdomain sudo[294600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294600]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:55:14 np0005541913.localdomain sudo[294618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294618]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:55:14 np0005541913.localdomain sudo[294636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294636]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:55:14 np0005541913.localdomain sudo[294654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294654]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:55:14 np0005541913.localdomain sudo[294672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294672]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:55:14 np0005541913.localdomain sudo[294690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294690]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:14 np0005541913.localdomain sudo[294708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294708]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:55:14 np0005541913.localdomain sudo[294726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294726]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:55:14 np0005541913.localdomain sudo[294760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294760]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:14 np0005541913.localdomain sudo[294778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:55:14 np0005541913.localdomain sudo[294778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:14 np0005541913.localdomain sudo[294778]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain sudo[294796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain sudo[294796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:55:15 np0005541913.localdomain sudo[294796]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:55:15 np0005541913.localdomain sudo[294826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:55:15 np0005541913.localdomain sudo[294826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain sudo[294826]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain systemd[1]: tmp-crun.V99SdH.mount: Deactivated successfully.
Dec 02 09:55:15 np0005541913.localdomain podman[294813]: 2025-12-02 09:55:15.161505086 +0000 UTC m=+0.106215153 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:55:15 np0005541913.localdomain podman[294813]: 2025-12-02 09:55:15.200050917 +0000 UTC m=+0.144760964 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:55:15 np0005541913.localdomain sudo[294859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:55:15 np0005541913.localdomain sudo[294859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain sudo[294859]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:55:15 np0005541913.localdomain podman[294815]: 2025-12-02 09:55:15.247374864 +0000 UTC m=+0.190845598 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 02 09:55:15 np0005541913.localdomain sudo[294884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:55:15 np0005541913.localdomain sudo[294884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain sudo[294884]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain podman[294815]: 2025-12-02 09:55:15.333187911 +0000 UTC m=+0.276658635 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:55:15 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Standby manager daemon np0005541911.adcgiw started
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541910.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain sudo[294916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:15 np0005541913.localdomain sudo[294916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain sudo[294916]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain sudo[294934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:55:15 np0005541913.localdomain sudo[294934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain sudo[294934]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain sudo[294968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:55:15 np0005541913.localdomain sudo[294968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain sudo[294968]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain sudo[294986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:55:15 np0005541913.localdomain sudo[294986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain sudo[294986]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:15 np0005541913.localdomain sudo[295004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:55:15 np0005541913.localdomain sudo[295004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:15 np0005541913.localdomain sudo[295004]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:16 np0005541913.localdomain sudo[295022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:55:16 np0005541913.localdomain sudo[295022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:16 np0005541913.localdomain sudo[295022]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541911.adcgiw", "id": "np0005541911.adcgiw"} : dispatch
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: mgrmap e23: np0005541914.lljzmk(active, since 6s), standbys: np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541909.kfesnk, np0005541911.adcgiw
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 22 op/s
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541910.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541910.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:16 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:17.395 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541910 (monmap changed)...
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541910 on np0005541910.localdomain
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541910.kzipdo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541910.kzipdo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:17 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:18.598 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:19 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541910.kzipdo (monmap changed)...
Dec 02 09:55:19 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541910.kzipdo on np0005541910.localdomain
Dec 02 09:55:19 np0005541913.localdomain ceph-mon[289473]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 0 B/s wr, 16 op/s
Dec 02 09:55:19 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:19 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:19 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:55:19 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:19 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:21 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:55:22 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:22.398 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:22 np0005541913.localdomain ceph-mon[289473]: from='client.34293 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:55:22 np0005541913.localdomain ceph-mon[289473]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Dec 02 09:55:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:23.600 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='client.26775 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: Saving service mon spec with placement label:mon
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:55:25 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:25 np0005541913.localdomain sudo[295040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:25 np0005541913.localdomain sudo[295040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:25 np0005541913.localdomain sudo[295040]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:25 np0005541913.localdomain sudo[295058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:25 np0005541913.localdomain sudo[295058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Dec 02 09:55:26 np0005541913.localdomain ceph-mon[289473]: from='client.34303 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541912", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:55:26 np0005541913.localdomain podman[295095]: 
Dec 02 09:55:26 np0005541913.localdomain podman[295095]: 2025-12-02 09:55:26.300320134 +0000 UTC m=+0.085884538 container create ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_brahmagupta, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218)
Dec 02 09:55:26 np0005541913.localdomain systemd[1]: Started libpod-conmon-ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e.scope.
Dec 02 09:55:26 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:26 np0005541913.localdomain podman[295095]: 2025-12-02 09:55:26.269463819 +0000 UTC m=+0.055028273 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:26 np0005541913.localdomain podman[295095]: 2025-12-02 09:55:26.376095423 +0000 UTC m=+0.161659857 container init ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_brahmagupta, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_CLEAN=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph)
Dec 02 09:55:26 np0005541913.localdomain podman[295095]: 2025-12-02 09:55:26.384652072 +0000 UTC m=+0.170216506 container start ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_brahmagupta, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.41.4, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, name=rhceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git)
Dec 02 09:55:26 np0005541913.localdomain podman[295095]: 2025-12-02 09:55:26.384927569 +0000 UTC m=+0.170492033 container attach ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_brahmagupta, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=1763362218, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True)
Dec 02 09:55:26 np0005541913.localdomain condescending_brahmagupta[295110]: 167 167
Dec 02 09:55:26 np0005541913.localdomain systemd[1]: libpod-ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e.scope: Deactivated successfully.
Dec 02 09:55:26 np0005541913.localdomain podman[295095]: 2025-12-02 09:55:26.391444153 +0000 UTC m=+0.177008627 container died ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_brahmagupta, version=7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, release=1763362218, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git)
Dec 02 09:55:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:55:26 np0005541913.localdomain podman[295115]: 2025-12-02 09:55:26.509187004 +0000 UTC m=+0.104144788 container remove ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_brahmagupta, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:55:26 np0005541913.localdomain systemd[1]: libpod-conmon-ead4efb670ba7986942a41e576112c87a7eccfb14a8585f9a2c97bb44531a16e.scope: Deactivated successfully.
Dec 02 09:55:26 np0005541913.localdomain sudo[295058]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:26 np0005541913.localdomain podman[295127]: 2025-12-02 09:55:26.595089113 +0000 UTC m=+0.101089086 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3)
Dec 02 09:55:26 np0005541913.localdomain podman[295127]: 2025-12-02 09:55:26.609054177 +0000 UTC m=+0.115054200 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:55:26 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:55:26 np0005541913.localdomain sudo[295151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:26 np0005541913.localdomain sudo[295151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:26 np0005541913.localdomain sudo[295151]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:26 np0005541913.localdomain sudo[295169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:26 np0005541913.localdomain sudo[295169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:27 np0005541913.localdomain podman[295204]: 
Dec 02 09:55:27 np0005541913.localdomain podman[295204]: 2025-12-02 09:55:27.251121087 +0000 UTC m=+0.070075086 container create a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_varahamihira, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, version=7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, release=1763362218, com.redhat.component=rhceph-container)
Dec 02 09:55:27 np0005541913.localdomain systemd[1]: Started libpod-conmon-a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037.scope.
Dec 02 09:55:27 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0c7c9c2f61f9960fbf47d176668d7d64f15b6a30c632a994c2abcbd0ac2504cf-merged.mount: Deactivated successfully.
Dec 02 09:55:27 np0005541913.localdomain podman[295204]: 2025-12-02 09:55:27.31474851 +0000 UTC m=+0.133702479 container init a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_varahamihira, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Dec 02 09:55:27 np0005541913.localdomain podman[295204]: 2025-12-02 09:55:27.32111155 +0000 UTC m=+0.140065519 container start a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_varahamihira, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, build-date=2025-11-26T19:44:28Z, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1763362218, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container)
Dec 02 09:55:27 np0005541913.localdomain podman[295204]: 2025-12-02 09:55:27.321333566 +0000 UTC m=+0.140287535 container attach a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_varahamihira, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.buildah.version=1.41.4, release=1763362218, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:55:27 np0005541913.localdomain vibrant_varahamihira[295219]: 167 167
Dec 02 09:55:27 np0005541913.localdomain podman[295204]: 2025-12-02 09:55:27.226413426 +0000 UTC m=+0.045367405 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:27 np0005541913.localdomain podman[295204]: 2025-12-02 09:55:27.340201711 +0000 UTC m=+0.159155670 container died a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_varahamihira, io.buildah.version=1.41.4, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True)
Dec 02 09:55:27 np0005541913.localdomain systemd[1]: libpod-a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037.scope: Deactivated successfully.
Dec 02 09:55:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-754a0a7cbc31610f7bd21c041c3b1e8e5eaf84ca00398450a433ed70430c1ab4-merged.mount: Deactivated successfully.
Dec 02 09:55:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:27.401 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:27 np0005541913.localdomain podman[295224]: 2025-12-02 09:55:27.413767609 +0000 UTC m=+0.068638138 container remove a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_varahamihira, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1763362218, architecture=x86_64, io.buildah.version=1.41.4)
Dec 02 09:55:27 np0005541913.localdomain systemd[1]: libpod-conmon-a64a010d249d0cac800180d92e2630489433cad8ead76612fc0a312da23c7037.scope: Deactivated successfully.
Dec 02 09:55:27 np0005541913.localdomain sudo[295169]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e9 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/555242505' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:55:27 np0005541913.localdomain sudo[295240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:27 np0005541913.localdomain sudo[295240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:27 np0005541913.localdomain sudo[295240]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.200:0/555242505' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:55:27 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:27 np0005541913.localdomain sudo[295258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:27 np0005541913.localdomain sudo[295258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:28 np0005541913.localdomain podman[295293]: 
Dec 02 09:55:28 np0005541913.localdomain podman[295293]: 2025-12-02 09:55:28.106026123 +0000 UTC m=+0.080736881 container create 5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sanderson, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, name=rhceph, architecture=x86_64, io.buildah.version=1.41.4, GIT_CLEAN=True, distribution-scope=public, release=1763362218, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:55:28 np0005541913.localdomain systemd[1]: Started libpod-conmon-5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a.scope.
Dec 02 09:55:28 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:28 np0005541913.localdomain podman[295293]: 2025-12-02 09:55:28.07302111 +0000 UTC m=+0.047731918 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:28 np0005541913.localdomain podman[295293]: 2025-12-02 09:55:28.174073684 +0000 UTC m=+0.148784452 container init 5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sanderson, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main)
Dec 02 09:55:28 np0005541913.localdomain podman[295293]: 2025-12-02 09:55:28.182641763 +0000 UTC m=+0.157352521 container start 5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sanderson, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, com.redhat.component=rhceph-container)
Dec 02 09:55:28 np0005541913.localdomain podman[295293]: 2025-12-02 09:55:28.183697392 +0000 UTC m=+0.158408130 container attach 5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sanderson, name=rhceph, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, RELEASE=main, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:55:28 np0005541913.localdomain eloquent_sanderson[295308]: 167 167
Dec 02 09:55:28 np0005541913.localdomain systemd[1]: libpod-5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a.scope: Deactivated successfully.
Dec 02 09:55:28 np0005541913.localdomain podman[295293]: 2025-12-02 09:55:28.186221009 +0000 UTC m=+0.160931837 container died 5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sanderson, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main)
Dec 02 09:55:28 np0005541913.localdomain podman[295314]: 2025-12-02 09:55:28.276922256 +0000 UTC m=+0.079684193 container remove 5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sanderson, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, version=7, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z)
Dec 02 09:55:28 np0005541913.localdomain systemd[1]: libpod-conmon-5dfa9f0cd212cda7f4431fa5349aeb70d7991e8c29b63aec1b24d41de6d5785a.scope: Deactivated successfully.
Dec 02 09:55:28 np0005541913.localdomain sudo[295258]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:28.603 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:28 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541913 (monmap changed)...
Dec 02 09:55:28 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541913 on np0005541913.localdomain
Dec 02 09:55:28 np0005541913.localdomain ceph-mon[289473]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:28 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:28 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:28 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:28 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:28 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:29 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:55:29 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:55:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:55:29 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.1 (monmap changed)...
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:55:30 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.4 (monmap changed)...
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:31 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:32.404 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.200:0/2343995021' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:32 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:55:33 np0005541913.localdomain podman[295331]: 2025-12-02 09:55:33.468359478 +0000 UTC m=+0.098969248 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:55:33 np0005541913.localdomain podman[295331]: 2025-12-02 09:55:33.477058971 +0000 UTC m=+0.107668751 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 02 09:55:33 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:55:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:33.606 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:33 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:55:33 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:55:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:55:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:55:33 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:55:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:55:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:55:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:55:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:55:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:55:34 np0005541913.localdomain sudo[295349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:55:34 np0005541913.localdomain sudo[295349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:34 np0005541913.localdomain sudo[295349]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.396346) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669334396411, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2356, "num_deletes": 265, "total_data_size": 8432363, "memory_usage": 9082592, "flush_reason": "Manual Compaction"}
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669334426644, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 4802871, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13710, "largest_seqno": 16061, "table_properties": {"data_size": 4793637, "index_size": 5483, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 23848, "raw_average_key_size": 22, "raw_value_size": 4773312, "raw_average_value_size": 4456, "num_data_blocks": 229, "num_entries": 1071, "num_filter_entries": 1071, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669289, "oldest_key_time": 1764669289, "file_creation_time": 1764669334, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 30348 microseconds, and 10975 cpu microseconds.
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.426693) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 4802871 bytes OK
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.426720) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.428616) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.428634) EVENT_LOG_v1 {"time_micros": 1764669334428629, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.428674) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 8420670, prev total WAL file size 8420670, number of live WAL files 2.
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.429687) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303238' seq:72057594037927935, type:22 .. '6B760031323930' seq:0, type:0; will stop at (end)
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(4690KB)], [21(12MB)]
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669334429722, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 17571805, "oldest_snapshot_seqno": -1}
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10249 keys, 16745842 bytes, temperature: kUnknown
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669334546174, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 16745842, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16684060, "index_size": 35057, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25669, "raw_key_size": 275342, "raw_average_key_size": 26, "raw_value_size": 16505205, "raw_average_value_size": 1610, "num_data_blocks": 1339, "num_entries": 10249, "num_filter_entries": 10249, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669201, "oldest_key_time": 0, "file_creation_time": 1764669334, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.546597) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 16745842 bytes
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.548835) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.7 rd, 143.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.6, 12.2 +0.0 blob) out(16.0 +0.0 blob), read-write-amplify(7.1) write-amplify(3.5) OK, records in: 10739, records dropped: 490 output_compression: NoCompression
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.548872) EVENT_LOG_v1 {"time_micros": 1764669334548856, "job": 10, "event": "compaction_finished", "compaction_time_micros": 116589, "compaction_time_cpu_micros": 23488, "output_level": 6, "num_output_files": 1, "total_output_size": 16745842, "num_input_records": 10739, "num_output_records": 10249, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669334549765, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669334551655, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.429578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.551807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.551817) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.551823) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.551827) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:34.551831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541914 (monmap changed)...
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='client.26961 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541910", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: mgrmap e24: np0005541914.lljzmk(active, since 24s), standbys: np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541911.adcgiw
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:55:34 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:35 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x564503730f20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 02 09:55:35 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@3(peon) e10  my rank is now 2 (was 3)
Dec 02 09:55:35 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Dec 02 09:55:35 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Dec 02 09:55:35 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x56450d29a000 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Dec 02 09:55:35 np0005541913.localdomain ceph-mon[289473]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:55:35 np0005541913.localdomain ceph-mon[289473]: paxos.2).electionLogic(42) init, last seen epoch 42
Dec 02 09:55:35 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:55:35 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:55:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:55:35 np0005541913.localdomain systemd[1]: tmp-crun.tRPmeM.mount: Deactivated successfully.
Dec 02 09:55:35 np0005541913.localdomain podman[295367]: 2025-12-02 09:55:35.459834528 +0000 UTC m=+0.097631753 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 09:55:35 np0005541913.localdomain podman[295367]: 2025-12-02 09:55:35.475170329 +0000 UTC m=+0.112967584 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter)
Dec 02 09:55:35 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:55:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:55:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:55:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:55:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:55:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:55:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18724 "" "Go-http-client/1.1"
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541910 on np0005541910.localdomain
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='client.34287 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005541910"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: Remove daemons mon.np0005541910
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "quorum_status"} : dispatch
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: Safe to remove mon.np0005541910: new quorum should be ['np0005541911', 'np0005541914', 'np0005541913', 'np0005541912'] (from ['np0005541911', 'np0005541914', 'np0005541913', 'np0005541912'])
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: Removing monitor np0005541910 from monmap...
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon rm", "name": "np0005541910"} : dispatch
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: Removing daemon mon.np0005541910 from np0005541910.localdomain -- ports []
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: mon.np0005541912 calling monitor election
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913 calling monitor election
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: mon.np0005541914 calling monitor election
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 calling monitor election
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: mon.np0005541911 is new leader, mons np0005541911,np0005541914,np0005541913,np0005541912 in quorum (ranks 0,1,2,3)
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: monmap epoch 10
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: last_changed 2025-12-02T09:55:35.077328+0000
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: min_mon_release 18 (reef)
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: election_strategy: 1
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005541913
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: osdmap e87: 6 total, 6 up, 6 in
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: mgrmap e24: np0005541914.lljzmk(active, since 27s), standbys: np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541911.adcgiw
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]:     stray daemon mgr.np0005541909.kfesnk on host np0005541909.localdomain not managed by cephadm
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]:     stray host np0005541909.localdomain has 1 stray daemons: ['mgr.np0005541909.kfesnk']
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:37 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:55:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:37.407 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:37 np0005541913.localdomain podman[295386]: 2025-12-02 09:55:37.415132501 +0000 UTC m=+0.061565399 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:55:37 np0005541913.localdomain podman[295386]: 2025-12-02 09:55:37.420120054 +0000 UTC m=+0.066552902 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:55:37 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541910.kzipdo (monmap changed)...
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541910.kzipdo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541910.kzipdo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541910.kzipdo on np0005541910.localdomain
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:55:38 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:38.609 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541911 (monmap changed)...
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: from='client.34330 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541910.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: Removed label mon from host np0005541910.localdomain
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541911.adcgiw (monmap changed)...
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541911.adcgiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541911.adcgiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:39 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541911.adcgiw on np0005541911.localdomain
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.855294) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669340855344, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 528, "num_deletes": 251, "total_data_size": 502557, "memory_usage": 512496, "flush_reason": "Manual Compaction"}
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669340859972, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 315167, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16067, "largest_seqno": 16589, "table_properties": {"data_size": 312185, "index_size": 901, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8645, "raw_average_key_size": 21, "raw_value_size": 305665, "raw_average_value_size": 764, "num_data_blocks": 37, "num_entries": 400, "num_filter_entries": 400, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669334, "oldest_key_time": 1764669334, "file_creation_time": 1764669340, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 4723 microseconds, and 1593 cpu microseconds.
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.860018) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 315167 bytes OK
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.860043) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.861320) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.861344) EVENT_LOG_v1 {"time_micros": 1764669340861337, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.861369) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 499246, prev total WAL file size 499246, number of live WAL files 2.
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.861978) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(307KB)], [24(15MB)]
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669340862044, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 17061009, "oldest_snapshot_seqno": -1}
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 10122 keys, 14985663 bytes, temperature: kUnknown
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669340965808, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 14985663, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14926440, "index_size": 32818, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 273589, "raw_average_key_size": 27, "raw_value_size": 14751431, "raw_average_value_size": 1457, "num_data_blocks": 1241, "num_entries": 10122, "num_filter_entries": 10122, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669201, "oldest_key_time": 0, "file_creation_time": 1764669340, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d9d4cd30-d7e1-42a3-a4ff-e4bd7db629d9", "db_session_id": "OW4D0W92HOAH7R2F6LZX", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.966639) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 14985663 bytes
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.969715) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 163.6 rd, 143.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 16.0 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(101.7) write-amplify(47.5) OK, records in: 10649, records dropped: 527 output_compression: NoCompression
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.969758) EVENT_LOG_v1 {"time_micros": 1764669340969738, "job": 12, "event": "compaction_finished", "compaction_time_micros": 104268, "compaction_time_cpu_micros": 27768, "output_level": 6, "num_output_files": 1, "total_output_size": 14985663, "num_input_records": 10649, "num_output_records": 10122, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669340970525, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669340973659, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.861844) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.973932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.973941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.973944) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.973947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: rocksdb: (Original Log Time 2025/12/02-09:55:40.973950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541911 (monmap changed)...
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541911 on np0005541911.localdomain
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='client.34297 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541910.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:40 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: Removed label mgr from host np0005541910.localdomain
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:42 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:42.409 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:43 np0005541913.localdomain ceph-mon[289473]: from='client.26796 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541910.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:55:43 np0005541913.localdomain ceph-mon[289473]: Removed label _admin from host np0005541910.localdomain
Dec 02 09:55:43 np0005541913.localdomain ceph-mon[289473]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:43 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.2 (monmap changed)...
Dec 02 09:55:43 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:55:43 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:55:43 np0005541913.localdomain podman[295410]: 2025-12-02 09:55:43.43817216 +0000 UTC m=+0.079431456 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 02 09:55:43 np0005541913.localdomain podman[295410]: 2025-12-02 09:55:43.4811638 +0000 UTC m=+0.122423106 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 09:55:43 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:55:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:43.612 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:44 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.5 (monmap changed)...
Dec 02 09:55:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:55:44 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:44 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:55:45 np0005541913.localdomain ceph-mon[289473]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:45 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:55:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:55:45 np0005541913.localdomain podman[295430]: 2025-12-02 09:55:45.445878274 +0000 UTC m=+0.085233872 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:55:45 np0005541913.localdomain podman[295430]: 2025-12-02 09:55:45.454474184 +0000 UTC m=+0.093829792 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:55:45 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:55:45 np0005541913.localdomain podman[295431]: 2025-12-02 09:55:45.500754042 +0000 UTC m=+0.132631240 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 09:55:45 np0005541913.localdomain podman[295431]: 2025-12-02 09:55:45.539073997 +0000 UTC m=+0.170951185 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 02 09:55:45 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:55:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:46.035 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:46.076 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:46.077 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:46.077 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:55:46 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:46.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:46.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:55:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:46.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:55:46 np0005541913.localdomain sudo[295477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:46 np0005541913.localdomain sudo[295477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:46 np0005541913.localdomain sudo[295477]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:47 np0005541913.localdomain sudo[295495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:47 np0005541913.localdomain sudo[295495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:47.413 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:47 np0005541913.localdomain podman[295530]: 
Dec 02 09:55:47 np0005541913.localdomain podman[295530]: 2025-12-02 09:55:47.427515209 +0000 UTC m=+0.044386679 container create 2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_morse, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-type=git, RELEASE=main, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z)
Dec 02 09:55:47 np0005541913.localdomain systemd[1]: Started libpod-conmon-2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d.scope.
Dec 02 09:55:47 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:47 np0005541913.localdomain podman[295530]: 2025-12-02 09:55:47.474123607 +0000 UTC m=+0.090995107 container init 2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_morse, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, ceph=True, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:55:47 np0005541913.localdomain systemd[1]: tmp-crun.UHRNTI.mount: Deactivated successfully.
Dec 02 09:55:47 np0005541913.localdomain podman[295530]: 2025-12-02 09:55:47.485229214 +0000 UTC m=+0.102100694 container start 2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_morse, distribution-scope=public, io.buildah.version=1.41.4, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_CLEAN=True)
Dec 02 09:55:47 np0005541913.localdomain podman[295530]: 2025-12-02 09:55:47.485787229 +0000 UTC m=+0.102658719 container attach 2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_morse, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, release=1763362218, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:55:47 np0005541913.localdomain keen_morse[295545]: 167 167
Dec 02 09:55:47 np0005541913.localdomain systemd[1]: libpod-2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d.scope: Deactivated successfully.
Dec 02 09:55:47 np0005541913.localdomain podman[295530]: 2025-12-02 09:55:47.488693956 +0000 UTC m=+0.105565476 container died 2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_morse, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:55:47 np0005541913.localdomain podman[295530]: 2025-12-02 09:55:47.408075429 +0000 UTC m=+0.024946949 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:47 np0005541913.localdomain podman[295550]: 2025-12-02 09:55:47.577962045 +0000 UTC m=+0.079558710 container remove 2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_morse, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.openshift.expose-services=, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4)
Dec 02 09:55:47 np0005541913.localdomain systemd[1]: libpod-conmon-2354d7bcabc3b73860c9c09c50b67f4e3bad2b3d20ee578e0110c1c5c8aafb6d.scope: Deactivated successfully.
Dec 02 09:55:47 np0005541913.localdomain sudo[295495]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:47.635 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:55:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:47.637 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:55:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:47.637 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:55:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:47.637 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:55:47 np0005541913.localdomain sudo[295566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:47 np0005541913.localdomain sudo[295566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:47 np0005541913.localdomain sudo[295566]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:47 np0005541913.localdomain sudo[295584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:47 np0005541913.localdomain sudo[295584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:55:47 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:48 np0005541913.localdomain podman[295619]: 
Dec 02 09:55:48 np0005541913.localdomain podman[295619]: 2025-12-02 09:55:48.271464102 +0000 UTC m=+0.072790369 container create cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_turing, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4)
Dec 02 09:55:48 np0005541913.localdomain systemd[1]: Started libpod-conmon-cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721.scope.
Dec 02 09:55:48 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:48 np0005541913.localdomain podman[295619]: 2025-12-02 09:55:48.333728879 +0000 UTC m=+0.135055146 container init cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_turing, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64)
Dec 02 09:55:48 np0005541913.localdomain podman[295619]: 2025-12-02 09:55:48.241766048 +0000 UTC m=+0.043092365 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:48 np0005541913.localdomain podman[295619]: 2025-12-02 09:55:48.345215276 +0000 UTC m=+0.146541553 container start cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_turing, CEPH_POINT_RELEASE=, version=7, name=rhceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, release=1763362218, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True)
Dec 02 09:55:48 np0005541913.localdomain podman[295619]: 2025-12-02 09:55:48.345555585 +0000 UTC m=+0.146881892 container attach cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_turing, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, release=1763362218, vendor=Red Hat, Inc., vcs-type=git)
Dec 02 09:55:48 np0005541913.localdomain nifty_turing[295635]: 167 167
Dec 02 09:55:48 np0005541913.localdomain systemd[1]: libpod-cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721.scope: Deactivated successfully.
Dec 02 09:55:48 np0005541913.localdomain podman[295619]: 2025-12-02 09:55:48.350806405 +0000 UTC m=+0.152132742 container died cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_turing, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, architecture=x86_64, vendor=Red Hat, Inc., version=7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:55:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-598a1355734ee37431373316f6aecb5fb85e77592d26354a91db8aff75419954-merged.mount: Deactivated successfully.
Dec 02 09:55:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b68deb4f9eb5f1eb957711509d66cf8879be8fad83b7c950899bf065c890600c-merged.mount: Deactivated successfully.
Dec 02 09:55:48 np0005541913.localdomain podman[295640]: 2025-12-02 09:55:48.464964389 +0000 UTC m=+0.100801437 container remove cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_turing, vcs-type=git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z)
Dec 02 09:55:48 np0005541913.localdomain systemd[1]: libpod-conmon-cb6a64f2e5ddf5cfc625b3f9958202481bb5fae5e3eaae79b3e72ba1d99a6721.scope: Deactivated successfully.
Dec 02 09:55:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:48.643 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:48 np0005541913.localdomain sudo[295584]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:48.761 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:55:48 np0005541913.localdomain sudo[295664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:48 np0005541913.localdomain sudo[295664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:48 np0005541913.localdomain sudo[295664]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:48 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.0 (monmap changed)...
Dec 02 09:55:48 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:55:48 np0005541913.localdomain ceph-mon[289473]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:55:48 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:48 np0005541913.localdomain sudo[295682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:48 np0005541913.localdomain sudo[295682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:49.263 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:55:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:49.264 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:55:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:49.265 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:49.265 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:49.266 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:49.266 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:49 np0005541913.localdomain podman[295717]: 
Dec 02 09:55:49 np0005541913.localdomain podman[295717]: 2025-12-02 09:55:49.386225231 +0000 UTC m=+0.079897398 container create 0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_saha, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_BRANCH=main)
Dec 02 09:55:49 np0005541913.localdomain systemd[1]: Started libpod-conmon-0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17.scope.
Dec 02 09:55:49 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:49 np0005541913.localdomain podman[295717]: 2025-12-02 09:55:49.352894379 +0000 UTC m=+0.046566586 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:49 np0005541913.localdomain podman[295717]: 2025-12-02 09:55:49.456679888 +0000 UTC m=+0.150352055 container init 0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_saha, io.openshift.expose-services=, release=1763362218, GIT_BRANCH=main, version=7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:55:49 np0005541913.localdomain systemd[1]: tmp-crun.dgPeU2.mount: Deactivated successfully.
Dec 02 09:55:49 np0005541913.localdomain podman[295717]: 2025-12-02 09:55:49.469252263 +0000 UTC m=+0.162924430 container start 0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_saha, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, name=rhceph, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main)
Dec 02 09:55:49 np0005541913.localdomain podman[295717]: 2025-12-02 09:55:49.470155277 +0000 UTC m=+0.163827444 container attach 0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_saha, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, distribution-scope=public, build-date=2025-11-26T19:44:28Z, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=)
Dec 02 09:55:49 np0005541913.localdomain pedantic_saha[295732]: 167 167
Dec 02 09:55:49 np0005541913.localdomain systemd[1]: libpod-0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17.scope: Deactivated successfully.
Dec 02 09:55:49 np0005541913.localdomain podman[295717]: 2025-12-02 09:55:49.472453879 +0000 UTC m=+0.166126076 container died 0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_saha, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4)
Dec 02 09:55:49 np0005541913.localdomain podman[295737]: 2025-12-02 09:55:49.568780897 +0000 UTC m=+0.087323928 container remove 0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_saha, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, architecture=x86_64)
Dec 02 09:55:49 np0005541913.localdomain systemd[1]: libpod-conmon-0a5f6dcc5993292ab514244312b8d5e7e1e1bd6e610650d9be8f48dcd1286e17.scope: Deactivated successfully.
Dec 02 09:55:49 np0005541913.localdomain sudo[295682]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:49.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:49.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:55:49 np0005541913.localdomain sudo[295760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:49 np0005541913.localdomain sudo[295760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:49 np0005541913.localdomain sudo[295760]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:49 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.3 (monmap changed)...
Dec 02 09:55:49 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:55:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:49 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:49 np0005541913.localdomain sudo[295778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:49 np0005541913.localdomain sudo[295778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:50 np0005541913.localdomain podman[295812]: 
Dec 02 09:55:50 np0005541913.localdomain podman[295812]: 2025-12-02 09:55:50.352020745 +0000 UTC m=+0.058561528 container create 0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chaplygin, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=)
Dec 02 09:55:50 np0005541913.localdomain systemd[1]: Started libpod-conmon-0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c.scope.
Dec 02 09:55:50 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:50 np0005541913.localdomain podman[295812]: 2025-12-02 09:55:50.322025552 +0000 UTC m=+0.028566385 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:50 np0005541913.localdomain podman[295812]: 2025-12-02 09:55:50.426707314 +0000 UTC m=+0.133248117 container init 0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chaplygin, ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Dec 02 09:55:50 np0005541913.localdomain brave_chaplygin[295829]: 167 167
Dec 02 09:55:50 np0005541913.localdomain podman[295812]: 2025-12-02 09:55:50.436577307 +0000 UTC m=+0.143118070 container start 0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chaplygin, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True)
Dec 02 09:55:50 np0005541913.localdomain podman[295812]: 2025-12-02 09:55:50.436913436 +0000 UTC m=+0.143454219 container attach 0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chaplygin, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, version=7, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:55:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0e963a5f3fd7734119a998d76ee4d1922f739a9a8ee7d9bb35b866259cbcf348-merged.mount: Deactivated successfully.
Dec 02 09:55:50 np0005541913.localdomain podman[295812]: 2025-12-02 09:55:50.439032443 +0000 UTC m=+0.145573206 container died 0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chaplygin, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.41.4, ceph=True)
Dec 02 09:55:50 np0005541913.localdomain systemd[1]: libpod-0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c.scope: Deactivated successfully.
Dec 02 09:55:50 np0005541913.localdomain systemd[1]: tmp-crun.8iFvgG.mount: Deactivated successfully.
Dec 02 09:55:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d2f6381695e455ae5f450c39646c3471924a3560fa9c693a84670ab1552b1462-merged.mount: Deactivated successfully.
Dec 02 09:55:50 np0005541913.localdomain podman[295834]: 2025-12-02 09:55:50.547540257 +0000 UTC m=+0.099655188 container remove 0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chaplygin, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, CEPH_POINT_RELEASE=)
Dec 02 09:55:50 np0005541913.localdomain systemd[1]: libpod-conmon-0730104ba80701935ef0a87e1961e49eed72f6d8b662022cd4c070cfdbb22e8c.scope: Deactivated successfully.
Dec 02 09:55:50 np0005541913.localdomain sudo[295778]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:50 np0005541913.localdomain sudo[295851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:50 np0005541913.localdomain sudo[295851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:50 np0005541913.localdomain sudo[295851]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:50.779 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:55:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:50.780 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:55:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:50.780 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:55:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:50.780 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:55:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:50.780 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:55:50 np0005541913.localdomain sudo[295869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:50 np0005541913.localdomain sudo[295869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:50 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.210 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:55:51 np0005541913.localdomain podman[295925]: 
Dec 02 09:55:51 np0005541913.localdomain podman[295925]: 2025-12-02 09:55:51.230457501 +0000 UTC m=+0.049929197 container create 9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_liskov, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, release=1763362218, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:55:51 np0005541913.localdomain systemd[1]: Started libpod-conmon-9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93.scope.
Dec 02 09:55:51 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:51 np0005541913.localdomain podman[295925]: 2025-12-02 09:55:51.281772884 +0000 UTC m=+0.101244610 container init 9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_liskov, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 02 09:55:51 np0005541913.localdomain podman[295925]: 2025-12-02 09:55:51.28909191 +0000 UTC m=+0.108563636 container start 9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_liskov, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=1763362218, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:55:51 np0005541913.localdomain podman[295925]: 2025-12-02 09:55:51.289400828 +0000 UTC m=+0.108872594 container attach 9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_liskov, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218)
Dec 02 09:55:51 np0005541913.localdomain hopeful_liskov[295942]: 167 167
Dec 02 09:55:51 np0005541913.localdomain systemd[1]: libpod-9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93.scope: Deactivated successfully.
Dec 02 09:55:51 np0005541913.localdomain podman[295925]: 2025-12-02 09:55:51.292911162 +0000 UTC m=+0.112382888 container died 9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_liskov, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, vcs-type=git, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, version=7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z)
Dec 02 09:55:51 np0005541913.localdomain podman[295925]: 2025-12-02 09:55:51.207035974 +0000 UTC m=+0.026507760 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:51 np0005541913.localdomain podman[295947]: 2025-12-02 09:55:51.370035926 +0000 UTC m=+0.071920005 container remove 9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_liskov, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.41.4, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, version=7, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:55:51 np0005541913.localdomain systemd[1]: libpod-conmon-9f6c936d2862ba2d1a3be9da3f5b74a95a35d36db3ca9f374e3cd246fcfeef93.scope: Deactivated successfully.
Dec 02 09:55:51 np0005541913.localdomain sudo[295869]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6ca21786009132d6c533cd4d1bb77f991c80d86d138baa779c46181907fdc0f3-merged.mount: Deactivated successfully.
Dec 02 09:55:51 np0005541913.localdomain sudo[295964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:55:51 np0005541913.localdomain sudo[295964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:51 np0005541913.localdomain sudo[295964]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.603 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.604 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:55:51 np0005541913.localdomain sudo[295982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:51 np0005541913.localdomain sudo[295982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.841 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.842 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11668MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.843 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.843 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:55:51 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:55:51 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:55:51 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.107:0/3750190611' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:55:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:55:51 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.943 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.945 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.945 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:55:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:51.998 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/9569723' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:52 np0005541913.localdomain podman[296019]: 
Dec 02 09:55:52 np0005541913.localdomain podman[296019]: 2025-12-02 09:55:52.101686674 +0000 UTC m=+0.067834127 container create a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_grothendieck, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, architecture=x86_64)
Dec 02 09:55:52 np0005541913.localdomain systemd[1]: Started libpod-conmon-a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4.scope.
Dec 02 09:55:52 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:55:52 np0005541913.localdomain podman[296019]: 2025-12-02 09:55:52.17108542 +0000 UTC m=+0.137232903 container init a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_grothendieck, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.41.4, RELEASE=main, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public)
Dec 02 09:55:52 np0005541913.localdomain podman[296019]: 2025-12-02 09:55:52.076633444 +0000 UTC m=+0.042780967 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:55:52 np0005541913.localdomain podman[296019]: 2025-12-02 09:55:52.180979616 +0000 UTC m=+0.147127089 container start a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_grothendieck, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, RELEASE=main, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:55:52 np0005541913.localdomain podman[296019]: 2025-12-02 09:55:52.181177351 +0000 UTC m=+0.147324824 container attach a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_grothendieck, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, release=1763362218, RELEASE=main, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z)
Dec 02 09:55:52 np0005541913.localdomain pedantic_grothendieck[296054]: 167 167
Dec 02 09:55:52 np0005541913.localdomain systemd[1]: libpod-a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4.scope: Deactivated successfully.
Dec 02 09:55:52 np0005541913.localdomain podman[296019]: 2025-12-02 09:55:52.185050064 +0000 UTC m=+0.151197507 container died a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_grothendieck, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, release=1763362218, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:55:52 np0005541913.localdomain podman[296059]: 2025-12-02 09:55:52.285732248 +0000 UTC m=+0.092056413 container remove a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_grothendieck, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:55:52 np0005541913.localdomain systemd[1]: libpod-conmon-a5246e21c4743b64fe0c4444cd005b219a7d35abbfff7de396b9f36931f71ad4.scope: Deactivated successfully.
Dec 02 09:55:52 np0005541913.localdomain sudo[295982]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:52.415 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3ea84b6acdc75ff1523e06148e5bb1bdea47824bbb621ca51e2223aa345cfe82-merged.mount: Deactivated successfully.
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1371164466' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:52.458 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:55:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:52.465 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:55:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:52.522 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:55:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:52.525 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:55:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:52.525 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/533854766' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541913 (monmap changed)...
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541913 on np0005541913.localdomain
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.108:0/9569723' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.107:0/1371164466' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:52 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.108:0/533854766' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:53.646 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:53 np0005541913.localdomain ceph-mon[289473]: Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:55:53 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:55:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:55:53 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: from='client.26836 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005541910.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: Added label _no_schedule to host np0005541910.localdomain
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005541910.localdomain
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.1 (monmap changed)...
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:55:54 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: Reconfiguring osd.4 (monmap changed)...
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: from='client.34313 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005541910.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.106:0/2376208400' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:55:55 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.106:0/3001664006' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:55:56 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:55:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:55:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:57.419 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:57 np0005541913.localdomain systemd[1]: tmp-crun.TVHEyU.mount: Deactivated successfully.
Dec 02 09:55:57 np0005541913.localdomain podman[296078]: 2025-12-02 09:55:57.461790072 +0000 UTC m=+0.096814131 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:55:57 np0005541913.localdomain podman[296078]: 2025-12-02 09:55:57.476243718 +0000 UTC m=+0.111267797 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true)
Dec 02 09:55:57 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:57 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:55:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:55:58.651 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:55:58 np0005541913.localdomain ceph-mon[289473]: Reconfiguring mon.np0005541914 (monmap changed)...
Dec 02 09:55:58 np0005541913.localdomain ceph-mon[289473]: Reconfiguring daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:55:58 np0005541913.localdomain ceph-mon[289473]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:55:59 np0005541913.localdomain sudo[296098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:55:59 np0005541913.localdomain sudo[296098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296098]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:55:59 np0005541913.localdomain sudo[296116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296116]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:55:59 np0005541913.localdomain sudo[296134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296134]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:55:59 np0005541913.localdomain sudo[296152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296152]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:55:59 np0005541913.localdomain sudo[296170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296170]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:55:59 np0005541913.localdomain sudo[296204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296204]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:55:59 np0005541913.localdomain sudo[296222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296222]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:55:59 np0005541913.localdomain sudo[296240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296240]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:55:59 np0005541913.localdomain sudo[296258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:55:59 np0005541913.localdomain sudo[296258]: pam_unix(sudo:session): session closed for user root
Dec 02 09:55:59 np0005541913.localdomain sudo[296276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:55:59 np0005541913.localdomain sudo[296276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:00 np0005541913.localdomain sudo[296276]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:00 np0005541913.localdomain sudo[296294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:56:00 np0005541913.localdomain sudo[296294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:00 np0005541913.localdomain sudo[296294]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: from='client.44243 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005541910.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541910.localdomain"} : dispatch
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541910.localdomain"} : dispatch
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541910.localdomain"}]': finished
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Removed host np0005541910.localdomain
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: executing refresh((['np0005541910.localdomain', 'np0005541911.localdomain', 'np0005541912.localdomain', 'np0005541913.localdomain', 'np0005541914.localdomain'],)) failed.
                                                           Traceback (most recent call last):
                                                             File "/usr/share/ceph/mgr/cephadm/utils.py", line 94, in do_work
                                                               return f(*arg)
                                                             File "/usr/share/ceph/mgr/cephadm/serve.py", line 317, in refresh
                                                               and not self.mgr.inventory.has_label(host, SpecialHostLabels.NO_MEMORY_AUTOTUNE)
                                                             File "/usr/share/ceph/mgr/cephadm/inventory.py", line 253, in has_label
                                                               host = self._get_stored_name(host)
                                                             File "/usr/share/ceph/mgr/cephadm/inventory.py", line 181, in _get_stored_name
                                                               self.assert_host(host)
                                                             File "/usr/share/ceph/mgr/cephadm/inventory.py", line 209, in assert_host
                                                               raise OrchestratorError('host %s does not exist' % host)
                                                           orchestrator._interface.OrchestratorError: host np0005541910.localdomain does not exist
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain ceph-mon[289473]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain sudo[296312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:00 np0005541913.localdomain sudo[296312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:00 np0005541913.localdomain sudo[296312]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:00 np0005541913.localdomain sudo[296330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:56:00 np0005541913.localdomain sudo[296330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:00 np0005541913.localdomain sudo[296330]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:00 np0005541913.localdomain sudo[296364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:56:00 np0005541913.localdomain sudo[296364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:00 np0005541913.localdomain sudo[296364]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:00 np0005541913.localdomain sudo[296382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:56:00 np0005541913.localdomain sudo[296382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:00 np0005541913.localdomain sudo[296382]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:00 np0005541913.localdomain sudo[296400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:56:00 np0005541913.localdomain sudo[296400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:00 np0005541913.localdomain sudo[296400]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:01 np0005541913.localdomain sshd[296418]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:56:01 np0005541913.localdomain sshd[296418]: Accepted publickey for tripleo-admin from 192.168.122.11 port 46514 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 09:56:01 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 02 09:56:01 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 02 09:56:01 np0005541913.localdomain systemd-logind[757]: New session 67 of user tripleo-admin.
Dec 02 09:56:01 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 02 09:56:01 np0005541913.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Queued start job for default target Main User Target.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Created slice User Application Slice.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Reached target Paths.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Reached target Timers.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Starting D-Bus User Message Bus Socket...
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Starting Create User's Volatile Files and Directories...
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Listening on D-Bus User Message Bus Socket.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Reached target Sockets.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Finished Create User's Volatile Files and Directories.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Reached target Basic System.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Reached target Main User Target.
Dec 02 09:56:01 np0005541913.localdomain systemd[296422]: Startup finished in 168ms.
Dec 02 09:56:01 np0005541913.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 02 09:56:01 np0005541913.localdomain systemd[1]: Started Session 67 of User tripleo-admin.
Dec 02 09:56:01 np0005541913.localdomain sshd[296418]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 02 09:56:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:01 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:01 np0005541913.localdomain sudo[296512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:56:01 np0005541913.localdomain sudo[296512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:01 np0005541913.localdomain sudo[296512]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:02 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:56:02 np0005541913.localdomain sudo[296581]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nazkltakliftyzurfxudiqdveignfrxk ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764669361.6757183-61137-44228183486880/AnsiballZ_lineinfile.py
Dec 02 09:56:02 np0005541913.localdomain sudo[296581]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 09:56:02 np0005541913.localdomain python3[296583]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.104/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 09:56:02 np0005541913.localdomain sudo[296581]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:02.421 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:02 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:02 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:56:02 np0005541913.localdomain ceph-mon[289473]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:02 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:02 np0005541913.localdomain sudo[296727]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmsdudbbrajdpjsxwgxyacuumlepjzza ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764669362.5209513-61153-33451419170674/AnsiballZ_command.py
Dec 02 09:56:02 np0005541913.localdomain sudo[296727]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 09:56:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:56:03.041 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:56:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:56:03.041 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:56:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:56:03.042 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:56:03 np0005541913.localdomain python3[296729]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.104/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:56:03 np0005541913.localdomain sudo[296727]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:03.652 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:03 np0005541913.localdomain sudo[296872]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msqshxqkvyghcvtdioyviyzcorqivvzo ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764669363.3123255-61164-51224519413854/AnsiballZ_command.py
Dec 02 09:56:03 np0005541913.localdomain sudo[296872]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 02 09:56:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:56:03 np0005541913.localdomain podman[296875]: 2025-12-02 09:56:03.864681135 +0000 UTC m=+0.096609397 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:56:03 np0005541913.localdomain podman[296875]: 2025-12-02 09:56:03.871890567 +0000 UTC m=+0.103818819 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 09:56:03 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:56:03 np0005541913.localdomain python3[296874]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.104 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 09:56:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:56:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:56:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:56:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:56:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:56:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:56:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:56:04 np0005541913.localdomain ceph-mon[289473]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:04 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.32:0/727099599' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 09:56:04 np0005541913.localdomain ceph-mon[289473]: from='client.? 172.18.0.32:0/727099599' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 09:56:05 np0005541913.localdomain sudo[296872]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:56:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:56:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:56:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:56:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:56:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18714 "" "Go-http-client/1.1"
Dec 02 09:56:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:56:06 np0005541913.localdomain podman[296910]: 2025-12-02 09:56:06.445242728 +0000 UTC m=+0.081772029 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64)
Dec 02 09:56:06 np0005541913.localdomain podman[296910]: 2025-12-02 09:56:06.455645597 +0000 UTC m=+0.092174908 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Dec 02 09:56:06 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:56:06 np0005541913.localdomain ceph-mon[289473]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:07 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:56:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:07.423 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:07 np0005541913.localdomain sudo[296930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:56:07 np0005541913.localdomain sudo[296930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:56:07 np0005541913.localdomain sudo[296930]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:07 np0005541913.localdomain podman[296948]: 2025-12-02 09:56:07.634682275 +0000 UTC m=+0.084218204 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:56:07 np0005541913.localdomain podman[296948]: 2025-12-02 09:56:07.642790632 +0000 UTC m=+0.092326581 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 09:56:07 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:56:08 np0005541913.localdomain ceph-mon[289473]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:56:08 np0005541913.localdomain ceph-mon[289473]: Saving service mon spec with placement label:mon
Dec 02 09:56:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:56:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:08 np0005541913.localdomain ceph-mon[289473]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:56:08 np0005541913.localdomain ceph-mon[289473]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:08.654 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:09 np0005541913.localdomain ceph-mon[289473]: from='client.34329 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541913", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:56:09 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x56450d29a160 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Dec 02 09:56:09 np0005541913.localdomain ceph-mon[289473]: mon.np0005541913@2(peon) e11  removed from monmap, suicide.
Dec 02 09:56:09 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x564503730f20 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Dec 02 09:56:09 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x56450d29a000 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Dec 02 09:56:09 np0005541913.localdomain sudo[296970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:09 np0005541913.localdomain sudo[296970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:09 np0005541913.localdomain sudo[296970]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:09 np0005541913.localdomain podman[296986]: 2025-12-02 09:56:09.959415822 +0000 UTC m=+0.061063685 container died 36af0ed2ef00d05ae4aad6f924c8b496242c4f2361918e4bd57717905928e70b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mon-np0005541913, release=1763362218, io.openshift.expose-services=, vcs-type=git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4)
Dec 02 09:56:09 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-393e1e54e92e7ce105bdb9ae967dcd71a5af0f60b460340c2a56d8deb0a84a42-merged.mount: Deactivated successfully.
Dec 02 09:56:10 np0005541913.localdomain sudo[297000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 --name mon.np0005541913 --force
Dec 02 09:56:10 np0005541913.localdomain podman[296986]: 2025-12-02 09:56:10.002336932 +0000 UTC m=+0.103984765 container remove 36af0ed2ef00d05ae4aad6f924c8b496242c4f2361918e4bd57717905928e70b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mon-np0005541913, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:56:10 np0005541913.localdomain sudo[297000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:10 np0005541913.localdomain systemd[1]: ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074@mon.np0005541913.service: Deactivated successfully.
Dec 02 09:56:10 np0005541913.localdomain systemd[1]: Stopped Ceph mon.np0005541913 for c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 09:56:10 np0005541913.localdomain systemd[1]: ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074@mon.np0005541913.service: Consumed 7.908s CPU time.
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:56:11 np0005541913.localdomain systemd-sysv-generator[297146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:56:11 np0005541913.localdomain systemd-rc-local-generator[297143]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:11 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:11 np0005541913.localdomain sudo[297000]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:11 np0005541913.localdomain sudo[297158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:11 np0005541913.localdomain sudo[297158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:11 np0005541913.localdomain sudo[297158]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:11 np0005541913.localdomain sudo[297176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:56:11 np0005541913.localdomain sudo[297176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:12 np0005541913.localdomain systemd[1]: tmp-crun.eZLR8U.mount: Deactivated successfully.
Dec 02 09:56:12 np0005541913.localdomain podman[297266]: 2025-12-02 09:56:12.371795384 +0000 UTC m=+0.096802411 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:56:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:12.426 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:12 np0005541913.localdomain podman[297266]: 2025-12-02 09:56:12.473919907 +0000 UTC m=+0.198926874 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, architecture=x86_64, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:56:12 np0005541913.localdomain sudo[297176]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:13 np0005541913.localdomain sudo[297369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:13 np0005541913.localdomain sudo[297369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:13 np0005541913.localdomain sudo[297369]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:13 np0005541913.localdomain sudo[297387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:56:13 np0005541913.localdomain sudo[297387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:13.658 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:13 np0005541913.localdomain sudo[297387]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:13 np0005541913.localdomain sudo[297437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:56:13 np0005541913.localdomain sudo[297437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:56:13 np0005541913.localdomain sudo[297437]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:13 np0005541913.localdomain sudo[297456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:56:13 np0005541913.localdomain sudo[297456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:13 np0005541913.localdomain sudo[297456]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain podman[297454]: 2025-12-02 09:56:14.007800442 +0000 UTC m=+0.081901183 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 09:56:14 np0005541913.localdomain podman[297454]: 2025-12-02 09:56:14.021987971 +0000 UTC m=+0.096088712 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:56:14 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:56:14 np0005541913.localdomain sudo[297490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:56:14 np0005541913.localdomain sudo[297490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297490]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:14 np0005541913.localdomain sudo[297510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297510]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:56:14 np0005541913.localdomain sudo[297528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297528]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:56:14 np0005541913.localdomain sudo[297562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297562]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:56:14 np0005541913.localdomain sudo[297580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297580]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:56:14 np0005541913.localdomain sudo[297598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297598]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:56:14 np0005541913.localdomain sudo[297616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297616]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:56:14 np0005541913.localdomain sudo[297634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297634]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:56:14 np0005541913.localdomain sudo[297652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297652]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:14 np0005541913.localdomain sudo[297670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297670]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:14 np0005541913.localdomain sudo[297688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:56:14 np0005541913.localdomain sudo[297688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:14 np0005541913.localdomain sudo[297688]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:15 np0005541913.localdomain sudo[297722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:56:15 np0005541913.localdomain sudo[297722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:15 np0005541913.localdomain sudo[297722]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:15 np0005541913.localdomain sudo[297740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:56:15 np0005541913.localdomain sudo[297740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:15 np0005541913.localdomain sudo[297740]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:15 np0005541913.localdomain sudo[297758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:56:15 np0005541913.localdomain sudo[297758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:15 np0005541913.localdomain sudo[297758]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:15 np0005541913.localdomain sudo[297776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:56:15 np0005541913.localdomain sudo[297776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:56:15 np0005541913.localdomain sudo[297776]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:15 np0005541913.localdomain podman[297794]: 2025-12-02 09:56:15.594514071 +0000 UTC m=+0.083432994 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:56:15 np0005541913.localdomain podman[297794]: 2025-12-02 09:56:15.60946186 +0000 UTC m=+0.098380743 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:56:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:56:15 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:56:15 np0005541913.localdomain podman[297815]: 2025-12-02 09:56:15.686820641 +0000 UTC m=+0.068974617 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 02 09:56:15 np0005541913.localdomain podman[297815]: 2025-12-02 09:56:15.724108128 +0000 UTC m=+0.106262114 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 02 09:56:15 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.104 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.110 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff6ddbda-d180-4d50-90f7-76610ef11666', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.105835', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2498a68e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': '75dfc9337d093e5e540339d21b7fe077a6faefad276a00236c23b7f068a02f76'}]}, 'timestamp': '2025-12-02 09:56:16.111980', '_unique_id': '239212fa2ef04bbbbc37b733c15c71aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.113 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.116 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df4ca96e-b533-4dd7-b15c-d16fca21659c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.116062', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '24995f98-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': 'dabe46f08dbb3d3977c8c4a82e1d8c2f5b72876b0eb3adb8dbbe7c5b4474156c'}]}, 'timestamp': '2025-12-02 09:56:16.116562', '_unique_id': '8b5bd022952d4e7aaebe79a7e4990d7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.117 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.119 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c33eb1e5-2241-4562-b4a6-7870a8fdfd7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.119071', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '2499da7c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': '7e35bc6683109ff7237d7eff8695b48be71b4c593a732846248e257744ad84ef'}]}, 'timestamp': '2025-12-02 09:56:16.119785', '_unique_id': '1e30485ad9864879a99d8c31cdc1a068'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.122 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.141 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a68cb9f-745b-41b9-91ef-6ff07384f202', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:56:16.122244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '249d4112-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.359916731, 'message_signature': '47a987eb7fd220ab3dde9b6cb8b711c2176adfd73f0c23d34e14cffd81edbd81'}]}, 'timestamp': '2025-12-02 09:56:16.142134', '_unique_id': '1c2464c7da5f41939c1536e4a02a259d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.143 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.145 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da4ffd42-5cd3-4f91-a944-b5cfaa5fa0e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.145672', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '249de7ca-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': '1e023951a1d0a8d7a7576ec703e10ea34b27bd0f125fbafea11c0429d4246f60'}]}, 'timestamp': '2025-12-02 09:56:16.146274', '_unique_id': 'b3ceb74d08754ab782c285369c12e1d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.147 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.159 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.159 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78104aa0-d4e9-4333-aae9-6554e6316e26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.148794', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '249ff7a4-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.367885334, 'message_signature': '41db1114f02c7cf4e35cc664b72057ec5c91549b70fa942a5f47449d154539a6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.148794', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a00dac-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.367885334, 'message_signature': 'd790dc85fc7a3933e875ef976d521faf99375f22d5da727dce99e1ec27e8c04a'}]}, 'timestamp': '2025-12-02 09:56:16.160320', '_unique_id': 'b23344de8e7a436da317c4a28fc283d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.189 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbfa26ae-36a5-49fc-89ba-73b733bc7a19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.162872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a48422-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': 'd9799851c83fcb90293cd3f6f8a8af3d51b41248d4f97f70f8a0f24d4c23b815'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.162872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a49ec6-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': 'c212d9a5164c936a47710f9e0488262f6dbbfafb0190a896e8b9fccef2127856'}]}, 'timestamp': '2025-12-02 09:56:16.190258', '_unique_id': '066cfc363dbe4e8a80b624afddb78343'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.193 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3398ed49-71bd-477a-aaca-6842ea5fe52c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.193519', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '24a5337c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': '7806b2265bad9f9ac63c72e052993f37efacd6d19d4aafaf2952f08f9800656c'}]}, 'timestamp': '2025-12-02 09:56:16.194160', '_unique_id': 'cd78ad54ef964f32a012556a7a055108'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.196 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.196 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 13970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c109c1fc-0834-4659-bc90-854224f16616', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13970000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:56:16.196698', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '24a5ad34-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.359916731, 'message_signature': '117c39cb934da7a128558b2e14be14730e79988804718058e16c2ab094083f18'}]}, 'timestamp': '2025-12-02 09:56:16.197175', '_unique_id': '6c154b499a3045bcbf1582aadce042e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62c2f4e0-cc67-4fd5-b1d0-5dd9bf6f87b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.199495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a62106-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.367885334, 'message_signature': '175d2048e1df649d7bb0317f8fb3f460fa6080d25ab009880fa0277473f4778b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.199495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a6339e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.367885334, 'message_signature': 'b50f06ef406c5b5ea7a7753ba438f41888edaeb81c5b9f0048ca35c33f77b05c'}]}, 'timestamp': '2025-12-02 09:56:16.200601', '_unique_id': '5102ba27df224e0faa72b296e777d68b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c50ff6b1-ea7f-4959-bb44-454510e753c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.203211', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a6ab76-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': '6399161b062cdfb77f97a41b8f07dce26680c3a66a80e0686820535cebf99555'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.203211', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a6befe-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': '21f297abb24677ac7ec9a97923aea32633a45e414b3a2cf2aa3c16b885fab445'}]}, 'timestamp': '2025-12-02 09:56:16.204176', '_unique_id': 'bb55e25a51b14413a4413f0843071b4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49dcb828-b3d6-4d8c-83fe-a376c010713a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.206513', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '24a72e66-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': '2bf9ffd6db95a997353190697b61c77e0df931d42b09182be76e9523d1c9f151'}]}, 'timestamp': '2025-12-02 09:56:16.207041', '_unique_id': 'd6a0cbb362614a9e980fe71ccbcc17cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.208 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '382d2207-d63b-439d-9097-654daa0ae028', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.209377', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a79d6a-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': 'a3957cdc3ced3074cde3054f01212f36ec954b9e80973198329448d2cdac3c8e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.209377', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a7b354-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': '0abd72a7667db11e7c58a11b91eaad9adb8c7ebba025baa42e105112fe15907c'}]}, 'timestamp': '2025-12-02 09:56:16.210425', '_unique_id': 'd151a8aa90fb48b7a9ee662e122d2db0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.211 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ede161ac-becf-4d5b-96a4-2b0882bc252f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.213517', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '24a84238-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': 'e3b4d0e8cf08bf26e33fa6db4e62d1a5934533028c72301af89500bb1ba5827b'}]}, 'timestamp': '2025-12-02 09:56:16.214180', '_unique_id': 'f8fc11d8cf8f47539d08bcbbc73de395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b07e3e8-8a8b-4f7a-92b8-62545acb604e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.217345', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '24a8d81a-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': 'c54593617e1c8f88bb8aa6c37c5c15be9c923ecfece4735a6bfac741bd43b0e7'}]}, 'timestamp': '2025-12-02 09:56:16.218034', '_unique_id': '26aa26df08a74d1795022acd3a54c6f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.221 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.221 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfab355b-d56d-4ab8-a002-93efd4858345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.221075', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a967b2-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': '0ac92561ad86dacc4c38d9b4700dba7c2de69b85b9739c8758ffa32814e342d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.221075', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a98166-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': '8f0502602d59c266f7955922037ac2c38f0bb82c9082ed8cb7ccaa97e3ee9d2c'}]}, 'timestamp': '2025-12-02 09:56:16.222330', '_unique_id': '1221f6e53a694d6cbc1c6428c13c2db6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.224 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6c699ed-e0a6-433d-b9ba-8d3c77a0fb51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.224920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a9f916-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': 'e954b2f34758c10bd0df3ccb6c87e44943a9339d524541127dfe6780dda24adf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.224920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24aa06a4-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': '4b2f7803d6272b98c57b917d995bc626839443a0f4709980e86a75e2813899f8'}]}, 'timestamp': '2025-12-02 09:56:16.225646', '_unique_id': 'b55b8b7d790e45f5b05dc2efb603515f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aebaa78a-2206-4f5d-9e07-77a187ef1d55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.227341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24aa5780-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.367885334, 'message_signature': '35a3ce4f346656903e3e477451c7b9af2471c34989870df659f8aed7e257dbfd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.227341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24aa6a0e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.367885334, 'message_signature': '247f3a40caca0cfbb4945a5e1389718ae970d4bf80b76438d1a536a449d6c337'}]}, 'timestamp': '2025-12-02 09:56:16.228162', '_unique_id': 'dc9dd2bc50f74e7db0ac378fe87a1050'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.230 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.230 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.231 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe89c98c-2b92-4be5-9010-0a87bd849797', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:56:16.230824', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24aae02e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': 'b49bdbed57486dc37afc724ace815758d25b762e206c5f41878940d5fc2ee2ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:56:16.230824', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24aaee48-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.381964141, 'message_signature': '20dd9fb5dc11fac72a8cb43819ea7e4c775daba2f3116b324f8b3c247b207741'}]}, 'timestamp': '2025-12-02 09:56:16.231555', '_unique_id': '63a5a856eee84ff798bbf693237b613f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.232 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.233 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a46ca6ec-2ffc-4d14-b6b4-f80f1d33d903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.233573', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '24ab4d52-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': '9596f2ed0220f5d12958e26c80a547d8c5157ef670dc387f1c5cbb42c5c30caa'}]}, 'timestamp': '2025-12-02 09:56:16.234022', '_unique_id': '96422d91a47a45b49e4f1c43842c5b9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.234 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.235 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cb40f12-92d5-407c-ad4d-0b2f61659463', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:56:16.235763', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '24aba194-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11538.324931424, 'message_signature': 'b74de5da6fefd954ae2ebdee7204bfac1fc7063b37253531e5f044b2e7a905fa'}]}, 'timestamp': '2025-12-02 09:56:16.236156', '_unique_id': '5540f3e4b2ad42cf9bc7af8316f1ff21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:56:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:56:16.237 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:56:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:17.429 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:18.661 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:22.432 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:22 np0005541913.localdomain sudo[297840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:22 np0005541913.localdomain sudo[297840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:22 np0005541913.localdomain sudo[297840]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:22 np0005541913.localdomain sudo[297858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:22 np0005541913.localdomain sudo[297858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:22 np0005541913.localdomain sudo[297876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:22 np0005541913.localdomain sudo[297876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:22 np0005541913.localdomain sudo[297876]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:22 np0005541913.localdomain sudo[297905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:22 np0005541913.localdomain sudo[297905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:23 np0005541913.localdomain podman[297926]: 
Dec 02 09:56:23 np0005541913.localdomain podman[297926]: 2025-12-02 09:56:23.056569646 +0000 UTC m=+0.077531606 container create 03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_montalcini, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, architecture=x86_64)
Dec 02 09:56:23 np0005541913.localdomain systemd[1]: Started libpod-conmon-03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1.scope.
Dec 02 09:56:23 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:23 np0005541913.localdomain podman[297926]: 2025-12-02 09:56:23.021374244 +0000 UTC m=+0.042336234 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:23 np0005541913.localdomain podman[297926]: 2025-12-02 09:56:23.131982253 +0000 UTC m=+0.152944223 container init 03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_montalcini, io.buildah.version=1.41.4, RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph)
Dec 02 09:56:23 np0005541913.localdomain podman[297926]: 2025-12-02 09:56:23.143197933 +0000 UTC m=+0.164159863 container start 03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_montalcini, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.41.4, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=1763362218, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:56:23 np0005541913.localdomain podman[297926]: 2025-12-02 09:56:23.143374699 +0000 UTC m=+0.164336729 container attach 03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_montalcini, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, architecture=x86_64, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, version=7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 02 09:56:23 np0005541913.localdomain happy_montalcini[297942]: 167 167
Dec 02 09:56:23 np0005541913.localdomain systemd[1]: libpod-03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1.scope: Deactivated successfully.
Dec 02 09:56:23 np0005541913.localdomain podman[297926]: 2025-12-02 09:56:23.149288187 +0000 UTC m=+0.170250167 container died 03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_montalcini, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, architecture=x86_64, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:56:23 np0005541913.localdomain podman[297947]: 2025-12-02 09:56:23.368637416 +0000 UTC m=+0.211871030 container remove 03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_montalcini, description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, distribution-scope=public, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_CLEAN=True, RELEASE=main)
Dec 02 09:56:23 np0005541913.localdomain systemd[1]: libpod-conmon-03264424b054028998cb34a7a2a526187fac923fd9b68fd02d20b07e0cf1c1f1.scope: Deactivated successfully.
Dec 02 09:56:23 np0005541913.localdomain sudo[297858]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:23 np0005541913.localdomain sudo[297999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:23 np0005541913.localdomain sudo[297999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:23 np0005541913.localdomain sudo[297999]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:23.664 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:23 np0005541913.localdomain podman[298021]: 
Dec 02 09:56:23 np0005541913.localdomain sudo[298029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:23 np0005541913.localdomain sudo[298029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:23 np0005541913.localdomain podman[298021]: 2025-12-02 09:56:23.680157742 +0000 UTC m=+0.081203024 container create f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_napier, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:56:23 np0005541913.localdomain systemd[1]: Started libpod-conmon-f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558.scope.
Dec 02 09:56:23 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:23 np0005541913.localdomain podman[298021]: 2025-12-02 09:56:23.645481184 +0000 UTC m=+0.046526516 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:23 np0005541913.localdomain podman[298021]: 2025-12-02 09:56:23.750652288 +0000 UTC m=+0.151697580 container init f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_napier, distribution-scope=public, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Dec 02 09:56:23 np0005541913.localdomain podman[298021]: 2025-12-02 09:56:23.760122302 +0000 UTC m=+0.161167594 container start f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_napier, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-type=git, ceph=True)
Dec 02 09:56:23 np0005541913.localdomain podman[298021]: 2025-12-02 09:56:23.760313697 +0000 UTC m=+0.161358979 container attach f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_napier, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, release=1763362218, ceph=True)
Dec 02 09:56:23 np0005541913.localdomain pensive_napier[298055]: 167 167
Dec 02 09:56:23 np0005541913.localdomain systemd[1]: libpod-f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558.scope: Deactivated successfully.
Dec 02 09:56:23 np0005541913.localdomain podman[298021]: 2025-12-02 09:56:23.765156146 +0000 UTC m=+0.166201498 container died f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_napier, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:56:24 np0005541913.localdomain podman[298060]: 2025-12-02 09:56:24.030180998 +0000 UTC m=+0.253192686 container remove f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_napier, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, version=7, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, GIT_CLEAN=True, ceph=True, architecture=x86_64, io.openshift.expose-services=)
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: libpod-conmon-f4ef60eb22415719a1c970c386f62f3626afe9c9f0046e50f65727dd53fbc558.scope: Deactivated successfully.
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: tmp-crun.hyP7RN.mount: Deactivated successfully.
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-2d5ea9eba9b610c177964979d11c3c21eb2827b38812ec17a33d13c56a655531-merged.mount: Deactivated successfully.
Dec 02 09:56:24 np0005541913.localdomain podman[298091]: 
Dec 02 09:56:24 np0005541913.localdomain podman[298091]: 2025-12-02 09:56:24.142722679 +0000 UTC m=+0.078504171 container create d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_hofstadter, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7)
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: Started libpod-conmon-d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77.scope.
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:24 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83c163f36118e059d7cbe6fb224a42b05d9f767faa6bde3af4732b16364ddc82/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 02 09:56:24 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83c163f36118e059d7cbe6fb224a42b05d9f767faa6bde3af4732b16364ddc82/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec 02 09:56:24 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83c163f36118e059d7cbe6fb224a42b05d9f767faa6bde3af4732b16364ddc82/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 09:56:24 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83c163f36118e059d7cbe6fb224a42b05d9f767faa6bde3af4732b16364ddc82/merged/var/lib/ceph/mon/ceph-np0005541913 supports timestamps until 2038 (0x7fffffff)
Dec 02 09:56:24 np0005541913.localdomain podman[298091]: 2025-12-02 09:56:24.205698025 +0000 UTC m=+0.141479517 container init d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_hofstadter, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, RELEASE=main, CEPH_POINT_RELEASE=, name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, release=1763362218, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:56:24 np0005541913.localdomain podman[298091]: 2025-12-02 09:56:24.108383931 +0000 UTC m=+0.044165473 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:24 np0005541913.localdomain podman[298091]: 2025-12-02 09:56:24.222273579 +0000 UTC m=+0.158055071 container start d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_hofstadter, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, version=7)
Dec 02 09:56:24 np0005541913.localdomain podman[298091]: 2025-12-02 09:56:24.222565066 +0000 UTC m=+0.158346598 container attach d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_hofstadter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vcs-type=git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_CLEAN=True, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, distribution-scope=public, version=7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: libpod-d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77.scope: Deactivated successfully.
Dec 02 09:56:24 np0005541913.localdomain podman[298091]: 2025-12-02 09:56:24.321402661 +0000 UTC m=+0.257184203 container died d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_hofstadter, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64)
Dec 02 09:56:24 np0005541913.localdomain podman[298132]: 2025-12-02 09:56:24.475757371 +0000 UTC m=+0.143534681 container remove d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_hofstadter, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, release=1763362218, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: libpod-conmon-d2fa5ff358638910ceab21ed19ba0a0f387a4d8652b339ea1aa3bc6a8029da77.scope: Deactivated successfully.
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:56:24 np0005541913.localdomain systemd-rc-local-generator[298170]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:56:24 np0005541913.localdomain systemd-sysv-generator[298173]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: tmp-crun.Ktvj4e.mount: Deactivated successfully.
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-83c163f36118e059d7cbe6fb224a42b05d9f767faa6bde3af4732b16364ddc82-merged.mount: Deactivated successfully.
Dec 02 09:56:24 np0005541913.localdomain systemd[1]: Reloading.
Dec 02 09:56:25 np0005541913.localdomain systemd-rc-local-generator[298213]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 09:56:25 np0005541913.localdomain systemd-sysv-generator[298216]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: Starting Ceph mon.np0005541913 for c7c8e171-a193-56fb-95fa-8879fcfa7074...
Dec 02 09:56:25 np0005541913.localdomain podman[298278]: 
Dec 02 09:56:25 np0005541913.localdomain podman[298278]: 2025-12-02 09:56:25.595791692 +0000 UTC m=+0.083786533 container create f33ff84df5750a140229ed8a5fedf56b5e50bccee283e663eaa312fe015d114c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mon-np0005541913, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1763362218, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7)
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: tmp-crun.1c4J9E.mount: Deactivated successfully.
Dec 02 09:56:25 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e43dfa7ef56d13eb080a91a7a700935be6bbfa9e6ab42fe13cf1e9a8a1b0d8d0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 09:56:25 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e43dfa7ef56d13eb080a91a7a700935be6bbfa9e6ab42fe13cf1e9a8a1b0d8d0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 09:56:25 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e43dfa7ef56d13eb080a91a7a700935be6bbfa9e6ab42fe13cf1e9a8a1b0d8d0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 09:56:25 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e43dfa7ef56d13eb080a91a7a700935be6bbfa9e6ab42fe13cf1e9a8a1b0d8d0/merged/var/lib/ceph/mon/ceph-np0005541913 supports timestamps until 2038 (0x7fffffff)
Dec 02 09:56:25 np0005541913.localdomain podman[298278]: 2025-12-02 09:56:25.556788058 +0000 UTC m=+0.044782929 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:25 np0005541913.localdomain podman[298278]: 2025-12-02 09:56:25.661737526 +0000 UTC m=+0.149732367 container init f33ff84df5750a140229ed8a5fedf56b5e50bccee283e663eaa312fe015d114c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mon-np0005541913, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, architecture=x86_64, io.openshift.expose-services=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph)
Dec 02 09:56:25 np0005541913.localdomain podman[298278]: 2025-12-02 09:56:25.670858791 +0000 UTC m=+0.158853632 container start f33ff84df5750a140229ed8a5fedf56b5e50bccee283e663eaa312fe015d114c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mon-np0005541913, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 02 09:56:25 np0005541913.localdomain bash[298278]: f33ff84df5750a140229ed8a5fedf56b5e50bccee283e663eaa312fe015d114c
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: Started Ceph mon.np0005541913 for c7c8e171-a193-56fb-95fa-8879fcfa7074.
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: set uid:gid to 167:167 (ceph:ceph)
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: pidfile_write: ignore empty --pid-file
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: load: jerasure load: lrc 
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: RocksDB version: 7.9.2
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Git sha 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: DB SUMMARY
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: DB Session ID:  7NRXCK2K9UGWEPQBYWTV
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: CURRENT file:  CURRENT
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: IDENTITY file:  IDENTITY
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005541913/store.db dir, Total Num: 0, files: 
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005541913/store.db: 000004.log size: 761 ; 
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                         Options.error_if_exists: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                       Options.create_if_missing: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                         Options.paranoid_checks: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                                     Options.env: 0x5631822d99e0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                                Options.info_log: 0x563183c4ad20
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.max_file_opening_threads: 16
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                              Options.statistics: (nil)
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                               Options.use_fsync: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                       Options.max_log_file_size: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                         Options.allow_fallocate: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                        Options.use_direct_reads: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:          Options.create_missing_column_families: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                              Options.db_log_dir: 
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                                 Options.wal_dir: 
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                   Options.advise_random_on_open: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                    Options.write_buffer_manager: 0x563183c5b540
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                            Options.rate_limiter: (nil)
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.unordered_write: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                               Options.row_cache: None
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                              Options.wal_filter: None
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.allow_ingest_behind: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.two_write_queues: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.manual_wal_flush: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.wal_compression: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.atomic_flush: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                 Options.log_readahead_size: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.allow_data_in_errors: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.db_host_id: __hostname__
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.max_background_jobs: 2
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.max_background_compactions: -1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.max_subcompactions: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.max_total_wal_size: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                          Options.max_open_files: -1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                          Options.bytes_per_sync: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:       Options.compaction_readahead_size: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.max_background_flushes: -1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Compression algorithms supported:
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         kZSTD supported: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         kXpressCompression supported: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         kBZip2Compression supported: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         kLZ4Compression supported: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         kZlibCompression supported: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         kLZ4HCCompression supported: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         kSnappyCompression supported: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005541913/store.db/MANIFEST-000005
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:           Options.merge_operator: 
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:        Options.compaction_filter: None
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:        Options.compaction_filter_factory: None
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:  Options.sst_partitioner_factory: None
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x563183c4a980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x563183c47350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:        Options.write_buffer_size: 33554432
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:  Options.max_write_buffer_number: 2
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:          Options.compression: NoCompression
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:       Options.prefix_extractor: nullptr
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.num_levels: 7
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.compression_opts.level: 32767
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:               Options.compression_opts.strategy: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                  Options.compression_opts.enabled: false
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                        Options.arena_block_size: 1048576
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.disable_auto_compactions: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                   Options.table_properties_collectors: 
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                   Options.inplace_update_support: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                           Options.bloom_locality: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                    Options.max_successive_merges: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.paranoid_file_checks: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.force_consistency_checks: 1
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.report_bg_io_stats: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                               Options.ttl: 2592000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                       Options.enable_blob_files: false
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                           Options.min_blob_size: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                          Options.blob_file_size: 268435456
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb:                Options.blob_file_starting_level: 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005541913/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2b5a5119-a77e-4ac2-8a7c-136bbfa56c89
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669385721329, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669385740724, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669385740874, "job": 1, "event": "recovery_finished"}
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x563183c6ee00
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: DB pointer 0x563183d64000
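The RocksDB open sequence above emits machine-readable `EVENT_LOG_v1` lines whose payload after the marker is plain JSON (`recovery_started`, `table_file_creation`, `recovery_finished`). A minimal sketch for pulling those events out of journal text — the sample lines are trimmed copies of the entries above; in practice you would feed it `journalctl` output (an assumption, not something this log prescribes):

```python
import json

# Trimmed copies of the EVENT_LOG_v1 lines recorded in this journal.
LINES = [
    'rocksdb: EVENT_LOG_v1 {"time_micros": 1764669385721329, "job": 1, '
    '"event": "recovery_started", "wal_files": [4]}',
    'rocksdb: EVENT_LOG_v1 {"time_micros": 1764669385740874, "job": 1, '
    '"event": "recovery_finished"}',
]

MARKER = "EVENT_LOG_v1 "

def parse_events(lines):
    """Extract the JSON payload following each EVENT_LOG_v1 marker."""
    events = []
    for line in lines:
        idx = line.find(MARKER)
        if idx == -1:
            continue  # not an event line
        events.append(json.loads(line[idx + len(MARKER):]))
    return events

events = parse_events(LINES)
for ev in events:
    print(ev["event"], ev["time_micros"])

# WAL recovery wall time for job 1, in microseconds
duration_us = events[-1]["time_micros"] - events[0]["time_micros"]
```

For this boot, the two timestamps bound the WAL recovery of log #4 at roughly 19.5 ms.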
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913 does not exist in monmap, will attempt to join an existing cluster
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.019       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.019       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.019       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.019       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.06 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.06 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x563183c47350#2 capacity: 512.00 MB usage: 1.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.9e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1,0.95 KB,0.000181794%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
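The `DUMPING STATS` block above is fixed-format text rather than JSON. A sketch of parsing one of its lines, the cumulative write counters, under the assumption that the counters stay small enough not to be humanized (RocksDB prints suffixed values like `12K` for large counts, which this regex would not match):

```python
import re

# The "Cumulative writes" line from the DB Stats dump above.
LINE = ("Cumulative writes: 0 writes, 0 keys, 0 commit groups, "
        "0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s")

STATS_RE = re.compile(
    r"Cumulative writes: (\d+) writes, (\d+) keys, (\d+) commit groups, "
    r"([\d.]+) writes per commit group, ingest: ([\d.]+) GB, ([\d.]+) MB/s")

def parse_cumulative_writes(line):
    """Return the write counters as a dict, or None if the line differs."""
    m = STATS_RE.search(line)
    if not m:
        return None
    writes, keys, groups = (int(x) for x in m.groups()[:3])
    per_group, ingest_gb, mb_per_s = (float(x) for x in m.groups()[3:])
    return {"writes": writes, "keys": keys, "commit_groups": groups,
            "writes_per_group": per_group, "ingest_gb": ingest_gb,
            "mb_per_s": mb_per_s}

stats = parse_cumulative_writes(LINE)
print(stats)
```

All counters are zero here, consistent with a monitor that has just opened a freshly created store and not yet joined the quorum.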
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: using public_addr v2:172.18.0.104:0/0 -> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: starting mon.np0005541913 rank -1 at public addrs [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] at bind addrs [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005541913 fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(???) e0 preinit fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing) e11 sync_obtain_latest_monmap
Dec 02 09:56:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing) e11 sync_obtain_latest_monmap obtained monmap e11
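The `starting mon.np0005541913` line above carries the monitor's address vectors in Ceph's `v2:ip:port/nonce` notation (msgr2 on 3300, legacy v1 on 6789). A sketch for extracting them, assuming this line format is stable across releases (it is not a documented interface):

```python
import re

# The "starting mon" line as recorded in this journal.
LINE = ("starting mon.np0005541913 rank -1 at public addrs "
        "[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] at bind addrs "
        "[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] "
        "mon_data /var/lib/ceph/mon/ceph-np0005541913 "
        "fsid c7c8e171-a193-56fb-95fa-8879fcfa7074")

# protocol : IPv4 : port / nonce
ADDR_RE = re.compile(r"(v[12]):(\d+\.\d+\.\d+\.\d+):(\d+)/(\d+)")

def parse_addrs(text):
    """Return (protocol, ip, port, nonce) for every address in the line."""
    return [(proto, ip, int(port), int(nonce))
            for proto, ip, port, nonce in ADDR_RE.findall(text)]

addrs = parse_addrs(LINE)
print(addrs)  # four entries: public v2/v1 followed by bind v2/v1
```

Here public and bind vectors are identical, and `rank -1` matches the earlier `does not exist in monmap, will attempt to join an existing cluster` message: the daemon is synchronizing against monmap e11 before it can take a rank.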
Dec 02 09:56:25 np0005541913.localdomain sudo[297905]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:25 np0005541913.localdomain podman[298339]: 
Dec 02 09:56:25 np0005541913.localdomain podman[298339]: 2025-12-02 09:56:25.934531616 +0000 UTC m=+0.098669141 container create e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_khayyam, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_CLEAN=True, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7)
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: Started libpod-conmon-e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a.scope.
Dec 02 09:56:25 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:25 np0005541913.localdomain podman[298339]: 2025-12-02 09:56:25.891004081 +0000 UTC m=+0.055141646 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:26 np0005541913.localdomain podman[298339]: 2025-12-02 09:56:26.004862118 +0000 UTC m=+0.168999643 container init e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_khayyam, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, build-date=2025-11-26T19:44:28Z, architecture=x86_64, RELEASE=main, release=1763362218, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7)
Dec 02 09:56:26 np0005541913.localdomain podman[298339]: 2025-12-02 09:56:26.012879743 +0000 UTC m=+0.177017268 container start e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_khayyam, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, release=1763362218, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, vcs-type=git)
Dec 02 09:56:26 np0005541913.localdomain eager_khayyam[298354]: 167 167
Dec 02 09:56:26 np0005541913.localdomain podman[298339]: 2025-12-02 09:56:26.013322984 +0000 UTC m=+0.177460499 container attach e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_khayyam, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, build-date=2025-11-26T19:44:28Z, RELEASE=main)
Dec 02 09:56:26 np0005541913.localdomain systemd[1]: libpod-e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a.scope: Deactivated successfully.
Dec 02 09:56:26 np0005541913.localdomain podman[298339]: 2025-12-02 09:56:26.017421644 +0000 UTC m=+0.181559179 container died e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_khayyam, RELEASE=main, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, release=1763362218, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=)
Dec 02 09:56:26 np0005541913.localdomain podman[298359]: 2025-12-02 09:56:26.086599425 +0000 UTC m=+0.067028325 container remove e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_khayyam, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, ceph=True, io.buildah.version=1.41.4, vendor=Red Hat, Inc., RELEASE=main, release=1763362218, GIT_BRANCH=main, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Dec 02 09:56:26 np0005541913.localdomain systemd[1]: libpod-conmon-e97d30f2d7f3c7033dc7df5787d709b325554bb024b71ab06957ece71a4e791a.scope: Deactivated successfully.
Dec 02 09:56:26 np0005541913.localdomain sudo[298029]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:26 np0005541913.localdomain sudo[298382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:26 np0005541913.localdomain sudo[298382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:26 np0005541913.localdomain sudo[298382]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).mds e16 new map
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        15
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-12-02T08:05:53.424954+0000
                                                           modified        2025-12-02T09:52:13.505190+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        84
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26573}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26573 members: 26573
                                                           [mds.mds.np0005541912.ghcwcm{0:26573} state up:active seq 13 addr [v2:172.18.0.106:6808/955707462,v1:172.18.0.106:6809/955707462] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005541914.sqgqkj{-1:16923} state up:standby seq 1 addr [v2:172.18.0.108:6808/2216063099,v1:172.18.0.108:6809/2216063099] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005541913.maexpe{-1:26386} state up:standby seq 1 addr [v2:172.18.0.107:6808/3746047079,v1:172.18.0.107:6809/3746047079] compat {c=[1],r=[1],i=[17ff]}]
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).osd e87 crush map has features 3314933000852226048, adjusting msgr requires
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).osd e87 crush map has features 288514051259236352, adjusting msgr requires
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).osd e87 crush map has features 288514051259236352, adjusting msgr requires
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).osd e87 crush map has features 288514051259236352, adjusting msgr requires
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541911.adcgiw (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541911.adcgiw on np0005541911.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541911.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541911 (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541911 on np0005541911.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.2 (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.5 (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='client.26871 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005541913.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Deploying daemon mon.np0005541913 on np0005541913.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.0 (monmap changed)...
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).paxosservice(auth 1..38) refresh upgraded, format 0 -> 3
Dec 02 09:56:26 np0005541913.localdomain sudo[298400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:26 np0005541913.localdomain sudo[298400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:26 np0005541913.localdomain podman[298435]: 
Dec 02 09:56:26 np0005541913.localdomain podman[298435]: 2025-12-02 09:56:26.953175693 +0000 UTC m=+0.042842717 container create f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_villani, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, name=rhceph, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4)
Dec 02 09:56:26 np0005541913.localdomain systemd[1]: Started libpod-conmon-f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f.scope.
Dec 02 09:56:26 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:27 np0005541913.localdomain podman[298435]: 2025-12-02 09:56:27.008444392 +0000 UTC m=+0.098111416 container init f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_villani, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-type=git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z)
Dec 02 09:56:27 np0005541913.localdomain podman[298435]: 2025-12-02 09:56:27.019885978 +0000 UTC m=+0.109553002 container start f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_villani, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=)
Dec 02 09:56:27 np0005541913.localdomain podman[298435]: 2025-12-02 09:56:27.020052053 +0000 UTC m=+0.109719097 container attach f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_villani, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=1763362218, version=7, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, architecture=x86_64, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Dec 02 09:56:27 np0005541913.localdomain ecstatic_villani[298451]: 167 167
Dec 02 09:56:27 np0005541913.localdomain systemd[1]: libpod-f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f.scope: Deactivated successfully.
Dec 02 09:56:27 np0005541913.localdomain podman[298435]: 2025-12-02 09:56:27.02257703 +0000 UTC m=+0.112244134 container died f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_villani, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:56:27 np0005541913.localdomain podman[298435]: 2025-12-02 09:56:26.937818553 +0000 UTC m=+0.027485587 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:27 np0005541913.localdomain podman[298456]: 2025-12-02 09:56:27.120287205 +0000 UTC m=+0.085418346 container remove f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_villani, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, RELEASE=main, distribution-scope=public, name=rhceph, GIT_CLEAN=True, io.buildah.version=1.41.4, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7)
Dec 02 09:56:27 np0005541913.localdomain systemd[1]: libpod-conmon-f68db0244436d329a512f1be275ccdbbf5652c4911780816afe0b3335a7fd21f.scope: Deactivated successfully.
Dec 02 09:56:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7da958795fda4d9380b200b691c953f6a9c2a5bcf242ab83072f70fd14ac119b-merged.mount: Deactivated successfully.
Dec 02 09:56:27 np0005541913.localdomain sudo[298400]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:27 np0005541913.localdomain sudo[298479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:27 np0005541913.localdomain sudo[298479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:27 np0005541913.localdomain sudo[298479]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:27.435 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:27 np0005541913.localdomain sudo[298497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:27 np0005541913.localdomain sudo[298497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:56:27 np0005541913.localdomain systemd[1]: tmp-crun.FLsrv8.mount: Deactivated successfully.
Dec 02 09:56:27 np0005541913.localdomain podman[298514]: 2025-12-02 09:56:27.648565751 +0000 UTC m=+0.115399628 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 09:56:27 np0005541913.localdomain podman[298514]: 2025-12-02 09:56:27.654073339 +0000 UTC m=+0.120907116 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3)
Dec 02 09:56:27 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:56:27 np0005541913.localdomain podman[298552]: 2025-12-02 09:56:27.983063342 +0000 UTC m=+0.069596054 container create 93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_antonelli, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, ceph=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.component=rhceph-container)
Dec 02 09:56:28 np0005541913.localdomain systemd[1]: Started libpod-conmon-93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4.scope.
Dec 02 09:56:28 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:28 np0005541913.localdomain podman[298552]: 2025-12-02 09:56:28.056931068 +0000 UTC m=+0.143463810 container init 93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_antonelli, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, version=7, io.buildah.version=1.41.4, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, ceph=True)
Dec 02 09:56:28 np0005541913.localdomain podman[298552]: 2025-12-02 09:56:27.960368445 +0000 UTC m=+0.046901227 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:28 np0005541913.localdomain podman[298552]: 2025-12-02 09:56:28.067009308 +0000 UTC m=+0.153542020 container start 93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_antonelli, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:56:28 np0005541913.localdomain podman[298552]: 2025-12-02 09:56:28.067137461 +0000 UTC m=+0.153670173 container attach 93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_antonelli, GIT_CLEAN=True, io.buildah.version=1.41.4, release=1763362218, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:56:28 np0005541913.localdomain sharp_antonelli[298567]: 167 167
Dec 02 09:56:28 np0005541913.localdomain systemd[1]: libpod-93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4.scope: Deactivated successfully.
Dec 02 09:56:28 np0005541913.localdomain podman[298552]: 2025-12-02 09:56:28.070624375 +0000 UTC m=+0.157157087 container died 93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_antonelli, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, release=1763362218, distribution-scope=public)
Dec 02 09:56:28 np0005541913.localdomain podman[298572]: 2025-12-02 09:56:28.15115832 +0000 UTC m=+0.067301722 container remove 93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_antonelli, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, architecture=x86_64, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218)
Dec 02 09:56:28 np0005541913.localdomain systemd[1]: libpod-conmon-93ecfd56657b85ea090788788980db4eb2d3ec6235b666da4cde40c880b028a4.scope: Deactivated successfully.
Dec 02 09:56:28 np0005541913.localdomain sudo[298497]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:28 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6818c6a7917354a45eac228f6c57997c8061997e84d16a26b19709d85f55258c-merged.mount: Deactivated successfully.
Dec 02 09:56:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:28.667 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:29 np0005541913.localdomain sudo[298589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:29 np0005541913.localdomain sudo[298589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:29 np0005541913.localdomain sudo[298589]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:29 np0005541913.localdomain sudo[298607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:29 np0005541913.localdomain sudo[298607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:30 np0005541913.localdomain podman[298642]: 2025-12-02 09:56:30.21813781 +0000 UTC m=+0.077803473 container create b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shannon, ceph=True, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Dec 02 09:56:30 np0005541913.localdomain systemd[1]: Started libpod-conmon-b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800.scope.
Dec 02 09:56:30 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:30 np0005541913.localdomain podman[298642]: 2025-12-02 09:56:30.186820973 +0000 UTC m=+0.046486696 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:30 np0005541913.localdomain podman[298642]: 2025-12-02 09:56:30.293764814 +0000 UTC m=+0.153430487 container init b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shannon, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, ceph=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:56:30 np0005541913.localdomain systemd[1]: tmp-crun.jOp00s.mount: Deactivated successfully.
Dec 02 09:56:30 np0005541913.localdomain podman[298642]: 2025-12-02 09:56:30.309347352 +0000 UTC m=+0.169013025 container start b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shannon, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 09:56:30 np0005541913.localdomain podman[298642]: 2025-12-02 09:56:30.309595158 +0000 UTC m=+0.169260881 container attach b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shannon, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:56:30 np0005541913.localdomain modest_shannon[298658]: 167 167
Dec 02 09:56:30 np0005541913.localdomain systemd[1]: libpod-b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800.scope: Deactivated successfully.
Dec 02 09:56:30 np0005541913.localdomain podman[298642]: 2025-12-02 09:56:30.31227742 +0000 UTC m=+0.171943133 container died b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shannon, build-date=2025-11-26T19:44:28Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:56:30 np0005541913.localdomain podman[298663]: 2025-12-02 09:56:30.41618809 +0000 UTC m=+0.089613849 container remove b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shannon, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Dec 02 09:56:30 np0005541913.localdomain systemd[1]: libpod-conmon-b734fdc9962a63fba32dc35c07c738b65f06c65910d06ad39b3e6ece59602800.scope: Deactivated successfully.
Dec 02 09:56:30 np0005541913.localdomain sudo[298607]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7e82fd8df3763b742fe704874094d01c53cf40aac8ba49e63e74183dfb563878-merged.mount: Deactivated successfully.
Dec 02 09:56:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:32.438 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.3 (monmap changed)...
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:56:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:33.669 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:56:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:56:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:56:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:56:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:56:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:56:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:56:34 np0005541913.localdomain podman[298680]: 2025-12-02 09:56:34.451524401 +0000 UTC m=+0.078715317 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 02 09:56:34 np0005541913.localdomain podman[298680]: 2025-12-02 09:56:34.456298179 +0000 UTC m=+0.083489035 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:56:34 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:56:35 np0005541913.localdomain sudo[298699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:35 np0005541913.localdomain sudo[298699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:35 np0005541913.localdomain sudo[298699]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:35 np0005541913.localdomain sudo[298717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:56:35 np0005541913.localdomain sudo[298717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:36 np0005541913.localdomain podman[298806]: 2025-12-02 09:56:36.023678189 +0000 UTC m=+0.069088159 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.expose-services=, version=7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:56:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:56:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:56:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:56:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:56:36 np0005541913.localdomain podman[298806]: 2025-12-02 09:56:36.14441197 +0000 UTC m=+0.189821950 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True)
Dec 02 09:56:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:56:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18721 "" "Go-http-client/1.1"
Dec 02 09:56:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.1 (monmap changed)...
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.4 (monmap changed)...
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:36 np0005541913.localdomain podman[298912]: 2025-12-02 09:56:36.616943365 +0000 UTC m=+0.095541798 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Dec 02 09:56:36 np0005541913.localdomain podman[298912]: 2025-12-02 09:56:36.653709278 +0000 UTC m=+0.132307721 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 02 09:56:36 np0005541913.localdomain sudo[298717]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:36 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:56:36 np0005541913.localdomain sudo[298946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:56:36 np0005541913.localdomain sudo[298946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:37 np0005541913.localdomain sudo[298946]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:37.441 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:56:38 np0005541913.localdomain systemd[1]: tmp-crun.gARKAM.mount: Deactivated successfully.
Dec 02 09:56:38 np0005541913.localdomain podman[298964]: 2025-12-02 09:56:38.491109644 +0000 UTC m=+0.095900317 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:56:38 np0005541913.localdomain podman[298964]: 2025-12-02 09:56:38.501960594 +0000 UTC m=+0.106751267 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:56:38 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:56:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:38.672 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:38 np0005541913.localdomain sudo[298988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:56:38 np0005541913.localdomain sudo[298988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:38 np0005541913.localdomain sudo[298988]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/1875286268' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='client.34342 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: Reconfig service osd.default_drive_group
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:56:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:41 np0005541913.localdomain sudo[299006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:41 np0005541913.localdomain sudo[299006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:41 np0005541913.localdomain sudo[299006]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:41 np0005541913.localdomain sudo[299024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:41 np0005541913.localdomain sudo[299024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:41 np0005541913.localdomain podman[299060]: 
Dec 02 09:56:41 np0005541913.localdomain podman[299060]: 2025-12-02 09:56:41.671792856 +0000 UTC m=+0.085111219 container create b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_buck, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_CLEAN=True, release=1763362218, io.buildah.version=1.41.4)
Dec 02 09:56:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1.scope.
Dec 02 09:56:41 np0005541913.localdomain podman[299060]: 2025-12-02 09:56:41.639257625 +0000 UTC m=+0.052576028 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:41 np0005541913.localdomain podman[299060]: 2025-12-02 09:56:41.755989619 +0000 UTC m=+0.169307962 container init b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_buck, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=)
Dec 02 09:56:41 np0005541913.localdomain podman[299060]: 2025-12-02 09:56:41.765247557 +0000 UTC m=+0.178565890 container start b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_buck, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.buildah.version=1.41.4, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:56:41 np0005541913.localdomain podman[299060]: 2025-12-02 09:56:41.76539565 +0000 UTC m=+0.178714033 container attach b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_buck, GIT_CLEAN=True, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:56:41 np0005541913.localdomain inspiring_buck[299075]: 167 167
Dec 02 09:56:41 np0005541913.localdomain systemd[1]: libpod-b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1.scope: Deactivated successfully.
Dec 02 09:56:41 np0005541913.localdomain podman[299060]: 2025-12-02 09:56:41.772007227 +0000 UTC m=+0.185325590 container died b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_buck, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Dec 02 09:56:41 np0005541913.localdomain podman[299080]: 2025-12-02 09:56:41.863379452 +0000 UTC m=+0.082657123 container remove b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_buck, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-11-26T19:44:28Z, RELEASE=main, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=1763362218, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True)
Dec 02 09:56:41 np0005541913.localdomain systemd[1]: libpod-conmon-b27019543f618016732fbac028f11b0589f39145b32832cf018fbf444f8230a1.scope: Deactivated successfully.
Dec 02 09:56:42 np0005541913.localdomain sudo[299024]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:42 np0005541913.localdomain sudo[299105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:56:42 np0005541913.localdomain sudo[299105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:42 np0005541913.localdomain sudo[299105]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:42 np0005541913.localdomain sudo[299123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:56:42 np0005541913.localdomain sudo[299123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:56:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:42.443 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: mgrmap e25: np0005541914.lljzmk(active, since 91s), standbys: np0005541910.kzipdo, np0005541913.mfesdm, np0005541912.qwddia, np0005541911.adcgiw
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:56:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5a6d3d54b0017b5f30a92e0e47dec362fde2b2f5976153eadfc00715b22193bc-merged.mount: Deactivated successfully.
Dec 02 09:56:42 np0005541913.localdomain podman[299158]: 
Dec 02 09:56:42 np0005541913.localdomain podman[299158]: 2025-12-02 09:56:42.773394593 +0000 UTC m=+0.084013149 container create 6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_poincare, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-type=git, RELEASE=main, name=rhceph, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, version=7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:56:42 np0005541913.localdomain systemd[1]: Started libpod-conmon-6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449.scope.
Dec 02 09:56:42 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:56:42 np0005541913.localdomain podman[299158]: 2025-12-02 09:56:42.838587668 +0000 UTC m=+0.149206244 container init 6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_poincare, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, release=1763362218, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=rhceph, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:56:42 np0005541913.localdomain podman[299158]: 2025-12-02 09:56:42.742089406 +0000 UTC m=+0.052708012 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:56:42 np0005541913.localdomain podman[299158]: 2025-12-02 09:56:42.858795988 +0000 UTC m=+0.169414574 container start 6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_poincare, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, RELEASE=main, architecture=x86_64, version=7, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph)
Dec 02 09:56:42 np0005541913.localdomain awesome_poincare[299173]: 167 167
Dec 02 09:56:42 np0005541913.localdomain podman[299158]: 2025-12-02 09:56:42.859073575 +0000 UTC m=+0.169692151 container attach 6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_poincare, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, name=rhceph)
Dec 02 09:56:42 np0005541913.localdomain systemd[1]: libpod-6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449.scope: Deactivated successfully.
Dec 02 09:56:42 np0005541913.localdomain podman[299158]: 2025-12-02 09:56:42.86186189 +0000 UTC m=+0.172480496 container died 6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_poincare, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-type=git, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, distribution-scope=public, build-date=2025-11-26T19:44:28Z)
Dec 02 09:56:42 np0005541913.localdomain podman[299178]: 2025-12-02 09:56:42.957920141 +0000 UTC m=+0.083782113 container remove 6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_poincare, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph)
Dec 02 09:56:42 np0005541913.localdomain systemd[1]: libpod-conmon-6677086a3cc75f63e6dfa3973eba378344904ca36077d8c68d609f0bcc6d0449.scope: Deactivated successfully.
Dec 02 09:56:43 np0005541913.localdomain sudo[299123]: pam_unix(sudo:session): session closed for user root
Dec 02 09:56:43 np0005541913.localdomain sshd[293959]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:56:43 np0005541913.localdomain systemd[1]: session-66.scope: Deactivated successfully.
Dec 02 09:56:43 np0005541913.localdomain systemd[1]: session-66.scope: Consumed 27.362s CPU time.
Dec 02 09:56:43 np0005541913.localdomain systemd-logind[757]: Session 66 logged out. Waiting for processes to exit.
Dec 02 09:56:43 np0005541913.localdomain systemd-logind[757]: Removed session 66.
Dec 02 09:56:43 np0005541913.localdomain systemd[1]: tmp-crun.AgdNw7.mount: Deactivated successfully.
Dec 02 09:56:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-07f90f91593a877f09f0a53c131e2e823965549b94f5ff30b693931b57a2d15c-merged.mount: Deactivated successfully.
Dec 02 09:56:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:43.706 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:56:44 np0005541913.localdomain podman[299201]: 2025-12-02 09:56:44.461109994 +0000 UTC m=+0.095837465 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 09:56:44 np0005541913.localdomain podman[299201]: 2025-12-02 09:56:44.47139148 +0000 UTC m=+0.106118941 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:56:44 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:56:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:56:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:56:46 np0005541913.localdomain podman[299221]: 2025-12-02 09:56:46.434755426 +0000 UTC m=+0.076538450 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:56:46 np0005541913.localdomain podman[299222]: 2025-12-02 09:56:46.488467803 +0000 UTC m=+0.126887666 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller)
Dec 02 09:56:46 np0005541913.localdomain podman[299221]: 2025-12-02 09:56:46.523318835 +0000 UTC m=+0.165101849 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:56:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:46.526 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:46 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:56:46 np0005541913.localdomain podman[299222]: 2025-12-02 09:56:46.576987822 +0000 UTC m=+0.215407735 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Dec 02 09:56:46 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:56:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:46.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:46.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:56:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:46.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:56:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:47.446 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:47.657 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:56:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:47.658 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:56:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:47.658 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:56:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:47.658 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:56:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:48.710 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:48.949 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.200 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.201 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.202 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.202 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.203 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:49.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.113 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.114 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.114 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.115 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.115 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.581 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).osd e87 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).osd e87 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).osd e88 e88: 6 total, 6 up, 6 in
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: pgmap v50: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.17121 172.18.0.108:0/2364182550' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/219576174' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: Activating manager daemon np0005541910.kzipdo
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: osdmap e88: 6 total, 6 up, 6 in
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: mgrmap e26: np0005541910.kzipdo(active, starting, since 0.0525072s), standbys: np0005541913.mfesdm, np0005541912.qwddia, np0005541911.adcgiw
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: Standby manager daemon np0005541914.lljzmk started
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: mgrmap e27: np0005541910.kzipdo(active, starting, since 5s), standbys: np0005541913.mfesdm, np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:56:50 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3031756029' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.687 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.688 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.919 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.921 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11707MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.922 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:56:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:50.922 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:56:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:51.265 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:56:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:51.266 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:56:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:51.266 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:56:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:51.307 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:56:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 02 09:56:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:51.748 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:56:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:51.754 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:56:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:52.490 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:52.624 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:56:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:52.627 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:56:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:52.628 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 1002...
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Activating special unit Exit the Session...
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Removed slice User Background Tasks Slice.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Stopped target Main User Target.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Stopped target Basic System.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Stopped target Paths.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Stopped target Sockets.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Stopped target Timers.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Closed D-Bus User Message Bus Socket.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Stopped Create User's Volatile Files and Directories.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Removed slice User Application Slice.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Reached target Shutdown.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Finished Exit the Session.
Dec 02 09:56:53 np0005541913.localdomain systemd[25916]: Reached target Exit the Session.
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: user@1002.service: Deactivated successfully.
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 1002.
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: user@1002.service: Consumed 14.194s CPU time, read 0B from disk, written 7.0K to disk.
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1002...
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: run-user-1002.mount: Deactivated successfully.
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: user-runtime-dir@1002.service: Deactivated successfully.
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1002.
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 1002.
Dec 02 09:56:53 np0005541913.localdomain systemd[1]: user-1002.slice: Consumed 4min 30.785s CPU time.
Dec 02 09:56:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:53.748 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 02 09:56:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:57.493 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:56:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:56:58 np0005541913.localdomain podman[299317]: 2025-12-02 09:56:58.450745498 +0000 UTC m=+0.086425413 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 09:56:58 np0005541913.localdomain podman[299317]: 2025-12-02 09:56:58.46613041 +0000 UTC m=+0.101810325 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:56:58 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:56:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:56:58.751 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:02.520 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:57:03.042 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:57:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:57:03.042 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:57:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:57:03.047 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:57:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:03.781 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:57:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:57:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:57:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:57:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:57:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:57:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:57:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 02 09:57:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:57:05 np0005541913.localdomain podman[299337]: 2025-12-02 09:57:05.445020077 +0000 UTC m=+0.086922817 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Dec 02 09:57:05 np0005541913.localdomain podman[299337]: 2025-12-02 09:57:05.478926684 +0000 UTC m=+0.120829354 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:57:05 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:57:05 np0005541913.localdomain sshd[296438]: Received disconnect from 192.168.122.11 port 46514:11: disconnected by user
Dec 02 09:57:05 np0005541913.localdomain sshd[296438]: Disconnected from user tripleo-admin 192.168.122.11 port 46514
Dec 02 09:57:05 np0005541913.localdomain sshd[296418]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 02 09:57:05 np0005541913.localdomain systemd[1]: session-67.scope: Deactivated successfully.
Dec 02 09:57:05 np0005541913.localdomain systemd[1]: session-67.scope: Consumed 1.682s CPU time.
Dec 02 09:57:05 np0005541913.localdomain systemd-logind[757]: Session 67 logged out. Waiting for processes to exit.
Dec 02 09:57:05 np0005541913.localdomain systemd-logind[757]: Removed session 67.
Dec 02 09:57:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:57:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:57:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:57:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:57:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:57:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18712 "" "Go-http-client/1.1"
Dec 02 09:57:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:57:07 np0005541913.localdomain podman[299355]: 2025-12-02 09:57:07.44824962 +0000 UTC m=+0.082919450 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 09:57:07 np0005541913.localdomain podman[299355]: 2025-12-02 09:57:07.490125041 +0000 UTC m=+0.124794841 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, 
com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Dec 02 09:57:07 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:57:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:07.523 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:08.784 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:57:09 np0005541913.localdomain podman[299373]: 2025-12-02 09:57:09.149407821 +0000 UTC m=+0.078378288 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:57:09 np0005541913.localdomain podman[299373]: 2025-12-02 09:57:09.186103223 +0000 UTC m=+0.115073700 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:57:09 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:57:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:57:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 4864 writes, 22K keys, 4864 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4864 writes, 600 syncs, 8.11 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 88 writes, 289 keys, 88 commit groups, 1.0 writes per commit group, ingest: 0.31 MB, 0.00 MB/s
                                                          Interval WAL: 88 writes, 31 syncs, 2.84 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/128556610' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2995218045' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/916855140' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3993520404' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/243203575' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1607212135' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1607212135' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.703411) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669430703543, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11592, "num_deletes": 273, "total_data_size": 19469679, "memory_usage": 20179728, "flush_reason": "Manual Compaction"}
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669430827800, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 16103374, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11597, "table_properties": {"data_size": 16038466, "index_size": 36066, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27781, "raw_key_size": 295547, "raw_average_key_size": 26, "raw_value_size": 15847482, "raw_average_value_size": 1428, "num_data_blocks": 1382, "num_entries": 11095, "num_filter_entries": 11095, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 1764669385, "file_creation_time": 1764669430, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 124458 microseconds, and 34571 cpu microseconds.
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.827878) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 16103374 bytes OK
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.827910) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.829225) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.829261) EVENT_LOG_v1 {"time_micros": 1764669430829250, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.829285) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 19388766, prev total WAL file size 19388766, number of live WAL files 2.
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.832922) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(15MB) 8(1887B)]
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669430833029, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 16105261, "oldest_snapshot_seqno": -1}
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10846 keys, 16100143 bytes, temperature: kUnknown
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669430917423, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 16100143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16035892, "index_size": 36054, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27141, "raw_key_size": 290805, "raw_average_key_size": 26, "raw_value_size": 15848133, "raw_average_value_size": 1461, "num_data_blocks": 1381, "num_entries": 10846, "num_filter_entries": 10846, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764669430, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.917742) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 16100143 bytes
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.919287) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.7 rd, 190.6 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(15.4, 0.0 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 11100, records dropped: 254 output_compression: NoCompression
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.919311) EVENT_LOG_v1 {"time_micros": 1764669430919301, "job": 4, "event": "compaction_finished", "compaction_time_micros": 84474, "compaction_time_cpu_micros": 22649, "output_level": 6, "num_output_files": 1, "total_output_size": 16100143, "num_input_records": 11100, "num_output_records": 10846, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669430921107, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669430921151, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 02 09:57:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:57:10.832796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:57:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:12.526 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:13.786 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr handle_mgr_map Activating!
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr handle_mgr_map I am now activating
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: balancer
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] Starting
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] Optimize plan auto_2025-12-02_09:57:14
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [cephadm WARNING root] removing stray HostCache host record np0005541910.localdomain.devices.0
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005541910.localdomain.devices.0
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: cephadm
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: crash
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: devicehealth
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: iostat
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [devicehealth INFO root] Starting
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: nfs
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:14 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: orchestrator
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: pg_autoscaler
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: progress
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] _maybe_adjust
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Loading...
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7f781e193400>, <progress.module.GhostEvent object at 0x7f781e193430>, <progress.module.GhostEvent object at 0x7f781e193460>, <progress.module.GhostEvent object at 0x7f781e193490>, <progress.module.GhostEvent object at 0x7f781e1934c0>, <progress.module.GhostEvent object at 0x7f781e1934f0>, <progress.module.GhostEvent object at 0x7f781e193520>, <progress.module.GhostEvent object at 0x7f781e193550>, <progress.module.GhostEvent object at 0x7f781e193580>, <progress.module.GhostEvent object at 0x7f781e1935b0>, <progress.module.GhostEvent object at 0x7f781e1935e0>, <progress.module.GhostEvent object at 0x7f781e193610>, <progress.module.GhostEvent object at 0x7f781e193640>, <progress.module.GhostEvent object at 0x7f781e193670>, <progress.module.GhostEvent object at 0x7f781e1936a0>, <progress.module.GhostEvent object at 0x7f781e1936d0>, <progress.module.GhostEvent object at 0x7f781e193700>, <progress.module.GhostEvent object at 0x7f781e193730>, <progress.module.GhostEvent object at 0x7f781e193760>, <progress.module.GhostEvent object at 0x7f781e193790>, <progress.module.GhostEvent object at 0x7f781e1937c0>, <progress.module.GhostEvent object at 0x7f781e1937f0>, <progress.module.GhostEvent object at 0x7f781e193820>, <progress.module.GhostEvent object at 0x7f781e193850>, <progress.module.GhostEvent object at 0x7f781e193880>, <progress.module.GhostEvent object at 0x7f781e1938b0>, <progress.module.GhostEvent object at 0x7f781e1938e0>, <progress.module.GhostEvent object at 0x7f781e193910>, <progress.module.GhostEvent object at 0x7f781e193940>, <progress.module.GhostEvent object at 0x7f781e193970>, <progress.module.GhostEvent object at 0x7f781e1939a0>, <progress.module.GhostEvent object at 0x7f781e1939d0>, <progress.module.GhostEvent object at 0x7f781e193a00>, <progress.module.GhostEvent object at 0x7f781e193a30>, <progress.module.GhostEvent object at 
0x7f781e193a60>, <progress.module.GhostEvent object at 0x7f781e193a90>, <progress.module.GhostEvent object at 0x7f781e193ac0>, <progress.module.GhostEvent object at 0x7f781e193af0>, <progress.module.GhostEvent object at 0x7f781e193b20>, <progress.module.GhostEvent object at 0x7f781e193b50>, <progress.module.GhostEvent object at 0x7f781e193b80>, <progress.module.GhostEvent object at 0x7f781e193bb0>, <progress.module.GhostEvent object at 0x7f781e193be0>, <progress.module.GhostEvent object at 0x7f781e193c10>, <progress.module.GhostEvent object at 0x7f781e193c40>, <progress.module.GhostEvent object at 0x7f781e193c70>, <progress.module.GhostEvent object at 0x7f781e193ca0>, <progress.module.GhostEvent object at 0x7f781e193cd0>, <progress.module.GhostEvent object at 0x7f781e193d00>, <progress.module.GhostEvent object at 0x7f781e193d30>] historic events
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Loaded OSDMap, ready.
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] recovery thread starting
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] starting setup
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: rbd_support
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: restful
Dec 02 09:57:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 09:57:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.2 total, 600.0 interval
                                                          Cumulative writes: 5937 writes, 25K keys, 5937 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5937 writes, 864 syncs, 6.87 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 215 writes, 593 keys, 215 commit groups, 1.0 writes per commit group, ingest: 0.70 MB, 0.00 MB/s
                                                          Interval WAL: 215 writes, 84 syncs, 2.56 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: status
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [restful INFO root] server_addr: :: server_port: 8003
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: telemetry
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [restful WARNING root] server not running: no certificate configured
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] PerfHandler: starting
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_task_task: vms, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_task_task: volumes, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: mgr load Constructed class from module: volumes
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_task_task: images, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_task_task: backups, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.092+0000 7f780f0b6640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.092+0000 7f780f0b6640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.092+0000 7f780f0b6640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.092+0000 7f780f0b6640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.092+0000 7f780f0b6640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] TaskHandler: starting
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] setup complete
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.116+0000 7f780b8af640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.116+0000 7f780b8af640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.116+0000 7f780b8af640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.116+0000 7f780b8af640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:15.116+0000 7f780b8af640 -1 client.0 error registering admin socket command: (17) File exists
Dec 02 09:57:15 np0005541913.localdomain sshd[299538]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:57:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:57:15 np0005541913.localdomain sshd[299538]: Accepted publickey for ceph-admin from 192.168.122.107 port 38824 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 09:57:15 np0005541913.localdomain systemd-logind[757]: New session 69 of user ceph-admin.
Dec 02 09:57:15 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 1002.
Dec 02 09:57:15 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Dec 02 09:57:15 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Dec 02 09:57:15 np0005541913.localdomain podman[299540]: 2025-12-02 09:57:15.360846849 +0000 UTC m=+0.087297377 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, config_id=multipathd)
Dec 02 09:57:15 np0005541913.localdomain systemd[1]: Starting User Manager for UID 1002...
Dec 02 09:57:15 np0005541913.localdomain podman[299540]: 2025-12-02 09:57:15.393811021 +0000 UTC m=+0.120261489 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 09:57:15 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Queued start job for default target Main User Target.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Created slice User Application Slice.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Reached target Paths.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Reached target Timers.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Starting D-Bus User Message Bus Socket...
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Starting Create User's Volatile Files and Directories...
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Finished Create User's Volatile Files and Directories.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Listening on D-Bus User Message Bus Socket.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Reached target Sockets.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Reached target Basic System.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Reached target Main User Target.
Dec 02 09:57:15 np0005541913.localdomain systemd[299560]: Startup finished in 140ms.
Dec 02 09:57:15 np0005541913.localdomain systemd[1]: Started User Manager for UID 1002.
Dec 02 09:57:15 np0005541913.localdomain systemd[1]: Started Session 69 of User ceph-admin.
Dec 02 09:57:15 np0005541913.localdomain sshd[299538]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.34369 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:57:15 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:15 np0005541913.localdomain sudo[299577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:57:15 np0005541913.localdomain sudo[299577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:15 np0005541913.localdomain sudo[299577]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:15 np0005541913.localdomain sudo[299595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:57:15 np0005541913.localdomain sudo[299595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cherrypy.error] [02/Dec/2025:09:57:16] ENGINE Bus STARTING
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : [02/Dec/2025:09:57:16] ENGINE Bus STARTING
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Activating special unit Exit the Session...
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Stopped target Main User Target.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Stopped target Basic System.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Stopped target Paths.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Stopped target Sockets.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Stopped target Timers.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Closed D-Bus User Message Bus Socket.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Stopped Create User's Volatile Files and Directories.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Removed slice User Application Slice.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Reached target Shutdown.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Finished Exit the Session.
Dec 02 09:57:16 np0005541913.localdomain systemd[296422]: Reached target Exit the Session.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: user-1003.slice: Consumed 2.225s CPU time.
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cherrypy.error] [02/Dec/2025:09:57:16] ENGINE Serving on http://172.18.0.107:8765
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : [02/Dec/2025:09:57:16] ENGINE Serving on http://172.18.0.107:8765
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cherrypy.error] [02/Dec/2025:09:57:16] ENGINE Serving on https://172.18.0.107:7150
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : [02/Dec/2025:09:57:16] ENGINE Serving on https://172.18.0.107:7150
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cherrypy.error] [02/Dec/2025:09:57:16] ENGINE Bus STARTED
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : [02/Dec/2025:09:57:16] ENGINE Bus STARTED
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cherrypy.error] [02/Dec/2025:09:57:16] ENGINE Client ('172.18.0.107', 38106) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : [02/Dec/2025:09:57:16] ENGINE Client ('172.18.0.107', 38106) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 02 09:57:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:16 np0005541913.localdomain podman[299708]: 2025-12-02 09:57:16.600705736 +0000 UTC m=+0.093947525 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, ceph=True, GIT_CLEAN=True, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: tmp-crun.83BfAD.mount: Deactivated successfully.
Dec 02 09:57:16 np0005541913.localdomain podman[299727]: 2025-12-02 09:57:16.700944499 +0000 UTC m=+0.080445994 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:57:16 np0005541913.localdomain podman[299727]: 2025-12-02 09:57:16.833531356 +0000 UTC m=+0.213033041 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:57:16 np0005541913.localdomain podman[299728]: 2025-12-02 09:57:16.856527932 +0000 UTC m=+0.236024947 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 09:57:16 np0005541913.localdomain podman[299708]: 2025-12-02 09:57:16.886544605 +0000 UTC m=+0.379786364 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Dec 02 09:57:16 np0005541913.localdomain podman[299728]: 2025-12-02 09:57:16.917108253 +0000 UTC m=+0.296605268 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 09:57:16 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:57:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:17.528 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:17 np0005541913.localdomain sudo[299595]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:17 np0005541913.localdomain ceph-mgr[288059]: [devicehealth INFO root] Check health
Dec 02 09:57:17 np0005541913.localdomain sudo[299884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:57:17 np0005541913.localdomain sudo[299884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:17 np0005541913.localdomain sudo[299884]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:17 np0005541913.localdomain sudo[299902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:57:17 np0005541913.localdomain sudo[299902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:18 np0005541913.localdomain sudo[299902]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 02 09:57:18 np0005541913.localdomain sudo[299951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:57:18 np0005541913.localdomain sudo[299951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:18 np0005541913.localdomain sudo[299951]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:18 np0005541913.localdomain sudo[299969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 09:57:18 np0005541913.localdomain sudo[299969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.34408 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Saving service mon spec with placement label:mon
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(synchronizing).osd e89 e89: 6 total, 6 up, 6 in
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: Activating manager daemon np0005541913.mfesdm
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: Manager daemon np0005541910.kzipdo is unresponsive, replacing it with standby daemon np0005541913.mfesdm
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: osdmap e89: 6 total, 6 up, 6 in
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: mgrmap e28: np0005541913.mfesdm(active, starting, since 0.0539087s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mds metadata", "who": "mds.np0005541914.sqgqkj"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mds metadata", "who": "mds.np0005541913.maexpe"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mds metadata", "who": "mds.np0005541912.ghcwcm"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr metadata", "who": "np0005541913.mfesdm", "id": "np0005541913.mfesdm"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr metadata", "who": "np0005541912.qwddia", "id": "np0005541912.qwddia"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr metadata", "who": "np0005541911.adcgiw", "id": "np0005541911.adcgiw"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr metadata", "who": "np0005541914.lljzmk", "id": "np0005541914.lljzmk"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mds metadata"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: Manager daemon np0005541913.mfesdm is now available
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: removing stray HostCache host record np0005541910.localdomain.devices.0
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541910.localdomain.devices.0"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541910.localdomain.devices.0"}]': finished
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541910.localdomain.devices.0"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541910.localdomain.devices.0"}]': finished
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541913.mfesdm/mirror_snapshot_schedule"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541913.mfesdm/trash_purge_schedule"} : dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: mgrmap e29: np0005541913.mfesdm(active, since 1.15788s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='client.34369 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:57:16] ENGINE Bus STARTING
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:57:16] ENGINE Serving on http://172.18.0.107:8765
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:57:16] ENGINE Serving on https://172.18.0.107:7150
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:57:16] ENGINE Bus STARTED
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:57:16] ENGINE Client ('172.18.0.107', 38106) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: Cluster is now healthy
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: mgrmap e30: np0005541913.mfesdm(active, since 2s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:18.790 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:18 np0005541913.localdomain sudo[299969]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain sudo[300006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:57:19 np0005541913.localdomain sudo[300006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300006]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:57:19 np0005541913.localdomain sudo[300024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300024]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:19 np0005541913.localdomain sudo[300042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300042]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:19 np0005541913.localdomain sudo[300060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300060]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:19 np0005541913.localdomain sudo[300078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300078]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:19 np0005541913.localdomain sudo[300112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300112]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:19 np0005541913.localdomain sudo[300130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300130]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain sudo[300148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain sudo[300148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300148]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:19 np0005541913.localdomain sudo[300166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:19 np0005541913.localdomain sudo[300166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300166]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:19 np0005541913.localdomain sudo[300184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300184]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:19 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:19 np0005541913.localdomain sudo[300202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:19 np0005541913.localdomain sudo[300202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300202]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:19 np0005541913.localdomain sudo[300220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300220]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:19 np0005541913.localdomain sudo[300238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:19 np0005541913.localdomain sudo[300238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:19 np0005541913.localdomain sudo[300238]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain sudo[300272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:20 np0005541913.localdomain sudo[300272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300272]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.34414 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541913", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:57:20 np0005541913.localdomain sudo[300290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:20 np0005541913.localdomain sudo[300290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300290]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541911.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541911.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain sudo[300308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain sudo[300308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300308]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain sudo[300326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:57:20 np0005541913.localdomain sudo[300326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300326]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain sudo[300344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:57:20 np0005541913.localdomain sudo[300344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300344]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain sudo[300362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:57:20 np0005541913.localdomain sudo[300362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300362]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain sudo[300380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:20 np0005541913.localdomain sudo[300380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300380]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain sudo[300398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:57:20 np0005541913.localdomain sudo[300398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300398]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:20 np0005541913.localdomain sudo[300432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:57:20 np0005541913.localdomain sudo[300432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300432]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='client.34408 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Saving service mon spec with placement label:mon
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config rm", "who": "osd/host:np0005541911", "name": "osd_memory_target"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain sudo[300450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: mgrmap e31: np0005541913.mfesdm(active, since 4s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: from='client.34414 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541913", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541911.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain sudo[300450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300450]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain sudo[300468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain sudo[300468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300468]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:20 np0005541913.localdomain sudo[300486]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:20 np0005541913.localdomain sudo[300486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300486]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:20 np0005541913.localdomain sudo[300504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:20 np0005541913.localdomain sudo[300504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:20 np0005541913.localdomain sudo[300504]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:21 np0005541913.localdomain sudo[300522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:57:21 np0005541913.localdomain sudo[300522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:21 np0005541913.localdomain sudo[300522]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:21 np0005541913.localdomain sudo[300540]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:21 np0005541913.localdomain sudo[300540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:21 np0005541913.localdomain sudo[300540]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:21 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:21 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:21 np0005541913.localdomain sudo[300558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:57:21 np0005541913.localdomain sudo[300558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:21 np0005541913.localdomain sudo[300558]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:21 np0005541913.localdomain sudo[300592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:57:21 np0005541913.localdomain sudo[300592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:21 np0005541913.localdomain sudo[300592]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:21 np0005541913.localdomain sudo[300610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:57:21 np0005541913.localdomain sudo[300610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:21 np0005541913.localdomain sudo[300610]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 02 09:57:21 np0005541913.localdomain sudo[300628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:21 np0005541913.localdomain sudo[300628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:21 np0005541913.localdomain sudo[300628]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:21 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:21 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:21 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev 8f464c39-1004-431c-b9c0-0812e6f27fcc (Updating node-proxy deployment (+4 -> 4))
Dec 02 09:57:21 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev 8f464c39-1004-431c-b9c0-0812e6f27fcc (Updating node-proxy deployment (+4 -> 4))
Dec 02 09:57:21 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event 8f464c39-1004-431c-b9c0-0812e6f27fcc (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Dec 02 09:57:21 np0005541913.localdomain sudo[300646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:57:21 np0005541913.localdomain sudo[300646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:21 np0005541913.localdomain sudo[300646]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:22 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005541911 (monmap changed)...
Dec 02 09:57:22 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005541911 (monmap changed)...
Dec 02 09:57:22 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:57:22 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:57:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:22.531 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:22 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Dec 02 09:57:22 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:22 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/2604409311' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mon.np0005541911 (monmap changed)...
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:22 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:57:23 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:57:23 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:57:23 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:57:23 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:57:23 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:23 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:23.792 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:23 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:57:23 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:57:24 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Dec 02 09:57:24 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:24 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:25 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:57:25 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:57:25 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Writing back 50 completed events
Dec 02 09:57:25 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:25 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:26 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005541914 (monmap changed)...
Dec 02 09:57:26 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005541914 (monmap changed)...
Dec 02 09:57:26 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:57:26 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:57:26 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 02 09:57:26 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:26 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:27 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev 4a745475-b94b-4d1e-a5ff-bbe9d0eeddf1 (Updating node-proxy deployment (+4 -> 4))
Dec 02 09:57:27 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev 4a745475-b94b-4d1e-a5ff-bbe9d0eeddf1 (Updating node-proxy deployment (+4 -> 4))
Dec 02 09:57:27 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event 4a745475-b94b-4d1e-a5ff-bbe9d0eeddf1 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Dec 02 09:57:27 np0005541913.localdomain sudo[300664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:57:27 np0005541913.localdomain sudo[300664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:27 np0005541913.localdomain sudo[300664]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:27.533 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:27 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:27 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:28 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.34423 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541911", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:57:28 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 09:57:28 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:28 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:28.795 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:57:29 np0005541913.localdomain systemd[1]: tmp-crun.DVDNEs.mount: Deactivated successfully.
Dec 02 09:57:29 np0005541913.localdomain podman[300682]: 2025-12-02 09:57:29.481346046 +0000 UTC m=+0.109899862 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3)
Dec 02 09:57:29 np0005541913.localdomain podman[300682]: 2025-12-02 09:57:29.496135931 +0000 UTC m=+0.124689767 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:57:29 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.26982 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005541911"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Remove daemons mon.np0005541911
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005541911
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005541911: new quorum should be ['np0005541914', 'np0005541912'] (from ['np0005541914', 'np0005541912'])
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005541911: new quorum should be ['np0005541914', 'np0005541912'] (from ['np0005541914', 'np0005541912'])
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005541911 from monmap...
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing monitor np0005541911 from monmap...
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005541911 from np0005541911.localdomain -- ports []
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005541911 from np0005541911.localdomain -- ports []
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: client.44339 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: client.44339 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: client.26949 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:29 np0005541913.localdomain sshd[300700]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541911.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:29 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Writing back 50 completed events
Dec 02 09:57:30 np0005541913.localdomain sudo[300702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:57:30 np0005541913.localdomain sudo[300702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300702]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain sudo[300720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:57:30 np0005541913.localdomain sudo[300720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300720]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain sudo[300738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:30 np0005541913.localdomain sudo[300738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300738]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain sudo[300756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:30 np0005541913.localdomain sudo[300756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300756]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain sudo[300774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:30 np0005541913.localdomain sudo[300774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300774]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain sudo[300808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:30 np0005541913.localdomain sudo[300808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300808]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain sudo[300826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:30 np0005541913.localdomain sudo[300826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300826]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain sudo[300844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain sudo[300844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300844]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 09:57:30 np0005541913.localdomain sudo[300862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:30 np0005541913.localdomain sudo[300862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300862]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain sudo[300880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:30 np0005541913.localdomain sudo[300880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300880]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (2) No such file or directory
Dec 02 09:57:30 np0005541913.localdomain sudo[300898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:30 np0005541913.localdomain sudo[300898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300898]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mon.np0005541914 (monmap changed)...
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/1130361570' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='client.34423 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541911", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='client.26982 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005541911"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: Remove daemons mon.np0005541911
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: Safe to remove mon.np0005541911: new quorum should be ['np0005541914', 'np0005541912'] (from ['np0005541914', 'np0005541912'])
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: Removing monitor np0005541911 from monmap...
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: Removing daemon mon.np0005541911 from np0005541911.localdomain -- ports []
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 calling monitor election
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 calling monitor election
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 is new leader, mons np0005541914,np0005541912 in quorum (ranks 0,1)
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: monmap epoch 12
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: last_changed 2025-12-02T09:57:29.744140+0000
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: min_mon_release 18 (reef)
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: election_strategy: 1
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: osdmap e89: 6 total, 6 up, 6 in
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: mgrmap e31: np0005541913.mfesdm(active, since 15s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:30 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 09:57:30 np0005541913.localdomain sudo[300916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:30 np0005541913.localdomain sudo[300916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300916]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:30 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (22) Invalid argument
Dec 02 09:57:30 np0005541913.localdomain sudo[300934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:30 np0005541913.localdomain sudo[300934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:30 np0005541913.localdomain sudo[300934]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:31 np0005541913.localdomain sudo[300968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:31 np0005541913.localdomain sudo[300968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:31 np0005541913.localdomain sudo[300968]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:31 np0005541913.localdomain sudo[300986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:31 np0005541913.localdomain sudo[300986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:31 np0005541913.localdomain sudo[300986]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:31 np0005541913.localdomain sudo[301004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:31 np0005541913.localdomain sudo[301004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:31 np0005541913.localdomain sudo[301004]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:31 np0005541913.localdomain sshd[300700]: Received disconnect from 103.42.181.150 port 59292:11:  [preauth]
Dec 02 09:57:31 np0005541913.localdomain sshd[300700]: Disconnected from authenticating user root 103.42.181.150 port 59292 [preauth]
Dec 02 09:57:31 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:31 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (22) Invalid argument
Dec 02 09:57:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:32.540 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:32 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 09:57:32 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:32 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (22) Invalid argument
Dec 02 09:57:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@-1(probing) e13  my rank is now 2 (was -1)
Dec 02 09:57:32 np0005541913.localdomain ceph-mon[298296]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:57:32 np0005541913.localdomain ceph-mon[298296]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 02 09:57:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:33 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:33 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (22) Invalid argument
Dec 02 09:57:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:33.797 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:57:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:57:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:57:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:57:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:57:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:57:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:57:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:57:34 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:34 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:34 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (22) Invalid argument
Dec 02 09:57:35 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:35 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (22) Invalid argument
Dec 02 09:57:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e13 handle_auth_request failed to assign global_id
Dec 02 09:57:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:57:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:57:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:57:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:57:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:57:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18714 "" "Go-http-client/1.1"
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev d755f4ef-61cb-47a4-8905-95367e9e015f (Updating mon deployment (+1 -> 4))
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.34428 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541911.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Removed label mon from host np0005541911.localdomain
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removed label mon from host np0005541911.localdomain
Dec 02 09:57:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:57:36 np0005541913.localdomain podman[301022]: 2025-12-02 09:57:36.448777346 +0000 UTC m=+0.087278286 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:57:36 np0005541913.localdomain podman[301022]: 2025-12-02 09:57:36.458024563 +0000 UTC m=+0.096525503 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 02 09:57:36 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:36 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (22) Invalid argument
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e13 handle_auth_request failed to assign global_id
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e13 handle_auth_request failed to assign global_id
Dec 02 09:57:37 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.34431 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541911.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Removed label mgr from host np0005541911.localdomain
Dec 02 09:57:37 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005541911.localdomain
Dec 02 09:57:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:37.580 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:37 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:37 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541913: (22) Invalid argument
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: Updating np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 calling monitor election
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 calling monitor election
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 is new leader, mons np0005541914,np0005541912 in quorum (ranks 0,1)
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: monmap epoch 13
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: last_changed 2025-12-02T09:57:30.836166+0000
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: min_mon_release 18 (reef)
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: election_strategy: 1
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: osdmap e89: 6 total, 6 up, 6 in
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mgrmap e31: np0005541913.mfesdm(active, since 21s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: Health check failed: 1/3 mons down, quorum np0005541914,np0005541912 (MON_DOWN)
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005541914,np0005541912
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005541914,np0005541912
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]:     mon.np0005541913 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum)
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: Deploying daemon mon.np0005541911 on np0005541911.localdomain
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='client.34428 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541911.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: Removed label mon from host np0005541911.localdomain
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e13 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e13 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:37 np0005541913.localdomain ceph-mon[298296]: mgrc update_daemon_metadata mon.np0005541913 metadata {addrs=[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005541913.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005541913.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Dec 02 09:57:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:57:38 np0005541913.localdomain podman[301040]: 2025-12-02 09:57:38.451422424 +0000 UTC m=+0.090621785 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e13 handle_auth_request failed to assign global_id
Dec 02 09:57:38 np0005541913.localdomain podman[301040]: 2025-12-02 09:57:38.468051169 +0000 UTC m=+0.107250540 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec 02 09:57:38 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev d755f4ef-61cb-47a4-8905-95367e9e015f (Updating mon deployment (+1 -> 4))
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event d755f4ef-61cb-47a4-8905-95367e9e015f (Updating mon deployment (+1 -> 4)) in 2 seconds
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913 calling monitor election
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 calling monitor election
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913 calling monitor election
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 calling monitor election
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 is new leader, mons np0005541914,np0005541912,np0005541913 in quorum (ranks 0,1,2)
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: monmap epoch 13
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: last_changed 2025-12-02T09:57:30.836166+0000
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: min_mon_release 18 (reef)
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: election_strategy: 1
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: osdmap e89: 6 total, 6 up, 6 in
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mgrmap e31: np0005541913.mfesdm(active, since 23s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005541914,np0005541912)
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: Cluster is now healthy
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev 06c59072-bd04-4a28-9ce6-b5cc50ffbfcc (Updating node-proxy deployment (+4 -> 4))
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev 06c59072-bd04-4a28-9ce6-b5cc50ffbfcc (Updating node-proxy deployment (+4 -> 4))
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event 06c59072-bd04-4a28-9ce6-b5cc50ffbfcc (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.44380 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541911.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Removed label _admin from host np0005541911.localdomain
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005541911.localdomain
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541913 172.18.0.107:0/3224144201; not ready for session (expect reconnect)
Dec 02 09:57:38 np0005541913.localdomain sudo[301060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:57:38 np0005541913.localdomain sudo[301060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:38 np0005541913.localdomain sudo[301060]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:38.800 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541911 172.18.0.105:0/480338482; not ready for session (expect reconnect)
Dec 02 09:57:38 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541911: (2) No such file or directory
Dec 02 09:57:39 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541911: (22) Invalid argument
Dec 02 09:57:39 np0005541913.localdomain ceph-mon[298296]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:57:39 np0005541913.localdomain ceph-mon[298296]: paxos.2).electionLogic(52) init, last seen epoch 52
Dec 02 09:57:39 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:57:39 np0005541913.localdomain podman[301078]: 2025-12-02 09:57:39.449227824 +0000 UTC m=+0.088895940 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:57:39 np0005541913.localdomain podman[301078]: 2025-12-02 09:57:39.458856113 +0000 UTC m=+0.098524159 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:57:39 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:57:39 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_report got status from non-daemon mon.np0005541913
Dec 02 09:57:39 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:39.756+0000 7f783c290640 -1 mgr.server handle_report got status from non-daemon mon.np0005541913
Dec 02 09:57:39 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541911 172.18.0.105:0/480338482; not ready for session (expect reconnect)
Dec 02 09:57:39 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541911: (22) Invalid argument
Dec 02 09:57:40 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Writing back 50 completed events
Dec 02 09:57:40 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:40 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541911 172.18.0.105:0/480338482; not ready for session (expect reconnect)
Dec 02 09:57:40 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541911: (22) Invalid argument
Dec 02 09:57:41 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541911 172.18.0.105:0/480338482; not ready for session (expect reconnect)
Dec 02 09:57:41 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541911: (22) Invalid argument
Dec 02 09:57:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:42.614 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:42 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:42 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541911 172.18.0.105:0/480338482; not ready for session (expect reconnect)
Dec 02 09:57:42 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541911: (22) Invalid argument
Dec 02 09:57:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:43.804 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:43 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541911 172.18.0.105:0/480338482; not ready for session (expect reconnect)
Dec 02 09:57:43 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541911: (22) Invalid argument
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: paxos.2).electionLogic(53) init, last seen epoch 53, mid-election, bumping
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='client.44380 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005541911.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: Removed label _admin from host np0005541911.localdomain
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 calling monitor election
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 calling monitor election
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913 calling monitor election
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541911 calling monitor election
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 is new leader, mons np0005541914,np0005541912,np0005541913,np0005541911 in quorum (ranks 0,1,2,3)
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: monmap epoch 14
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: last_changed 2025-12-02T09:57:38.994501+0000
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: min_mon_release 18 (reef)
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: election_strategy: 1
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: 3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: osdmap e89: 6 total, 6 up, 6 in
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: mgrmap e31: np0005541913.mfesdm(active, since 29s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Removing np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain sudo[301102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:57:44 np0005541913.localdomain sudo[301102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301102]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Removing np0005541911.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing np0005541911.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:44 np0005541913.localdomain sudo[301120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:57:44 np0005541913.localdomain sudo[301120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301120]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Removing np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:44 np0005541913.localdomain sudo[301138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:44 np0005541913.localdomain sudo[301138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301138]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain sudo[301156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:44 np0005541913.localdomain sudo[301156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301156]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain sudo[301174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:44 np0005541913.localdomain sudo[301174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301174]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:44 np0005541913.localdomain sudo[301208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:44 np0005541913.localdomain sudo[301208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301208]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain sudo[301226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:44 np0005541913.localdomain sudo[301226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301226]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain sudo[301244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain sudo[301244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301244]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:44 np0005541913.localdomain sudo[301262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:44 np0005541913.localdomain sudo[301262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:44 np0005541913.localdomain sudo[301262]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:44 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541911 172.18.0.105:0/480338482; not ready for session (expect reconnect)
Dec 02 09:57:45 np0005541913.localdomain sudo[301280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:45 np0005541913.localdomain sudo[301280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:45 np0005541913.localdomain sudo[301280]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] scanning for idle connections..
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] cleaning up connections: []
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] scanning for idle connections..
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] cleaning up connections: []
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] scanning for idle connections..
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] cleaning up connections: []
Dec 02 09:57:45 np0005541913.localdomain sudo[301298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:45 np0005541913.localdomain sudo[301298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:45 np0005541913.localdomain sudo[301298]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: Removing np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: Removing np0005541911.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: Removing np0005541911.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541911"} : dispatch
Dec 02 09:57:45 np0005541913.localdomain sudo[301316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:45 np0005541913.localdomain sudo[301316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:45 np0005541913.localdomain sudo[301316]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:45 np0005541913.localdomain sudo[301334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:45 np0005541913.localdomain sudo[301334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:45 np0005541913.localdomain sudo[301334]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:45 np0005541913.localdomain sudo[301368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:45 np0005541913.localdomain sudo[301368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:45 np0005541913.localdomain sudo[301368]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:45 np0005541913.localdomain sudo[301386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:45 np0005541913.localdomain sudo[301386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:57:45 np0005541913.localdomain sudo[301386]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:45 np0005541913.localdomain sudo[301405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:45 np0005541913.localdomain sudo[301405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:45 np0005541913.localdomain sudo[301405]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:45 np0005541913.localdomain podman[301404]: 2025-12-02 09:57:45.566820463 +0000 UTC m=+0.064228029 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 09:57:45 np0005541913.localdomain podman[301404]: 2025-12-02 09:57:45.580180851 +0000 UTC m=+0.077588467 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 09:57:45 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev 6600d02e-864d-4dee-93ce-68068586e2f3 (Updating mgr deployment (-1 -> 3))
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Removing daemon mgr.np0005541911.adcgiw from np0005541911.localdomain -- ports [8765]
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing daemon mgr.np0005541911.adcgiw from np0005541911.localdomain -- ports [8765]
Dec 02 09:57:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon).osd e89 _set_new_cache_sizes cache_size:1019640581 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:57:45 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_report got status from non-daemon mon.np0005541911
Dec 02 09:57:45 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:57:45.997+0000 7f783c290640 -1 mgr.server handle_report got status from non-daemon mon.np0005541911
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:46.628 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:46 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:46.890 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:46.891 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:57:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:46.891 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:57:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:57:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:57:47 np0005541913.localdomain podman[301440]: 2025-12-02 09:57:47.44832009 +0000 UTC m=+0.084539594 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 09:57:47 np0005541913.localdomain podman[301440]: 2025-12-02 09:57:47.457913786 +0000 UTC m=+0.094133310 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:57:47 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:57:47 np0005541913.localdomain ceph-mon[298296]: Removing daemon mgr.np0005541911.adcgiw from np0005541911.localdomain -- ports [8765]
Dec 02 09:57:47 np0005541913.localdomain podman[301441]: 2025-12-02 09:57:47.553526635 +0000 UTC m=+0.186653326 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 02 09:57:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:47.616 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:47 np0005541913.localdomain podman[301441]: 2025-12-02 09:57:47.617055245 +0000 UTC m=+0.250181996 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller)
Dec 02 09:57:47 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:57:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:47.677 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:57:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:47.677 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:57:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:47.678 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:57:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:47.678 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.np0005541911.adcgiw
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing key for mgr.np0005541911.adcgiw
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev 6600d02e-864d-4dee-93ce-68068586e2f3 (Updating mgr deployment (-1 -> 3))
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event 6600d02e-864d-4dee-93ce-68068586e2f3 (Updating mgr deployment (-1 -> 3)) in 2 seconds
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev cf2d4045-cc42-456b-be11-4a88f1bde21b (Updating mon deployment (-1 -> 3))
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005541911: new quorum should be ['np0005541914', 'np0005541912', 'np0005541913'] (from ['np0005541914', 'np0005541912', 'np0005541913'])
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005541911: new quorum should be ['np0005541914', 'np0005541912', 'np0005541913'] (from ['np0005541914', 'np0005541912', 'np0005541913'])
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005541911 from monmap...
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing monitor np0005541911 from monmap...
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005541911 from np0005541911.localdomain -- ports []
Dec 02 09:57:47 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005541911 from np0005541911.localdomain -- ports []
Dec 02 09:57:47 np0005541913.localdomain ceph-mon[298296]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:57:47 np0005541913.localdomain ceph-mon[298296]: paxos.2).electionLogic(56) init, last seen epoch 56
Dec 02 09:57:47 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:48 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:48.806 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:49 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Writing back 50 completed events
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.131 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.213 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.213 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.214 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.215 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.215 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.216 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.217 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:49.839 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:50.595 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:57:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:50.596 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:57:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:50.596 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:57:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:50.596 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:57:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:50.597 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:57:50 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e15 handle_auth_request failed to assign global_id
Dec 02 09:57:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e15 handle_auth_request failed to assign global_id
Dec 02 09:57:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e15 handle_auth_request failed to assign global_id
Dec 02 09:57:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e15 handle_auth_request failed to assign global_id
Dec 02 09:57:52 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:52.668 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:52 np0005541913.localdomain ceph-mon[298296]: paxos.2).electionLogic(57) init, last seen epoch 57, mid-election, bumping
Dec 02 09:57:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:57:52 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev cf2d4045-cc42-456b-be11-4a88f1bde21b (Updating mon deployment (-1 -> 3))
Dec 02 09:57:52 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event cf2d4045-cc42-456b-be11-4a88f1bde21b (Updating mon deployment (-1 -> 3)) in 5 seconds
Dec 02 09:57:53 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev 151b76ef-7ac5-4a24-9563-da569f49d4b6 (Updating node-proxy deployment (+4 -> 4))
Dec 02 09:57:53 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev 151b76ef-7ac5-4a24-9563-da569f49d4b6 (Updating node-proxy deployment (+4 -> 4))
Dec 02 09:57:53 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event 151b76ef-7ac5-4a24-9563-da569f49d4b6 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: Removing key for mgr.np0005541911.adcgiw
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: Safe to remove mon.np0005541911: new quorum should be ['np0005541914', 'np0005541912', 'np0005541913'] (from ['np0005541914', 'np0005541912', 'np0005541913'])
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: Removing monitor np0005541911 from monmap...
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: Removing daemon mon.np0005541911 from np0005541911.localdomain -- ports []
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 calling monitor election
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 calling monitor election
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913 calling monitor election
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 is new leader, mons np0005541914,np0005541912 in quorum (ranks 0,1)
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 calling monitor election
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 calling monitor election
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 is new leader, mons np0005541914,np0005541912,np0005541913 in quorum (ranks 0,1,2)
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: monmap epoch 15
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: last_changed 2025-12-02T09:57:47.906570+0000
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: min_mon_release 18 (reef)
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: election_strategy: 1
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: osdmap e89: 6 total, 6 up, 6 in
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: mgrmap e31: np0005541913.mfesdm(active, since 38s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:53 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.54101 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005541911.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:53 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Added label _no_schedule to host np0005541911.localdomain
Dec 02 09:57:53 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005541911.localdomain
Dec 02 09:57:53 np0005541913.localdomain sudo[301500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:57:53 np0005541913.localdomain sudo[301500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:53 np0005541913.localdomain sudo[301500]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:53 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005541911.localdomain
Dec 02 09:57:53 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005541911.localdomain
Dec 02 09:57:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:53.832 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: from='client.54101 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005541911.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: Added label _no_schedule to host np0005541911.localdomain
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005541911.localdomain
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:57:54 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2425549797' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.065 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.128 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.129 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.350 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.352 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11678MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.353 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.354 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:57:54 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.54116 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005541911.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.627 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.628 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.628 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:57:54 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.690 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 09:57:54 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:54 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:54 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:54 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:54 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:54 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.743 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.744 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.756 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.774 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_ABM,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 09:57:54 np0005541913.localdomain sudo[301529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:57:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:54.810 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:57:54 np0005541913.localdomain sudo[301529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:54 np0005541913.localdomain sudo[301529]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:54 np0005541913.localdomain sudo[301548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:57:54 np0005541913.localdomain sudo[301548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:54 np0005541913.localdomain sudo[301548]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:54 np0005541913.localdomain sudo[301566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:54 np0005541913.localdomain sudo[301566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:54 np0005541913.localdomain sudo[301566]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain sudo[301603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:55 np0005541913.localdomain sudo[301603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301603]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2425549797' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3063913601' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:55 np0005541913.localdomain sudo[301621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:55 np0005541913.localdomain sudo[301621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301621]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain sudo[301655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:55 np0005541913.localdomain sudo[301655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301655]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1003759709' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.268 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.274 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.292 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.295 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.296 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.942s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.296 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.296 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.313 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.313 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:55.313 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 02 09:57:55 np0005541913.localdomain sudo[301674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:57:55 np0005541913.localdomain sudo[301674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301674]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:55 np0005541913.localdomain sudo[301693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:57:55 np0005541913.localdomain sudo[301693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301693]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:55 np0005541913.localdomain sudo[301711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:55 np0005541913.localdomain sudo[301711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301711]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain sudo[301729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:57:55 np0005541913.localdomain sudo[301729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301729]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain sudo[301747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:55 np0005541913.localdomain sudo[301747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301747]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain sudo[301765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:57:55 np0005541913.localdomain sudo[301765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301765]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon).osd e89 _set_new_cache_sizes cache_size:1020047553 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:57:55 np0005541913.localdomain sudo[301783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:55 np0005541913.localdomain sudo[301783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301783]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.34455 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005541911.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:55 np0005541913.localdomain sudo[301817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:55 np0005541913.localdomain sudo[301817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:55 np0005541913.localdomain sudo[301817]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Removed host np0005541911.localdomain
Dec 02 09:57:55 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removed host np0005541911.localdomain
Dec 02 09:57:56 np0005541913.localdomain sudo[301835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:57:56 np0005541913.localdomain sudo[301835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:56 np0005541913.localdomain sudo[301835]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='client.54116 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005541911.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1003759709' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3093024393' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541911.localdomain"} : dispatch
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541911.localdomain"} : dispatch
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541911.localdomain"}]': finished
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:56 np0005541913.localdomain sudo[301853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:57:56 np0005541913.localdomain sudo[301853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:56 np0005541913.localdomain sudo[301853]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:56 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev d67aba63-6539-453f-9253-9dfdf27f5a23 (Updating node-proxy deployment (+3 -> 3))
Dec 02 09:57:56 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev d67aba63-6539-453f-9253-9dfdf27f5a23 (Updating node-proxy deployment (+3 -> 3))
Dec 02 09:57:56 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event d67aba63-6539-453f-9253-9dfdf27f5a23 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 02 09:57:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:56.313 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:56.314 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:56.314 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:56.315 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:57:56 np0005541913.localdomain sudo[301871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:57:56 np0005541913.localdomain sudo[301871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:57:56 np0005541913.localdomain sudo[301871]: pam_unix(sudo:session): session closed for user root
Dec 02 09:57:56 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:57:56 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:57:56 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:57:56 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:57:56 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='client.34455 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005541911.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: Removed host np0005541911.localdomain
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:57:57 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/464441794' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:57 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 02 09:57:57 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 02 09:57:57 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:57:57 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:57:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:57.713 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:57 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Writing back 50 completed events
Dec 02 09:57:58 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 02 09:57:58 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 02 09:57:58 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:57:58 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.2 (monmap changed)...
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2656442173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:57:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:57:58 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:57:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:57:58.877 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:57:59 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.5 (monmap changed)...
Dec 02 09:57:59 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:57:59 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:57:59 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:57:59 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:57:59 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:58:00 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:00 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:00 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:00 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:58:00 np0005541913.localdomain podman[301889]: 2025-12-02 09:58:00.438867668 +0000 UTC m=+0.075978274 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:00 np0005541913.localdomain podman[301889]: 2025-12-02 09:58:00.455134374 +0000 UTC m=+0.092245030 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 02 09:58:00 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:58:00 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054604 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:01 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:01 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:01 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:58:01 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:58:01 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:58:01 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:58:02 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:02 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:02 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:02 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:02 np0005541913.localdomain sudo[301908]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:02 np0005541913.localdomain sudo[301908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:02 np0005541913.localdomain sudo[301908]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:02 np0005541913.localdomain sudo[301926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:02 np0005541913.localdomain sudo[301926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:02 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:02.767 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:02 np0005541913.localdomain podman[301962]: 
Dec 02 09:58:03 np0005541913.localdomain podman[301962]: 2025-12-02 09:58:03.012631938 +0000 UTC m=+0.085710584 container create 177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_nightingale, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=1763362218, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:58:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:58:03.042 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:58:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:58:03.043 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:58:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:58:03.043 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:58:03 np0005541913.localdomain systemd[1]: Started libpod-conmon-177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21.scope.
Dec 02 09:58:03 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:03 np0005541913.localdomain podman[301962]: 2025-12-02 09:58:03.071648388 +0000 UTC m=+0.144727004 container init 177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_nightingale, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, architecture=x86_64, release=1763362218, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, ceph=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main)
Dec 02 09:58:03 np0005541913.localdomain podman[301962]: 2025-12-02 09:58:02.975034562 +0000 UTC m=+0.048113228 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:03 np0005541913.localdomain systemd[1]: tmp-crun.vbUCFm.mount: Deactivated successfully.
Dec 02 09:58:03 np0005541913.localdomain podman[301962]: 2025-12-02 09:58:03.080571257 +0000 UTC m=+0.153649893 container start 177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_nightingale, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, ceph=True, io.buildah.version=1.41.4, name=rhceph, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:58:03 np0005541913.localdomain podman[301962]: 2025-12-02 09:58:03.080815283 +0000 UTC m=+0.153893919 container attach 177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_nightingale, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:58:03 np0005541913.localdomain systemd[1]: libpod-177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21.scope: Deactivated successfully.
Dec 02 09:58:03 np0005541913.localdomain sad_nightingale[301977]: 167 167
Dec 02 09:58:03 np0005541913.localdomain podman[301962]: 2025-12-02 09:58:03.083269939 +0000 UTC m=+0.156348555 container died 177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_nightingale, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64, distribution-scope=public, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git)
Dec 02 09:58:03 np0005541913.localdomain podman[301982]: 2025-12-02 09:58:03.165765316 +0000 UTC m=+0.076045986 container remove 177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_nightingale, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.41.4, name=rhceph, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Dec 02 09:58:03 np0005541913.localdomain systemd[1]: libpod-conmon-177ccc4e336d401093607b1140c85bb334cbbad92533297013beb877093cbc21.scope: Deactivated successfully.
Dec 02 09:58:03 np0005541913.localdomain sudo[301926]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:03 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:03 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:03 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:03 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:03 np0005541913.localdomain sudo[301998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:03 np0005541913.localdomain sudo[301998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:03 np0005541913.localdomain sudo[301998]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:03 np0005541913.localdomain sudo[302016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:03 np0005541913.localdomain sudo[302016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:03 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:03 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:03 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:03 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:03 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:58:03 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:03 np0005541913.localdomain podman[302050]: 
Dec 02 09:58:03 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.34458 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:03 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Saving service mon spec with placement label:mon
Dec 02 09:58:03 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Dec 02 09:58:03 np0005541913.localdomain podman[302050]: 2025-12-02 09:58:03.860470636 +0000 UTC m=+0.061655741 container create dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_clarke, RELEASE=main, release=1763362218, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, name=rhceph)
Dec 02 09:58:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:03.911 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:03 np0005541913.localdomain systemd[1]: Started libpod-conmon-dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b.scope.
Dec 02 09:58:03 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:03 np0005541913.localdomain podman[302050]: 2025-12-02 09:58:03.842709171 +0000 UTC m=+0.043894286 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:03 np0005541913.localdomain podman[302050]: 2025-12-02 09:58:03.955829057 +0000 UTC m=+0.157014152 container init dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_clarke, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, name=rhceph, GIT_CLEAN=True)
Dec 02 09:58:03 np0005541913.localdomain podman[302050]: 2025-12-02 09:58:03.963602095 +0000 UTC m=+0.164787220 container start dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_clarke, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, RELEASE=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git)
Dec 02 09:58:03 np0005541913.localdomain compassionate_clarke[302065]: 167 167
Dec 02 09:58:03 np0005541913.localdomain podman[302050]: 2025-12-02 09:58:03.965416074 +0000 UTC m=+0.166601229 container attach dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_clarke, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.41.4, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1763362218, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Dec 02 09:58:03 np0005541913.localdomain systemd[1]: libpod-dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b.scope: Deactivated successfully.
Dec 02 09:58:03 np0005541913.localdomain podman[302050]: 2025-12-02 09:58:03.967312074 +0000 UTC m=+0.168497209 container died dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_clarke, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, version=7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph)
Dec 02 09:58:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bd9d6e487cb31c04755d1ff5516e54b619ec9e78adb0e55268308706e703e260-merged.mount: Deactivated successfully.
Dec 02 09:58:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6a5b75bee3978808b04bcfe9431e5596a747ae3099fb3ef6c2978ab18db9e2ed-merged.mount: Deactivated successfully.
Dec 02 09:58:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:58:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:58:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:58:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:58:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:58:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:58:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:58:04 np0005541913.localdomain podman[302070]: 2025-12-02 09:58:04.070635639 +0000 UTC m=+0.094391457 container remove dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_clarke, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.expose-services=)
Dec 02 09:58:04 np0005541913.localdomain systemd[1]: libpod-conmon-dbbf4e9bcf1f79f2fe2aa09bf07261ca44d6be15391406d489322dfe9146805b.scope: Deactivated successfully.
Dec 02 09:58:04 np0005541913.localdomain sudo[302016]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:04 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Dec 02 09:58:04 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Dec 02 09:58:04 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:58:04 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/486568655' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/486568655' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 09:58:04 np0005541913.localdomain sudo[302093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:04 np0005541913.localdomain sudo[302093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:04 np0005541913.localdomain sudo[302093]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:04 np0005541913.localdomain sudo[302111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:04 np0005541913.localdomain sudo[302111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/486568655' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/486568655' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 09:58:04 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.641151) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669484641203, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1882, "num_deletes": 283, "total_data_size": 4743595, "memory_usage": 4961392, "flush_reason": "Manual Compaction"}
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669484668156, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3555029, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11598, "largest_seqno": 13479, "table_properties": {"data_size": 3546648, "index_size": 4630, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 23990, "raw_average_key_size": 22, "raw_value_size": 3527072, "raw_average_value_size": 3296, "num_data_blocks": 194, "num_entries": 1070, "num_filter_entries": 1070, "num_deletions": 267, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669438, "oldest_key_time": 1764669438, "file_creation_time": 1764669484, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 27059 microseconds, and 9071 cpu microseconds.
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.668208) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3555029 bytes OK
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.668236) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.670200) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.670223) EVENT_LOG_v1 {"time_micros": 1764669484670216, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.670249) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 4732778, prev total WAL file size 4732778, number of live WAL files 2.
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.671357) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323839' seq:72057594037927935, type:22 .. '6B760031353530' seq:0, type:0; will stop at (end)
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3471KB)], [15(15MB)]
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669484671440, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 19655172, "oldest_snapshot_seqno": -1}
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 11403 keys, 18677977 bytes, temperature: kUnknown
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669484793801, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 18677977, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18609805, "index_size": 38567, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28549, "raw_key_size": 305959, "raw_average_key_size": 26, "raw_value_size": 18411853, "raw_average_value_size": 1614, "num_data_blocks": 1472, "num_entries": 11403, "num_filter_entries": 11403, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764669484, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.794145) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 18677977 bytes
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.796108) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.5 rd, 152.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 15.4 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(10.8) write-amplify(5.3) OK, records in: 11916, records dropped: 513 output_compression: NoCompression
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.796132) EVENT_LOG_v1 {"time_micros": 1764669484796120, "job": 6, "event": "compaction_finished", "compaction_time_micros": 122477, "compaction_time_cpu_micros": 49487, "output_level": 6, "num_output_files": 1, "total_output_size": 18677977, "num_input_records": 11916, "num_output_records": 11403, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669484796731, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669484798363, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.671218) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.798460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.798470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.798473) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.798476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:04 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:04.798478) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:04 np0005541913.localdomain podman[302145]: 
Dec 02 09:58:04 np0005541913.localdomain podman[302145]: 2025-12-02 09:58:04.928981798 +0000 UTC m=+0.080847625 container create 934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_booth, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Dec 02 09:58:04 np0005541913.localdomain systemd[1]: Started libpod-conmon-934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311.scope.
Dec 02 09:58:04 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:04 np0005541913.localdomain podman[302145]: 2025-12-02 09:58:04.99332361 +0000 UTC m=+0.145189367 container init 934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_booth, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-11-26T19:44:28Z)
Dec 02 09:58:04 np0005541913.localdomain podman[302145]: 2025-12-02 09:58:04.896688653 +0000 UTC m=+0.048554470 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:05 np0005541913.localdomain podman[302145]: 2025-12-02 09:58:05.002103755 +0000 UTC m=+0.153969512 container start 934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_booth, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main)
Dec 02 09:58:05 np0005541913.localdomain podman[302145]: 2025-12-02 09:58:05.002255559 +0000 UTC m=+0.154121316 container attach 934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_booth, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.component=rhceph-container, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, io.buildah.version=1.41.4, distribution-scope=public, GIT_CLEAN=True, vcs-type=git)
Dec 02 09:58:05 np0005541913.localdomain elated_booth[302159]: 167 167
Dec 02 09:58:05 np0005541913.localdomain systemd[1]: libpod-934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311.scope: Deactivated successfully.
Dec 02 09:58:05 np0005541913.localdomain podman[302145]: 2025-12-02 09:58:05.006883692 +0000 UTC m=+0.158749549 container died 934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_booth, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, architecture=x86_64, ceph=True, release=1763362218, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, name=rhceph, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main)
Dec 02 09:58:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c33e50fd7f93b8f94a634a4ca052d5b58d0a040d2bd768832d781517f963fa6a-merged.mount: Deactivated successfully.
Dec 02 09:58:05 np0005541913.localdomain podman[302164]: 2025-12-02 09:58:05.109457967 +0000 UTC m=+0.088416377 container remove 934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_booth, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git)
Dec 02 09:58:05 np0005541913.localdomain systemd[1]: libpod-conmon-934816128e30bbff845bac51c03aee07ad177a04c6c089934ffdc11b2214d311.scope: Deactivated successfully.
Dec 02 09:58:05 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.54149 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541914", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:58:05 np0005541913.localdomain sudo[302111]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:05 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:58:05 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:58:05 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:58:05 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: from='client.34458 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: Saving service mon spec with placement label:mon
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.3 (monmap changed)...
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:05 np0005541913.localdomain sudo[302190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:05 np0005541913.localdomain sudo[302190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:05 np0005541913.localdomain sudo[302190]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:05 np0005541913.localdomain sudo[302208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:05 np0005541913.localdomain sudo[302208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:58:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:58:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:58:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:58:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:58:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18717 "" "Go-http-client/1.1"
Dec 02 09:58:06 np0005541913.localdomain podman[302243]: 
Dec 02 09:58:06 np0005541913.localdomain podman[302243]: 2025-12-02 09:58:06.147214315 +0000 UTC m=+0.143420458 container create e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_khayyam, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=)
Dec 02 09:58:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f.scope.
Dec 02 09:58:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:06 np0005541913.localdomain podman[302243]: 2025-12-02 09:58:06.115164488 +0000 UTC m=+0.111370671 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:06 np0005541913.localdomain podman[302243]: 2025-12-02 09:58:06.221714829 +0000 UTC m=+0.217920992 container init e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_khayyam, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=)
Dec 02 09:58:06 np0005541913.localdomain podman[302243]: 2025-12-02 09:58:06.232540089 +0000 UTC m=+0.228746222 container start e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_khayyam, GIT_BRANCH=main, RELEASE=main, build-date=2025-11-26T19:44:28Z, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 02 09:58:06 np0005541913.localdomain podman[302243]: 2025-12-02 09:58:06.232890678 +0000 UTC m=+0.229096811 container attach e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_khayyam, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, RELEASE=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True)
Dec 02 09:58:06 np0005541913.localdomain recursing_khayyam[302258]: 167 167
Dec 02 09:58:06 np0005541913.localdomain systemd[1]: libpod-e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f.scope: Deactivated successfully.
Dec 02 09:58:06 np0005541913.localdomain podman[302243]: 2025-12-02 09:58:06.237715518 +0000 UTC m=+0.233921651 container died e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_khayyam, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Dec 02 09:58:06 np0005541913.localdomain podman[302263]: 2025-12-02 09:58:06.345582224 +0000 UTC m=+0.093408651 container remove e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_khayyam, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, architecture=x86_64, build-date=2025-11-26T19:44:28Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:58:06 np0005541913.localdomain systemd[1]: libpod-conmon-e5f9cd4f03f99397d6af1454563a2adea60726b577981d31ac2b2fce09a12e6f.scope: Deactivated successfully.
Dec 02 09:58:06 np0005541913.localdomain sudo[302208]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:58:06 np0005541913.localdomain sudo[302279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: from='client.54149 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541914", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:06 np0005541913.localdomain sudo[302279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:58:06 np0005541913.localdomain sudo[302279]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:06 np0005541913.localdomain sudo[302298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:06 np0005541913.localdomain sudo[302298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:06 np0005541913.localdomain podman[302296]: 2025-12-02 09:58:06.675082161 +0000 UTC m=+0.100578712 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:58:06 np0005541913.localdomain podman[302296]: 2025-12-02 09:58:06.712230085 +0000 UTC m=+0.137726596 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 09:58:06 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.44434 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005541914"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Remove daemons mon.np0005541914
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005541914
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005541914: new quorum should be ['np0005541912', 'np0005541913'] (from ['np0005541912', 'np0005541913'])
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005541914: new quorum should be ['np0005541912', 'np0005541913'] (from ['np0005541912', 'np0005541913'])
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005541914 from monmap...
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing monitor np0005541914 from monmap...
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005541914 from np0005541914.localdomain -- ports []
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005541914 from np0005541914.localdomain -- ports []
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@2(peon) e16  my rank is now 1 (was 2)
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: client.44339 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 02 09:58:06 np0005541913.localdomain ceph-mgr[288059]: client.26949 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(probing) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541912"} v 0)
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: paxos.1).electionLogic(62) init, last seen epoch 62
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e16 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e16 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541913"} v 0)
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:58:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: from='client.44434 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005541914"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: Remove daemons mon.np0005541914
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "quorum_status"} : dispatch
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon rm", "name": "np0005541914"} : dispatch
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: Safe to remove mon.np0005541914: new quorum should be ['np0005541912', 'np0005541913'] (from ['np0005541912', 'np0005541913'])
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: Removing monitor np0005541914 from monmap...
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: Removing daemon mon.np0005541914 from np0005541914.localdomain -- ports []
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 calling monitor election
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913 calling monitor election
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 is new leader, mons np0005541912,np0005541913 in quorum (ranks 0,1)
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: monmap epoch 16
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: last_changed 2025-12-02T09:58:06.915003+0000
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: min_mon_release 18 (reef)
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: election_strategy: 1
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: osdmap e89: 6 total, 6 up, 6 in
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: mgrmap e31: np0005541913.mfesdm(active, since 52s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 09:58:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eb82e691c46520bece461b33e7dd2f630c337784f48462c20df3075c80f0bed0-merged.mount: Deactivated successfully.
Dec 02 09:58:07 np0005541913.localdomain podman[302351]: 2025-12-02 09:58:07.11942062 +0000 UTC m=+0.049910295 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:07 np0005541913.localdomain podman[302351]: 
Dec 02 09:58:07 np0005541913.localdomain podman[302351]: 2025-12-02 09:58:07.319788563 +0000 UTC m=+0.250278178 container create b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_driscoll, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 02 09:58:07 np0005541913.localdomain systemd[1]: Started libpod-conmon-b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7.scope.
Dec 02 09:58:07 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:07 np0005541913.localdomain podman[302351]: 2025-12-02 09:58:07.37650874 +0000 UTC m=+0.306998365 container init b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_driscoll, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1763362218, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main)
Dec 02 09:58:07 np0005541913.localdomain podman[302351]: 2025-12-02 09:58:07.386494818 +0000 UTC m=+0.316984443 container start b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_driscoll, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, release=1763362218, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph)
Dec 02 09:58:07 np0005541913.localdomain podman[302351]: 2025-12-02 09:58:07.386701383 +0000 UTC m=+0.317191008 container attach b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_driscoll, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Dec 02 09:58:07 np0005541913.localdomain wonderful_driscoll[302365]: 167 167
Dec 02 09:58:07 np0005541913.localdomain systemd[1]: libpod-b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7.scope: Deactivated successfully.
Dec 02 09:58:07 np0005541913.localdomain podman[302351]: 2025-12-02 09:58:07.392893449 +0000 UTC m=+0.323383064 container died b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_driscoll, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:58:07 np0005541913.localdomain podman[302370]: 2025-12-02 09:58:07.48486495 +0000 UTC m=+0.079971992 container remove b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_driscoll, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1763362218, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z)
Dec 02 09:58:07 np0005541913.localdomain systemd[1]: libpod-conmon-b8490e4f98024b09e7349bb7c81fe786447b4dd6621857380c291c518497ecd7.scope: Deactivated successfully.
Dec 02 09:58:07 np0005541913.localdomain sudo[302298]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:07 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:07 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:07 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:07 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:58:07 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:58:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:07.804 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:08 np0005541913.localdomain systemd[1]: tmp-crun.pogVUv.mount: Deactivated successfully.
Dec 02 09:58:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-3b3ca6abf0417dfd1a39b7a7b95701977844c5656b6c733d8017eeb759194dbf-merged.mount: Deactivated successfully.
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:08 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:08 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Dec 02 09:58:08 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:08 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:58:08 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:58:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:08.956 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:58:09 np0005541913.localdomain podman[302387]: 2025-12-02 09:58:09.169950681 +0000 UTC m=+0.084852001 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public)
Dec 02 09:58:09 np0005541913.localdomain podman[302387]: 2025-12-02 09:58:09.189813134 +0000 UTC m=+0.104714464 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 09:58:09 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.1 (monmap changed)...
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:09 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Dec 02 09:58:09 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:09 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:09 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:58:09 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:58:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:58:10 np0005541913.localdomain podman[302407]: 2025-12-02 09:58:10.446010777 +0000 UTC m=+0.083070694 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:58:10 np0005541913.localdomain podman[302407]: 2025-12-02 09:58:10.458971494 +0000 UTC m=+0.096031411 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:58:10 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:10 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:58:10 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:10 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:58:10 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:58:10 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.4 (monmap changed)...
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:11 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:58:11 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:58:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 02 09:58:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 02 09:58:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:11 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:58:11 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:12 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:12.877 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:13 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 02 09:58:13 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:58:13 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:13 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:13 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:13 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:13 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:13 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:13 np0005541913.localdomain sudo[302430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:58:13 np0005541913.localdomain sudo[302430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:13 np0005541913.localdomain sudo[302430]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:13 np0005541913.localdomain sudo[302448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:58:13 np0005541913.localdomain sudo[302448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:13 np0005541913.localdomain sudo[302448]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:13 np0005541913.localdomain sudo[302466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:58:13 np0005541913.localdomain sudo[302466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:13 np0005541913.localdomain sudo[302466]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:13 np0005541913.localdomain sudo[302484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:13 np0005541913.localdomain sudo[302484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:13 np0005541913.localdomain sudo[302484]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:13.981 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:14 np0005541913.localdomain sudo[302502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:58:14 np0005541913.localdomain sudo[302502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302502]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain sudo[302536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:58:14 np0005541913.localdomain sudo[302536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302536]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain sudo[302554]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:58:14 np0005541913.localdomain sudo[302554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302554]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:14 np0005541913.localdomain sudo[302572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:58:14 np0005541913.localdomain sudo[302572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302572]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:14 np0005541913.localdomain sudo[302590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:58:14 np0005541913.localdomain sudo[302590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302590]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain sudo[302608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:58:14 np0005541913.localdomain sudo[302608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302608]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain sudo[302626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:58:14 np0005541913.localdomain sudo[302626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302626]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain sudo[302644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:14 np0005541913.localdomain sudo[302644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302644]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] Optimize plan auto_2025-12-02_09:58:14
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] do_upmap
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] pools ['images', '.mgr', 'manila_metadata', 'vms', 'manila_data', 'volumes', 'backups']
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: [balancer INFO root] prepared 0/10 changes
Dec 02 09:58:14 np0005541913.localdomain sudo[302662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:58:14 np0005541913.localdomain sudo[302662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:14 np0005541913.localdomain sudo[302662]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain sudo[302696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:58:14 np0005541913.localdomain sudo[302696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302696]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain sudo[302714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:58:14 np0005541913.localdomain sudo[302714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302714]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:14 np0005541913.localdomain sudo[302732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:14 np0005541913.localdomain sudo[302732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:14 np0005541913.localdomain sudo[302732]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] _maybe_adjust
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0033244564838079286 of space, bias 1.0, pg target 0.6648912967615858 quantized to 32 (current 32)
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.1810441094360693e-06 of space, bias 4.0, pg target 0.001741927228736274 quantized to 16 (current 16)
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] scanning for idle connections..
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] cleaning up connections: []
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] scanning for idle connections..
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] cleaning up connections: []
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] scanning for idle connections..
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [volumes INFO mgr_util] cleaning up connections: []
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev e3621e36-1451-4294-bcba-377dbfd809b2 (Updating node-proxy deployment (+3 -> 3))
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev e3621e36-1451-4294-bcba-377dbfd809b2 (Updating node-proxy deployment (+3 -> 3))
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event e3621e36-1451-4294-bcba-377dbfd809b2 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:58:15 np0005541913.localdomain sudo[302750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:58:15 np0005541913.localdomain sudo[302750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:58:15 np0005541913.localdomain sudo[302750]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:15 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:58:15 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:58:15 np0005541913.localdomain podman[302768]: 2025-12-02 09:58:15.800528605 +0000 UTC m=+0.086087153 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:58:15 np0005541913.localdomain podman[302768]: 2025-12-02 09:58:15.81899137 +0000 UTC m=+0.104549938 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 09:58:15 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.104 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.136 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.137 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a6458d9-88ba-458b-9149-431f1bf926d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.105820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c2325c4-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': 'b46e31a5b3663171f380b99df0e98f1138a8a54a4fff4b93a0941cdb9da02a54'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.105820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c233d48-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': '8014f02d9f821a9c491bdd689599d057764c955b2d72d4874447710b4d14b49b'}]}, 'timestamp': '2025-12-02 09:58:16.138303', '_unique_id': '4f4a8828cb9a467bb4e14622c64f7771'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.139 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.146 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ec9a029-4200-412f-af18-127d6bb91dc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.141284', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c2495e4-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': 'c552f2ce375524dee2acd5284e1109db1763988e0f26fcd330cfc3f85546c609'}]}, 'timestamp': '2025-12-02 09:58:16.147176', '_unique_id': '777f7b92af4d4c23ad48b5077dd6020d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.149 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.149 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bd12528-eed9-440f-b690-5a8362836d54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.149861', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c25135c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': '099f28fc0260231bbc43f5e29d822301e32bb258153d67312e5973a5a8b4eb3c'}]}, 'timestamp': '2025-12-02 09:58:16.150351', '_unique_id': '35b4af476df04ca189cf0f9eb36b5bee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.151 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.152 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.153 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb2b7538-0df1-4e3f-b3b9-71c0aebc2788', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.152809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c258602-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': 'b0e51e21c83046b7dbc1da17eb672868e50b89940136f97dd315159c789f85eb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.152809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c2595e8-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': '81d3e2a5527d8cb07e9aa4ed903cf91fb31c58ca8395586b2505415d00f2810f'}]}, 'timestamp': '2025-12-02 09:58:16.153707', '_unique_id': 'dbf22671b9e744aa8e29657ee94ecb4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.167 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.168 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96f3876c-b006-4ba0-ad4a-767a842e90a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.156494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c27c82c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.375581492, 'message_signature': 'f3e2efc5190d06d75bdb3ef43a931a872aad0dff31a056ef4269f4913d7c1f5a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.156494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c27d998-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.375581492, 'message_signature': 'd77d9edb32a2168ada945ae9ebd944616141fb3ae6725d14d3778bac2d122b53'}]}, 'timestamp': '2025-12-02 09:58:16.168498', '_unique_id': '816ec79913204a3692e63154c3b7d375'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.170 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ca412db-ce3e-4116-b99b-64e96af4cb97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.170781', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c284482-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': '4dd2810405d20b99e7f233d8c356e56a4a28b70549db75b8319f90cba2915ca4'}]}, 'timestamp': '2025-12-02 09:58:16.171306', '_unique_id': '237c7417d050472d8a47c53f1d46a7d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.173 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.191 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '507ca41d-92ad-4756-9886-35c2787f0e81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:58:16.173494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6c2b6194-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.410034845, 'message_signature': '72802dd934055397578692510bca27d4041dfc0ba76b3d0b49128af5c087a71f'}]}, 'timestamp': '2025-12-02 09:58:16.191698', '_unique_id': 'a0b993f6ad474616b30c8cfeb6452c97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '298f19f2-371a-45a9-8772-70c3e3b347b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.194576', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c2be77c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': '48f3ecf8c47d68155da838a8b505f5f9140e66a2f01fd24584ad3fc8810aff3f'}]}, 'timestamp': '2025-12-02 09:58:16.195374', '_unique_id': '4e4ba8609bef4558ab77236ee14cf985'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.197 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e0235d8-f5da-40b5-944d-e67d9eabd97e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.197566', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c2c5c52-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': '19be324fe464ff250467d199bc0ede06b0ea369472aef0132b27f25e2f0958ac'}]}, 'timestamp': '2025-12-02 09:58:16.198080', '_unique_id': '79e9d8e89f994659ace165be72c81246'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.200 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fc1463f-71e8-40f6-b49e-a5ca7ec4fe6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.200357', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c2cc714-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': 'bb9d38f5f86e4c6538b910516d63f35e31d6fba8f81d540ea6deac6aaeb3a58e'}]}, 'timestamp': '2025-12-02 09:58:16.200840', '_unique_id': 'f99dfa9184114834b53293a5f40e7bd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.202 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.202 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 14590000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e68f05a0-a752-45f6-9397-c842123a1f96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14590000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T09:58:16.202937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6c2d2d30-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.410034845, 'message_signature': '47428d6bb6db1bb65cf762a32614bd8c81eca71f27991b56e77034463f05e704'}]}, 'timestamp': '2025-12-02 09:58:16.203410', '_unique_id': 'cf4d4964cf78454a837e198e8fd46aa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.204 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55080ed0-c558-4f8e-aab8-35f82fcfe8c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.204870', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c2d7a88-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': 'f07e8a5c5593d2af49187c9580ba004166daa2c5dc97436c77913bfa3cd2d160'}]}, 'timestamp': '2025-12-02 09:58:16.205316', '_unique_id': '30acf3002da942b38805a60222317b44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61946cb6-2b44-4c5e-b0ee-0b8275d0bf42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.206590', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c2db868-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': '6635af5ca6b9933d99a3272d10fae66658b4920b4223008e48b281a8b9444885'}]}, 'timestamp': '2025-12-02 09:58:16.206900', '_unique_id': '7cd1241fcee14c3fb14fc5e0f0c03b9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.208 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9654b410-02f8-4bbc-9b02-20739e2c7c10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.208189', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c2df5b2-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': '1688479c048809062f5234faa0bef87b033c6dc5593fc51f79e5cf969c032174'}]}, 'timestamp': '2025-12-02 09:58:16.208465', '_unique_id': 'b7e83225227e4b0da58fd66c394691ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d810936-eb54-461e-b6d3-3571b7591119', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.209819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c2e3522-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.375581492, 'message_signature': 'ce7b0aab59e1e8656a4b1892afc97c1d0d4d2e436f53d3bc3951f05190b03af6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.209819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c2e3e82-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.375581492, 'message_signature': '59fb1f8cc89e8467f62d5ea05c1be75747abc2f9b9f302a521f7781648f08eb3'}]}, 'timestamp': '2025-12-02 09:58:16.210311', '_unique_id': '57a0a3aa4c89481493809c2d853eef55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.211 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b69c64c2-5392-4c4a-922f-84a5d161b735', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T09:58:16.211607', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '6c2e7c08-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.360381316, 'message_signature': '026767cde7c5c83475b2908c1004e7a50bf77803110eaf6dad0d6f84a108b20c'}]}, 'timestamp': '2025-12-02 09:58:16.211910', '_unique_id': '2c91fe62ae2247eeb4ff5f8ba1b0549a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.213 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d626e27-42ab-4e58-869a-64135d5ec3ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.213144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c2eb70e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': 'ef7b10423c4abc8320372d76636bb5ae7a6b5cd5f6a55beb87bb8ee27cab9d8b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.213144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c2ec06e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': 'a6405f00fdf831b964f03d0461b03c0e83d734ef17623f9bdabf58ca4c2daeb6'}]}, 'timestamp': '2025-12-02 09:58:16.213661', '_unique_id': '3c0677731c774e7a8b2f85f025bb107f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.214 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f4a1f80-493f-40af-a552-023b17ceeab2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.214932', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c2efd72-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': '29b243fc3de3276ca1f208f1a48c26bd7d62f5a2241ddc421f40cfe07f4b9cce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.214932', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c2f06f0-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': 'f7ff16a26245c52eb41c4867ec87cdf1c80f0230f1467af316c6d806718c2e54'}]}, 'timestamp': '2025-12-02 09:58:16.215444', '_unique_id': '68511aec8b2246219888296157afaf0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.216 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9bf2700-0458-4b6c-a31f-0d6d8c7f94e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.217028', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c2f4f84-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.375581492, 'message_signature': 'f9990d208416e4cdbfe864da646f4f6e98e0ff49f0c84187360c9e198422f4e5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.217028', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c2f5dda-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.375581492, 'message_signature': 'e1a4311c1b230949f6235538989479505267c6d3103d3d11ee2e1e32f3ae213f'}]}, 'timestamp': '2025-12-02 09:58:16.217707', '_unique_id': '6c1812423ef54b03868f3aee31341bfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.219 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.219 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a78e658-1584-4918-94c0-119ccfbc5530', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.219190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c2fa380-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': '2d5286a2e3afcbe566e1102c29e188dc6dd2ce20b6a2b3c6ac2b4880f82257b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.219190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c2fad08-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': 'de8eacc1da6b7c4c25f0009ce01e53b541bb90b7f1a3fa7f8fe5ece2166cdfed'}]}, 'timestamp': '2025-12-02 09:58:16.219719', '_unique_id': '87de4fc210854e84bfd9b2604867a6db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.221 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.221 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d4f3aee-9f63-4875-921a-6a2d60fd3ce4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T09:58:16.221009', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c2fea5c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': '7773ee5da22130d434608566ffbc9bd28686a88942268934dcda0805fcc0ef76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T09:58:16.221009', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c2ff59c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11658.324901576, 'message_signature': '78f8faf97453666faf9c3dd03ba9f84b2edab4789154e17655dc48652c38ef2a'}]}, 'timestamp': '2025-12-02 09:58:16.221555', '_unique_id': '50a60e15af6b49d2b0f9f70a77306af0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 09:58:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 09:58:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:16 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 02 09:58:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:16 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:16 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:58:16 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:17 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 02 09:58:17 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:17 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:17 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:58:17 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:58:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:17.872 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:17.919 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:17.922 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Triggering sync for uuid b254bb7f-2891-4b37-9c44-9700e301ce16 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 02 09:58:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:17.923 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:58:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:17.923 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:58:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:17.947 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:58:18 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Writing back 50 completed events
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.2 (monmap changed)...
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:58:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:58:18 np0005541913.localdomain systemd[1]: tmp-crun.ZMdQJ7.mount: Deactivated successfully.
Dec 02 09:58:18 np0005541913.localdomain podman[302787]: 2025-12-02 09:58:18.448182753 +0000 UTC m=+0.087668777 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 09:58:18 np0005541913.localdomain podman[302787]: 2025-12-02 09:58:18.457914634 +0000 UTC m=+0.097400698 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:58:18 np0005541913.localdomain podman[302788]: 2025-12-02 09:58:18.502739013 +0000 UTC m=+0.137397007 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 02 09:58:18 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:58:18 np0005541913.localdomain podman[302788]: 2025-12-02 09:58:18.569078669 +0000 UTC m=+0.203736693 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 09:58:18 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:58:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:18 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:18 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:58:18 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:58:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:18.983 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.5 (monmap changed)...
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:19 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:19 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:19 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:19 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.54159 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005541914.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:19 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:58:19 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:20 np0005541913.localdomain sudo[302834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:20 np0005541913.localdomain sudo[302834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:20 np0005541913.localdomain sudo[302834]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:20 np0005541913.localdomain sudo[302852]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:20 np0005541913.localdomain sudo[302852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:20 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:20 np0005541913.localdomain podman[302887]: 
Dec 02 09:58:20 np0005541913.localdomain podman[302887]: 2025-12-02 09:58:20.935512111 +0000 UTC m=+0.081551263 container create 869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_germain, release=1763362218, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True)
Dec 02 09:58:20 np0005541913.localdomain systemd[1]: Started libpod-conmon-869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d.scope.
Dec 02 09:58:21 np0005541913.localdomain podman[302887]: 2025-12-02 09:58:20.903172486 +0000 UTC m=+0.049211718 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:21 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:21 np0005541913.localdomain podman[302887]: 2025-12-02 09:58:21.032812015 +0000 UTC m=+0.178851167 container init 869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_germain, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:58:21 np0005541913.localdomain podman[302887]: 2025-12-02 09:58:21.046227364 +0000 UTC m=+0.192266506 container start 869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_germain, GIT_CLEAN=True, RELEASE=main, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:58:21 np0005541913.localdomain podman[302887]: 2025-12-02 09:58:21.046494871 +0000 UTC m=+0.192534053 container attach 869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_germain, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, ceph=True, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:58:21 np0005541913.localdomain vigilant_germain[302902]: 167 167
Dec 02 09:58:21 np0005541913.localdomain systemd[1]: libpod-869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d.scope: Deactivated successfully.
Dec 02 09:58:21 np0005541913.localdomain podman[302887]: 2025-12-02 09:58:21.050373265 +0000 UTC m=+0.196412437 container died 869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_germain, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, version=7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, release=1763362218, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:58:21 np0005541913.localdomain podman[302907]: 2025-12-02 09:58:21.143341593 +0000 UTC m=+0.082849178 container remove 869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_germain, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.buildah.version=1.41.4, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, version=7, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218)
Dec 02 09:58:21 np0005541913.localdomain systemd[1]: libpod-conmon-869d45fb29f7c3bb109928203df54752dd1e0618b10dc28537452fdbdc934a5d.scope: Deactivated successfully.
Dec 02 09:58:21 np0005541913.localdomain sudo[302852]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:21 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:21 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:21 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:21 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:21 np0005541913.localdomain sudo[302924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:21 np0005541913.localdomain sudo[302924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:21 np0005541913.localdomain sudo[302924]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:21 np0005541913.localdomain sudo[302942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:21 np0005541913.localdomain sudo[302942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: from='client.54159 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005541914.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: Deploying daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:58:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:21 np0005541913.localdomain podman[302977]: 
Dec 02 09:58:21 np0005541913.localdomain podman[302977]: 2025-12-02 09:58:21.848559243 +0000 UTC m=+0.072552112 container create 74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_lederberg, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, build-date=2025-11-26T19:44:28Z, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Dec 02 09:58:21 np0005541913.localdomain systemd[1]: Started libpod-conmon-74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991.scope.
Dec 02 09:58:21 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:21 np0005541913.localdomain podman[302977]: 2025-12-02 09:58:21.921824233 +0000 UTC m=+0.145817102 container init 74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_lederberg, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, name=rhceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph)
Dec 02 09:58:21 np0005541913.localdomain podman[302977]: 2025-12-02 09:58:21.822349632 +0000 UTC m=+0.046342471 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:21 np0005541913.localdomain podman[302977]: 2025-12-02 09:58:21.932312384 +0000 UTC m=+0.156305253 container start 74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_lederberg, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, ceph=True, io.openshift.expose-services=, RELEASE=main)
Dec 02 09:58:21 np0005541913.localdomain podman[302977]: 2025-12-02 09:58:21.932568901 +0000 UTC m=+0.156561820 container attach 74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_lederberg, GIT_BRANCH=main, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True)
Dec 02 09:58:21 np0005541913.localdomain zen_lederberg[302992]: 167 167
Dec 02 09:58:21 np0005541913.localdomain systemd[1]: libpod-74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991.scope: Deactivated successfully.
Dec 02 09:58:21 np0005541913.localdomain podman[302977]: 2025-12-02 09:58:21.938994544 +0000 UTC m=+0.162987453 container died 74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_lederberg, RELEASE=main, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, ceph=True, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 02 09:58:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6205354ee967f315095454d8cdd88af2da233efb04fd9b8ee27fa535d7feac96-merged.mount: Deactivated successfully.
Dec 02 09:58:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8e2362ccaa47a0e1a71729b7761a70346bd0162d86045abf1bc609ed0da6a617-merged.mount: Deactivated successfully.
Dec 02 09:58:22 np0005541913.localdomain podman[302997]: 2025-12-02 09:58:22.027322867 +0000 UTC m=+0.079234491 container remove 74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_lederberg, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, architecture=x86_64, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:58:22 np0005541913.localdomain systemd[1]: libpod-conmon-74fc70ec71210004b22e826c911dc0546428ab86bf8aa6a7ec0f290ad96d1991.scope: Deactivated successfully.
Dec 02 09:58:22 np0005541913.localdomain sudo[302942]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:22 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Dec 02 09:58:22 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.250219) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669502250271, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 956, "num_deletes": 251, "total_data_size": 1547847, "memory_usage": 1568032, "flush_reason": "Manual Compaction"}
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669502257649, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 885725, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13484, "largest_seqno": 14435, "table_properties": {"data_size": 881034, "index_size": 2162, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 12440, "raw_average_key_size": 21, "raw_value_size": 870973, "raw_average_value_size": 1538, "num_data_blocks": 93, "num_entries": 566, "num_filter_entries": 566, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669485, "oldest_key_time": 1764669485, "file_creation_time": 1764669502, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 7466 microseconds, and 2729 cpu microseconds.
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.257691) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 885725 bytes OK
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.257711) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.259478) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.259494) EVENT_LOG_v1 {"time_micros": 1764669502259489, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.259514) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1542608, prev total WAL file size 1542608, number of live WAL files 2.
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.260009) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(864KB)], [18(17MB)]
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669502260066, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19563702, "oldest_snapshot_seqno": -1}
Dec 02 09:58:22 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:58:22 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:58:22 np0005541913.localdomain sudo[303021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:22 np0005541913.localdomain sudo[303021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:22 np0005541913.localdomain sudo[303021]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:22 np0005541913.localdomain sudo[303039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:22 np0005541913.localdomain sudo[303039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 11439 keys, 16326909 bytes, temperature: kUnknown
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669502386015, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 16326909, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16260316, "index_size": 36924, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28613, "raw_key_size": 307813, "raw_average_key_size": 26, "raw_value_size": 16063502, "raw_average_value_size": 1404, "num_data_blocks": 1401, "num_entries": 11439, "num_filter_entries": 11439, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764669502, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.386367) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 16326909 bytes
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.388107) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.2 rd, 129.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 17.8 +0.0 blob) out(15.6 +0.0 blob), read-write-amplify(40.5) write-amplify(18.4) OK, records in: 11969, records dropped: 530 output_compression: NoCompression
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.388173) EVENT_LOG_v1 {"time_micros": 1764669502388156, "job": 8, "event": "compaction_finished", "compaction_time_micros": 126043, "compaction_time_cpu_micros": 51486, "output_level": 6, "num_output_files": 1, "total_output_size": 16326909, "num_input_records": 11969, "num_output_records": 11439, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669502388441, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669502390866, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.259911) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.390932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.390942) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.390945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.390948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-09:58:22.390950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.3 (monmap changed)...
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:22 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:22 np0005541913.localdomain podman[303074]: 
Dec 02 09:58:22 np0005541913.localdomain podman[303074]: 2025-12-02 09:58:22.842154511 +0000 UTC m=+0.081899883 container create edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_maxwell, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64)
Dec 02 09:58:22 np0005541913.localdomain systemd[1]: Started libpod-conmon-edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565.scope.
Dec 02 09:58:22 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:22 np0005541913.localdomain podman[303074]: 2025-12-02 09:58:22.808258764 +0000 UTC m=+0.048004196 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:22 np0005541913.localdomain podman[303074]: 2025-12-02 09:58:22.919685676 +0000 UTC m=+0.159431038 container init edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_maxwell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1763362218, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:58:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:22.957 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:22 np0005541913.localdomain podman[303074]: 2025-12-02 09:58:22.965092331 +0000 UTC m=+0.204837703 container start edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_maxwell, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=1763362218, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, version=7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 02 09:58:22 np0005541913.localdomain podman[303074]: 2025-12-02 09:58:22.96543639 +0000 UTC m=+0.205181792 container attach edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_maxwell, architecture=x86_64, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, version=7, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, name=rhceph)
Dec 02 09:58:22 np0005541913.localdomain peaceful_maxwell[303090]: 167 167
Dec 02 09:58:22 np0005541913.localdomain systemd[1]: libpod-edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565.scope: Deactivated successfully.
Dec 02 09:58:22 np0005541913.localdomain podman[303074]: 2025-12-02 09:58:22.969418046 +0000 UTC m=+0.209163428 container died edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_maxwell, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True)
Dec 02 09:58:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a81db17e6649e0d98962f92509d22b46b8fe0d5e24d3aa58688c1a1819f3fafa-merged.mount: Deactivated successfully.
Dec 02 09:58:23 np0005541913.localdomain podman[303095]: 2025-12-02 09:58:23.073954154 +0000 UTC m=+0.093002840 container remove edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_maxwell, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True)
Dec 02 09:58:23 np0005541913.localdomain systemd[1]: libpod-conmon-edb0264906137dad1b75cc707ab22d27fecb5937c320d69346a2b6b7a83c7565.scope: Deactivated successfully.
Dec 02 09:58:23 np0005541913.localdomain sudo[303039]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:23 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:23 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:23 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:58:23 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:58:23 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 02 09:58:23 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:23 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:23 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:23 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:58:23 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:58:23 np0005541913.localdomain sudo[303120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:23 np0005541913.localdomain sudo[303120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:23 np0005541913.localdomain sudo[303120]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:23 np0005541913.localdomain sudo[303138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:23 np0005541913.localdomain sudo[303138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:23 np0005541913.localdomain podman[303174]: 
Dec 02 09:58:23 np0005541913.localdomain podman[303174]: 2025-12-02 09:58:23.839654983 +0000 UTC m=+0.067981160 container create 68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_aryabhata, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.41.4, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:58:23 np0005541913.localdomain systemd[1]: Started libpod-conmon-68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3.scope.
Dec 02 09:58:23 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:23 np0005541913.localdomain podman[303174]: 2025-12-02 09:58:23.902401341 +0000 UTC m=+0.130727498 container init 68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_aryabhata, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, architecture=x86_64, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, description=Red Hat Ceph Storage 7)
Dec 02 09:58:23 np0005541913.localdomain podman[303174]: 2025-12-02 09:58:23.806588517 +0000 UTC m=+0.034914724 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:23 np0005541913.localdomain podman[303174]: 2025-12-02 09:58:23.908857525 +0000 UTC m=+0.137183682 container start 68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_aryabhata, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 02 09:58:23 np0005541913.localdomain podman[303174]: 2025-12-02 09:58:23.911720241 +0000 UTC m=+0.140046438 container attach 68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_aryabhata, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, distribution-scope=public, release=1763362218, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:58:23 np0005541913.localdomain affectionate_aryabhata[303190]: 167 167
Dec 02 09:58:23 np0005541913.localdomain systemd[1]: libpod-68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3.scope: Deactivated successfully.
Dec 02 09:58:23 np0005541913.localdomain podman[303174]: 2025-12-02 09:58:23.914265669 +0000 UTC m=+0.142591856 container died 68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_aryabhata, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, release=1763362218, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True)
Dec 02 09:58:23 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8fa32706b9d449e9bc607c4dfeb310e341a6de88b90949854234d8eebe9cf971-merged.mount: Deactivated successfully.
Dec 02 09:58:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:24.041 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:24 np0005541913.localdomain podman[303195]: 2025-12-02 09:58:24.044912046 +0000 UTC m=+0.119455198 container remove 68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_aryabhata, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, RELEASE=main)
Dec 02 09:58:24 np0005541913.localdomain systemd[1]: libpod-conmon-68b30b4e39b49bba39e737696f46b8b0f63d6406e9acb0017b44fcb26cec85a3.scope: Deactivated successfully.
Dec 02 09:58:24 np0005541913.localdomain sudo[303138]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:24 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:58:24 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:58:24 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:58:24 np0005541913.localdomain sudo[303211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:24 np0005541913.localdomain sudo[303211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:24 np0005541913.localdomain sudo[303211]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 02 09:58:24 np0005541913.localdomain sudo[303229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 02 09:58:24 np0005541913.localdomain sudo[303229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:24 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:24 np0005541913.localdomain podman[303264]: 
Dec 02 09:58:24 np0005541913.localdomain podman[303264]: 2025-12-02 09:58:24.660288442 +0000 UTC m=+0.053767710 container create c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_khayyam, name=rhceph, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., ceph=True, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 09:58:24 np0005541913.localdomain systemd[1]: Started libpod-conmon-c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88.scope.
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 02 09:58:24 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:24 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (2) No such file or directory
Dec 02 09:58:24 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:24 np0005541913.localdomain podman[303264]: 2025-12-02 09:58:24.724703425 +0000 UTC m=+0.118182683 container init c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_khayyam, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, release=1763362218, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:58:24 np0005541913.localdomain podman[303264]: 2025-12-02 09:58:24.73126297 +0000 UTC m=+0.124742238 container start c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_khayyam, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, name=rhceph, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, architecture=x86_64)
Dec 02 09:58:24 np0005541913.localdomain podman[303264]: 2025-12-02 09:58:24.73164297 +0000 UTC m=+0.125122228 container attach c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_khayyam, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-11-26T19:44:28Z, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_CLEAN=True, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:58:24 np0005541913.localdomain musing_khayyam[303280]: 167 167
Dec 02 09:58:24 np0005541913.localdomain systemd[1]: libpod-c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88.scope: Deactivated successfully.
Dec 02 09:58:24 np0005541913.localdomain podman[303264]: 2025-12-02 09:58:24.734459466 +0000 UTC m=+0.127938724 container died c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_khayyam, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:58:24 np0005541913.localdomain podman[303264]: 2025-12-02 09:58:24.643279317 +0000 UTC m=+0.036758645 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:24 np0005541913.localdomain podman[303285]: 2025-12-02 09:58:24.816162403 +0000 UTC m=+0.072610704 container remove c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_khayyam, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, version=7, ceph=True, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Dec 02 09:58:24 np0005541913.localdomain systemd[1]: libpod-conmon-c4c0f5b875fb591be114a63f04a4a357c9053eecc1b5a62bcd9a6bc7e3ca4a88.scope: Deactivated successfully.
Dec 02 09:58:24 np0005541913.localdomain sudo[303229]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:24 np0005541913.localdomain sudo[303301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:24 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6f39e0b73471d4bf964bb7a0a6c573f1334a946fb2daea93a513a94e450e9745-merged.mount: Deactivated successfully.
Dec 02 09:58:24 np0005541913.localdomain sudo[303301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:24 np0005541913.localdomain sudo[303301]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:25 np0005541913.localdomain sudo[303319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:58:25 np0005541913.localdomain sudo[303319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:25 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:58:25 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:58:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:25 np0005541913.localdomain sudo[303319]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:25 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:25 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:25 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (2) No such file or directory
Dec 02 09:58:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:26 np0005541913.localdomain ceph-mon[298296]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:26 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:26 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 02 09:58:26 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:26 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:26 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (2) No such file or directory
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] update: starting ev a93bf519-2b05-4bf0-ace4-a29a71bb2529 (Updating node-proxy deployment (+3 -> 3))
Dec 02 09:58:27 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] complete: finished ev a93bf519-2b05-4bf0-ace4-a29a71bb2529 (Updating node-proxy deployment (+3 -> 3))
Dec 02 09:58:27 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Completed event a93bf519-2b05-4bf0-ace4-a29a71bb2529 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e16 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain sudo[303370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:58:27 np0005541913.localdomain sudo[303370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:27 np0005541913.localdomain sudo[303370]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(probing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541912"} v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(cluster) log [INF] : mon.np0005541913 calling monitor election
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: paxos.1).electionLogic(64) init, last seen epoch 64
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541913"} v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (22) Invalid argument
Dec 02 09:58:27 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:58:27 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:27 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:27 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (22) Invalid argument
Dec 02 09:58:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:27.998 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:28 np0005541913.localdomain ceph-mgr[288059]: [progress INFO root] Writing back 50 completed events
Dec 02 09:58:28 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 02 09:58:28 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:28 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:28 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:28 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:28 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (22) Invalid argument
Dec 02 09:58:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:29.091 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:29 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:29 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:29 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:29 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (22) Invalid argument
Dec 02 09:58:30 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:30 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:30 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:30 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (22) Invalid argument
Dec 02 09:58:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:58:31 np0005541913.localdomain podman[303388]: 2025-12-02 09:58:31.420701302 +0000 UTC m=+0.066122590 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:58:31 np0005541913.localdomain podman[303388]: 2025-12-02 09:58:31.461069292 +0000 UTC m=+0.106490570 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute)
Dec 02 09:58:31 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:58:31 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:31 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:31 np0005541913.localdomain ceph-mgr[288059]: mgr finish mon failed to return metadata for mon.np0005541914: (22) Invalid argument
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: paxos.1).electionLogic(65) init, last seen epoch 65, mid-election, bumping
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(electing) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:58:32 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:58:32 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 calling monitor election
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913 calling monitor election
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541912 (monmap changed)...
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541914 calling monitor election
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541912 is new leader, mons np0005541912,np0005541913,np0005541914 in quorum (ranks 0,1,2)
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: monmap epoch 17
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: last_changed 2025-12-02T09:58:27.543539+0000
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: created 2025-12-02T07:44:17.411659+0000
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: min_mon_release 18 (reef)
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: election_strategy: 1
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541914
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: fsmap cephfs:1 {0=mds.np0005541912.ghcwcm=up:active} 2 up:standby
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: osdmap e89: 6 total, 6 up, 6 in
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mgrmap e31: np0005541913.mfesdm(active, since 78s), standbys: np0005541912.qwddia, np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541912.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:32 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_open ignoring open from mon.np0005541914 172.18.0.108:0/645205908; not ready for session (expect reconnect)
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005541914"} v 0)
Dec 02 09:58:32 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:33.028 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541912 on np0005541912.localdomain
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:33 np0005541913.localdomain ceph-mgr[288059]: mgr.server handle_report got status from non-daemon mon.np0005541914
Dec 02 09:58:33 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:33.711+0000 7f783c290640 -1 mgr.server handle_report got status from non-daemon mon.np0005541914
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:33 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 02 09:58:33 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:33 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:58:33 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 02 09:58:33 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/1722694794' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:58:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:58:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:58:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:58:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:58:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:58:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:58:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:58:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:58:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:34.100 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:34 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:34 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.2 (monmap changed)...
Dec 02 09:58:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:58:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:34 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:58:34 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/1722694794' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:58:34 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:35 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 02 09:58:35 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 02 09:58:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Dec 02 09:58:35 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:58:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:35 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:35 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:58:35 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:58:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.5 (monmap changed)...
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:58:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:58:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:58:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:58:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:58:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18718 "" "Go-http-client/1.1"
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:36 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:36 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541912.ghcwcm (monmap changed)...
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541912.ghcwcm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541912.ghcwcm on np0005541912.localdomain
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541912.qwddia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:58:37 np0005541913.localdomain podman[303408]: 2025-12-02 09:58:37.443993558 +0000 UTC m=+0.082782056 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 09:58:37 np0005541913.localdomain podman[303408]: 2025-12-02 09:58:37.449913466 +0000 UTC m=+0.088702004 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 02 09:58:37 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:37 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:37 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:37 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:37 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:37 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:37 np0005541913.localdomain sudo[303425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:37 np0005541913.localdomain sudo[303425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:37 np0005541913.localdomain sudo[303425]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:38 np0005541913.localdomain sudo[303443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:38 np0005541913.localdomain sudo[303443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:38 np0005541913.localdomain ceph-mgr[288059]: log_channel(audit) log [DBG] : from='client.64100 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:38.077 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:38 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO root] Reconfig service osd.default_drive_group
Dec 02 09:58:38 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain.devices.0}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541912.localdomain}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain.devices.0}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541914.localdomain}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541912.qwddia (monmap changed)...
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541912.qwddia on np0005541912.localdomain
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541913.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:38 np0005541913.localdomain podman[303479]: 
Dec 02 09:58:38 np0005541913.localdomain podman[303479]: 2025-12-02 09:58:38.53565333 +0000 UTC m=+0.072116101 container create 9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_wu, version=7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218)
Dec 02 09:58:38 np0005541913.localdomain systemd[1]: Started libpod-conmon-9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82.scope.
Dec 02 09:58:38 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:38 np0005541913.localdomain podman[303479]: 2025-12-02 09:58:38.499955524 +0000 UTC m=+0.036418375 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:38 np0005541913.localdomain podman[303479]: 2025-12-02 09:58:38.606786993 +0000 UTC m=+0.143249764 container init 9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_wu, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-type=git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:58:38 np0005541913.localdomain podman[303479]: 2025-12-02 09:58:38.61711138 +0000 UTC m=+0.153574131 container start 9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_wu, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7)
Dec 02 09:58:38 np0005541913.localdomain podman[303479]: 2025-12-02 09:58:38.617593472 +0000 UTC m=+0.154056243 container attach 9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_wu, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, RELEASE=main)
Dec 02 09:58:38 np0005541913.localdomain frosty_wu[303494]: 167 167
Dec 02 09:58:38 np0005541913.localdomain systemd[1]: libpod-9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82.scope: Deactivated successfully.
Dec 02 09:58:38 np0005541913.localdomain podman[303479]: 2025-12-02 09:58:38.622109774 +0000 UTC m=+0.158572545 container died 9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_wu, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, version=7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mgr[288059]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:38 np0005541913.localdomain podman[303499]: 2025-12-02 09:58:38.709222854 +0000 UTC m=+0.081386268 container remove 9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_wu, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_BRANCH=main, RELEASE=main, release=1763362218, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=)
Dec 02 09:58:38 np0005541913.localdomain systemd[1]: libpod-conmon-9dd2786e0e70226825c3c2a0970a3d7ae2fb2853670c4582a697173dd015ce82.scope: Deactivated successfully.
Dec 02 09:58:38 np0005541913.localdomain sudo[303443]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:38 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 09:58:38 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:38 np0005541913.localdomain ceph-mgr[288059]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:38 np0005541913.localdomain ceph-mgr[288059]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:38 np0005541913.localdomain sudo[303515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:38 np0005541913.localdomain sudo[303515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:38 np0005541913.localdomain sudo[303515]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:38 np0005541913.localdomain sudo[303533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:38 np0005541913.localdomain sudo[303533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:39.137 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541913 (monmap changed)...
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541913 on np0005541913.localdomain
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: from='client.64100 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: Reconfig service osd.default_drive_group
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 172.18.0.107:0/3692232454' entity='mgr.np0005541913.mfesdm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:39 np0005541913.localdomain podman[303565]: 2025-12-02 09:58:39.460436256 +0000 UTC m=+0.098140867 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc.)
Dec 02 09:58:39 np0005541913.localdomain podman[303565]: 2025-12-02 09:58:39.481090569 +0000 UTC m=+0.118795150 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible)
Dec 02 09:58:39 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:58:39 np0005541913.localdomain podman[303573]: 
Dec 02 09:58:39 np0005541913.localdomain podman[303573]: 2025-12-02 09:58:39.511334338 +0000 UTC m=+0.126242159 container create a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_stonebraker, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1763362218, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:58:39 np0005541913.localdomain systemd[1]: Started libpod-conmon-a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e.scope.
Dec 02 09:58:39 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-eaee7d597b472a45c4e8a1b44b0f12246d330ebb129c6cd5dd8357cd72d3c3a8-merged.mount: Deactivated successfully.
Dec 02 09:58:39 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:39 np0005541913.localdomain podman[303573]: 2025-12-02 09:58:39.477202815 +0000 UTC m=+0.092110656 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:39 np0005541913.localdomain podman[303573]: 2025-12-02 09:58:39.583300224 +0000 UTC m=+0.198208025 container init a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_stonebraker, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, ceph=True, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7)
Dec 02 09:58:39 np0005541913.localdomain podman[303573]: 2025-12-02 09:58:39.593563079 +0000 UTC m=+0.208470880 container start a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_stonebraker, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., release=1763362218, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Dec 02 09:58:39 np0005541913.localdomain podman[303573]: 2025-12-02 09:58:39.594716919 +0000 UTC m=+0.209624790 container attach a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_stonebraker, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Dec 02 09:58:39 np0005541913.localdomain zen_stonebraker[303601]: 167 167
Dec 02 09:58:39 np0005541913.localdomain systemd[1]: libpod-a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e.scope: Deactivated successfully.
Dec 02 09:58:39 np0005541913.localdomain podman[303573]: 2025-12-02 09:58:39.617265193 +0000 UTC m=+0.232172994 container died a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_stonebraker, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Dec 02 09:58:39 np0005541913.localdomain podman[303606]: 2025-12-02 09:58:39.71020684 +0000 UTC m=+0.081521442 container remove a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_stonebraker, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, architecture=x86_64, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:58:39 np0005541913.localdomain systemd[1]: libpod-conmon-a3a86c175695f9f10a232a73c4d6576ddcce9249b6e81b67295a3e00112da92e.scope: Deactivated successfully.
Dec 02 09:58:39 np0005541913.localdomain sudo[303533]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain.devices.0}] v 0)
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005541913.localdomain}] v 0)
Dec 02 09:58:39 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 e90: 6 total, 6 up, 6 in
Dec 02 09:58:40 np0005541913.localdomain sshd[299538]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 09:58:40 np0005541913.localdomain systemd[1]: session-69.scope: Deactivated successfully.
Dec 02 09:58:40 np0005541913.localdomain systemd[1]: session-69.scope: Consumed 21.232s CPU time.
Dec 02 09:58:40 np0005541913.localdomain systemd-logind[757]: Session 69 logged out. Waiting for processes to exit.
Dec 02 09:58:40 np0005541913.localdomain systemd-logind[757]: Removed session 69.
Dec 02 09:58:40 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: ignoring --setuser ceph since I am not root
Dec 02 09:58:40 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: ignoring --setgroup ceph since I am not root
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: pidfile_write: ignore empty --pid-file
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'alerts'
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'balancer'
Dec 02 09:58:40 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:40.144+0000 7f2d05246140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'cephadm'
Dec 02 09:58:40 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:40.212+0000 7f2d05246140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 02 09:58:40 np0005541913.localdomain sshd[303653]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 09:58:40 np0005541913.localdomain sshd[303653]: Accepted publickey for ceph-admin from 192.168.122.106 port 34772 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 09:58:40 np0005541913.localdomain systemd-logind[757]: New session 71 of user ceph-admin.
Dec 02 09:58:40 np0005541913.localdomain systemd[1]: Started Session 71 of User ceph-admin.
Dec 02 09:58:40 np0005541913.localdomain sshd[303653]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.0 (monmap changed)...
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/3934454104' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: Activating manager daemon np0005541912.qwddia
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: osdmap e90: 6 total, 6 up, 6 in
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/3934454104' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: mgrmap e32: np0005541912.qwddia(active, starting, since 0.0452632s), standbys: np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mds metadata", "who": "mds.np0005541914.sqgqkj"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mds metadata", "who": "mds.np0005541913.maexpe"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mds metadata", "who": "mds.np0005541912.ghcwcm"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mgr metadata", "who": "np0005541912.qwddia", "id": "np0005541912.qwddia"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mgr metadata", "who": "np0005541911.adcgiw", "id": "np0005541911.adcgiw"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mgr metadata", "who": "np0005541914.lljzmk", "id": "np0005541914.lljzmk"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26470 ' entity='mgr.np0005541913.mfesdm' 
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mds metadata"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mon metadata"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: Manager daemon np0005541912.qwddia is now available
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: removing stray HostCache host record np0005541911.localdomain.devices.0
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541911.localdomain.devices.0"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541911.localdomain.devices.0"}]': finished
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005541911.localdomain.devices.0"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005541911.localdomain.devices.0"}]': finished
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541912.qwddia/mirror_snapshot_schedule"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541912.qwddia/trash_purge_schedule"} : dispatch
Dec 02 09:58:40 np0005541913.localdomain sudo[303657]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:40 np0005541913.localdomain sudo[303657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:58:40 np0005541913.localdomain sudo[303657]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-359bd3fde74013314e3669a5a3d0da96f652ca71e67c6b8dad8904cb260b40e3-merged.mount: Deactivated successfully.
Dec 02 09:58:40 np0005541913.localdomain sudo[303676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 09:58:40 np0005541913.localdomain sudo[303676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:40 np0005541913.localdomain podman[303675]: 2025-12-02 09:58:40.609829942 +0000 UTC m=+0.094514380 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:58:40 np0005541913.localdomain podman[303675]: 2025-12-02 09:58:40.641040808 +0000 UTC m=+0.125725206 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 09:58:40 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:58:40 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'crash'
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 02 09:58:40 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'dashboard'
Dec 02 09:58:40 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:40.867+0000 7f2d05246140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain podman[303793]: 2025-12-02 09:58:41.347393918 +0000 UTC m=+0.086668690 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'devicehealth'
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'diskprediction_local'
Dec 02 09:58:41 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:41.441+0000 7f2d05246140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain podman[303793]: 2025-12-02 09:58:41.455908101 +0000 UTC m=+0.195182883 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.buildah.version=1.41.4, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z)
Dec 02 09:58:41 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 02 09:58:41 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 02 09:58:41 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]:   from numpy import show_config as show_numpy_config
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'influx'
Dec 02 09:58:41 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:41.591+0000 7f2d05246140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'insights'
Dec 02 09:58:41 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:41.649+0000 7f2d05246140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'iostat'
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'k8sevents'
Dec 02 09:58:41 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:41.765+0000 7f2d05246140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 02 09:58:41 np0005541913.localdomain ceph-mon[298296]: mgrmap e33: np0005541912.qwddia(active, since 1.09574s), standbys: np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:58:41 np0005541913.localdomain ceph-mon[298296]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:42 np0005541913.localdomain sudo[303676]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'localpool'
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'mds_autoscaler'
Dec 02 09:58:42 np0005541913.localdomain sudo[303913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:42 np0005541913.localdomain sudo[303913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:42 np0005541913.localdomain sudo[303913]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:42 np0005541913.localdomain sudo[303931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 09:58:42 np0005541913.localdomain sudo[303931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'mirroring'
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'nfs'
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'orchestrator'
Dec 02 09:58:42 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:42.520+0000 7f2d05246140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'osd_perf_query'
Dec 02 09:58:42 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:42.672+0000 7f2d05246140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:42.739+0000 7f2d05246140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'osd_support'
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:42.797+0000 7f2d05246140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'pg_autoscaler'
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'progress'
Dec 02 09:58:42 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:42.868+0000 7f2d05246140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain sudo[303931]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:42.935+0000 7f2d05246140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 02 09:58:42 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'prometheus'
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:58:41] ENGINE Bus STARTING
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:58:41] ENGINE Serving on http://172.18.0.106:8765
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:58:41] ENGINE Serving on https://172.18.0.106:7150
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:58:41] ENGINE Bus STARTED
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:09:58:41] ENGINE Client ('172.18.0.106', 43976) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: mgrmap e34: np0005541912.qwddia(active, since 2s), standbys: np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:43.081 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:43 np0005541913.localdomain sudo[303982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:43 np0005541913.localdomain sudo[303982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:43 np0005541913.localdomain sudo[303982]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:43 np0005541913.localdomain sudo[304000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 09:58:43 np0005541913.localdomain sudo[304000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:43 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 02 09:58:43 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'rbd_support'
Dec 02 09:58:43 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:43.254+0000 7f2d05246140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 02 09:58:43 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 02 09:58:43 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'restful'
Dec 02 09:58:43 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:43.339+0000 7f2d05246140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 02 09:58:43 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'rgw'
Dec 02 09:58:43 np0005541913.localdomain sudo[304000]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:43 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 02 09:58:43 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'rook'
Dec 02 09:58:43 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:43.684+0000 7f2d05246140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 02 09:58:43 np0005541913.localdomain sudo[304036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:58:43 np0005541913.localdomain sudo[304036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:43 np0005541913.localdomain sudo[304036]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:43.879 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:43 np0005541913.localdomain sudo[304054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:58:43 np0005541913.localdomain sudo[304054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:43 np0005541913.localdomain sudo[304054]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain sudo[304072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:58:44 np0005541913.localdomain sudo[304072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304072]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain sudo[304090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:44 np0005541913.localdomain sudo[304090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304090]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'selftest'
Dec 02 09:58:44 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:44.129+0000 7f2d05246140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:44.139 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:44 np0005541913.localdomain sudo[304108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:58:44 np0005541913.localdomain sudo[304108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304108]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'snap_schedule'
Dec 02 09:58:44 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:44.192+0000 7f2d05246140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'stats'
Dec 02 09:58:44 np0005541913.localdomain sudo[304142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:58:44 np0005541913.localdomain sudo[304142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304142]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'status'
Dec 02 09:58:44 np0005541913.localdomain sudo[304160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 09:58:44 np0005541913.localdomain sudo[304160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304160]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'telegraf'
Dec 02 09:58:44 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:44.398+0000 7f2d05246140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain sudo[304178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 09:58:44 np0005541913.localdomain sudo[304178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304178]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'telemetry'
Dec 02 09:58:44 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:44.462+0000 7f2d05246140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain sudo[304196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:58:44 np0005541913.localdomain sudo[304196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304196]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain sudo[304214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:58:44 np0005541913.localdomain sudo[304214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304214]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:44.614+0000 7f2d05246140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'test_orchestrator'
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 09:58:44 np0005541913.localdomain ceph-mon[298296]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:44 np0005541913.localdomain sudo[304232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:58:44 np0005541913.localdomain sudo[304232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304232]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain sudo[304250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:44 np0005541913.localdomain sudo[304250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304250]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'volumes'
Dec 02 09:58:44 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:44.773+0000 7f2d05246140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain sudo[304268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:58:44 np0005541913.localdomain sudo[304268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304268]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain sudo[304302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:58:44 np0005541913.localdomain sudo[304302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:44 np0005541913.localdomain sudo[304302]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 02 09:58:44 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Loading python module 'zabbix'
Dec 02 09:58:44 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:44.974+0000 7f2d05246140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 02 09:58:45 np0005541913.localdomain sudo[304320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 09:58:45 np0005541913.localdomain sudo[304320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304320]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain ceph-mgr[288059]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 02 09:58:45 np0005541913.localdomain ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-mgr-np0005541913-mfesdm[288055]: 2025-12-02T09:58:45.041+0000 7f2d05246140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 02 09:58:45 np0005541913.localdomain ceph-mgr[288059]: ms_deliver_dispatch: unhandled message 0x563adf5ab1e0 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Dec 02 09:58:45 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.106:6810/2383186409
Dec 02 09:58:45 np0005541913.localdomain sudo[304338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:45 np0005541913.localdomain sudo[304338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304338]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 09:58:45 np0005541913.localdomain sudo[304356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304356]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 09:58:45 np0005541913.localdomain sudo[304374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304374]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:58:45 np0005541913.localdomain sudo[304392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304392]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:45 np0005541913.localdomain sudo[304410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304410]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:58:45 np0005541913.localdomain sudo[304428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304428]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:58:45 np0005541913.localdomain sudo[304462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304462]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 09:58:45 np0005541913.localdomain sudo[304480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304480]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 02 09:58:45 np0005541913.localdomain sudo[304498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304498]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: mgrmap e35: np0005541912.qwddia(active, since 4s), standbys: np0005541911.adcgiw, np0005541914.lljzmk
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: Standby manager daemon np0005541913.mfesdm started
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:58:45 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 09:58:45 np0005541913.localdomain sudo[304516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:58:45 np0005541913.localdomain sudo[304516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain sudo[304516]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:45 np0005541913.localdomain sudo[304534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 09:58:45 np0005541913.localdomain sudo[304534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:58:45 np0005541913.localdomain sudo[304534]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:46 np0005541913.localdomain sudo[304558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:58:46 np0005541913.localdomain sudo[304558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:46 np0005541913.localdomain sudo[304558]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:46 np0005541913.localdomain podman[304552]: 2025-12-02 09:58:46.047138397 +0000 UTC m=+0.085980541 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:58:46 np0005541913.localdomain podman[304552]: 2025-12-02 09:58:46.056498728 +0000 UTC m=+0.095340902 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 09:58:46 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:58:46 np0005541913.localdomain sudo[304586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:46 np0005541913.localdomain sudo[304586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:46 np0005541913.localdomain sudo[304586]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:46 np0005541913.localdomain sudo[304607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:58:46 np0005541913.localdomain sudo[304607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:46 np0005541913.localdomain sudo[304607]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:46 np0005541913.localdomain sudo[304641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:58:46 np0005541913.localdomain sudo[304641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:46 np0005541913.localdomain sudo[304641]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:46 np0005541913.localdomain sudo[304659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 09:58:46 np0005541913.localdomain sudo[304659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:46 np0005541913.localdomain sudo[304659]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:46 np0005541913.localdomain sudo[304677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:58:46 np0005541913.localdomain sudo[304677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:46 np0005541913.localdomain sudo[304677]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:46 np0005541913.localdomain sudo[304695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:58:46 np0005541913.localdomain sudo[304695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:46 np0005541913.localdomain sudo[304695]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: mgrmap e36: np0005541912.qwddia(active, since 5s), standbys: np0005541911.adcgiw, np0005541914.lljzmk, np0005541913.mfesdm
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mgr metadata", "who": "np0005541913.mfesdm", "id": "np0005541913.mfesdm"} : dispatch
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 02 09:58:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:47.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:47.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 0 B/s wr, 21 op/s
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.2 on np0005541912.localdomain
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 02 09:58:47 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:48.119 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:48 np0005541913.localdomain sudo[304713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:48 np0005541913.localdomain sudo[304713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:58:48 np0005541913.localdomain sudo[304713]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:58:48 np0005541913.localdomain sudo[304733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:48 np0005541913.localdomain sudo[304733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:48 np0005541913.localdomain podman[304732]: 2025-12-02 09:58:48.801774337 +0000 UTC m=+0.090859092 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:58:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:48.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:48.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:58:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:48.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:58:48 np0005541913.localdomain podman[304732]: 2025-12-02 09:58:48.853053189 +0000 UTC m=+0.142138004 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 09:58:48 np0005541913.localdomain podman[304731]: 2025-12-02 09:58:48.862308257 +0000 UTC m=+0.155607505 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:58:48 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:58:48 np0005541913.localdomain podman[304731]: 2025-12-02 09:58:48.897299683 +0000 UTC m=+0.190598941 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 09:58:48 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:58:48 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.5 on np0005541912.localdomain
Dec 02 09:58:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 02 09:58:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:49.144 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:49 np0005541913.localdomain podman[304811]: 
Dec 02 09:58:49 np0005541913.localdomain podman[304811]: 2025-12-02 09:58:49.255901659 +0000 UTC m=+0.085920310 container create 82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_kare, architecture=x86_64, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:58:49 np0005541913.localdomain systemd[1]: Started libpod-conmon-82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d.scope.
Dec 02 09:58:49 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:49 np0005541913.localdomain podman[304811]: 2025-12-02 09:58:49.331823141 +0000 UTC m=+0.161841822 container init 82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_kare, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, release=1763362218, io.openshift.expose-services=, version=7, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Dec 02 09:58:49 np0005541913.localdomain podman[304811]: 2025-12-02 09:58:49.232234475 +0000 UTC m=+0.062253156 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:49 np0005541913.localdomain podman[304811]: 2025-12-02 09:58:49.34522762 +0000 UTC m=+0.175246291 container start 82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_kare, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:58:49 np0005541913.localdomain podman[304811]: 2025-12-02 09:58:49.345516557 +0000 UTC m=+0.175535208 container attach 82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_kare, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, architecture=x86_64, release=1763362218, RELEASE=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:58:49 np0005541913.localdomain bold_kare[304827]: 167 167
Dec 02 09:58:49 np0005541913.localdomain podman[304811]: 2025-12-02 09:58:49.34976496 +0000 UTC m=+0.179783661 container died 82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_kare, GIT_BRANCH=main, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph)
Dec 02 09:58:49 np0005541913.localdomain systemd[1]: libpod-82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d.scope: Deactivated successfully.
Dec 02 09:58:49 np0005541913.localdomain podman[304832]: 2025-12-02 09:58:49.445860122 +0000 UTC m=+0.084970735 container remove 82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_kare, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Dec 02 09:58:49 np0005541913.localdomain systemd[1]: libpod-conmon-82bdf7eb7c8f4955a3639abfb870a5c57c2d8af7c36367832891f6860620874d.scope: Deactivated successfully.
Dec 02 09:58:49 np0005541913.localdomain sudo[304733]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:49.715 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:58:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:49.716 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:58:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:49.717 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:58:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:49.717 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:58:49 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a08429a249406af603dcdaecdd29fef51c19efe8d0cdad6822fb29e4836e89fe-merged.mount: Deactivated successfully.
Dec 02 09:58:49 np0005541913.localdomain ceph-mon[298296]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 0 B/s wr, 15 op/s
Dec 02 09:58:49 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.0 on np0005541913.localdomain
Dec 02 09:58:49 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:49 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:49 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:49 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:49 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 02 09:58:49 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:49 np0005541913.localdomain sudo[304856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:49 np0005541913.localdomain sudo[304856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:49 np0005541913.localdomain sudo[304856]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:50 np0005541913.localdomain sudo[304874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:50 np0005541913.localdomain sudo[304874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:50 np0005541913.localdomain podman[304909]: 
Dec 02 09:58:50 np0005541913.localdomain podman[304909]: 2025-12-02 09:58:50.481500924 +0000 UTC m=+0.063942102 container create 3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_raman, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, ceph=True, RELEASE=main, io.openshift.expose-services=)
Dec 02 09:58:50 np0005541913.localdomain systemd[1]: Started libpod-conmon-3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500.scope.
Dec 02 09:58:50 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:50 np0005541913.localdomain podman[304909]: 2025-12-02 09:58:50.547424319 +0000 UTC m=+0.129865457 container init 3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_raman, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:58:50 np0005541913.localdomain podman[304909]: 2025-12-02 09:58:50.450843744 +0000 UTC m=+0.033284952 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:50 np0005541913.localdomain podman[304909]: 2025-12-02 09:58:50.557790856 +0000 UTC m=+0.140232034 container start 3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_raman, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=1763362218, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Dec 02 09:58:50 np0005541913.localdomain podman[304909]: 2025-12-02 09:58:50.55831496 +0000 UTC m=+0.140756198 container attach 3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_raman, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1763362218, GIT_BRANCH=main, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, version=7)
Dec 02 09:58:50 np0005541913.localdomain systemd[1]: tmp-crun.n4p3Tq.mount: Deactivated successfully.
Dec 02 09:58:50 np0005541913.localdomain objective_raman[304924]: 167 167
Dec 02 09:58:50 np0005541913.localdomain systemd[1]: libpod-3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500.scope: Deactivated successfully.
Dec 02 09:58:50 np0005541913.localdomain podman[304909]: 2025-12-02 09:58:50.562022599 +0000 UTC m=+0.144463777 container died 3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_raman, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, build-date=2025-11-26T19:44:28Z, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4)
Dec 02 09:58:50 np0005541913.localdomain podman[304929]: 2025-12-02 09:58:50.656574379 +0000 UTC m=+0.084647916 container remove 3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_raman, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4)
Dec 02 09:58:50 np0005541913.localdomain systemd[1]: libpod-conmon-3059120fd8d1cb4a27b0e155b51ecbc33e90069a7383c8d5bf331e53dd0e4500.scope: Deactivated successfully.
Dec 02 09:58:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ac97adbaeaa07c45666865af2f09771ba03cfb59d8c9d7b93b5393a12e595687-merged.mount: Deactivated successfully.
Dec 02 09:58:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:50.813 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:58:50 np0005541913.localdomain sudo[304874]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:50 np0005541913.localdomain sudo[304953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:50 np0005541913.localdomain sudo[304953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:50 np0005541913.localdomain sudo[304953]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:50.985 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:58:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:50.987 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:58:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:50.988 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:50.988 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:50.989 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:51.060 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:58:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:51.061 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:58:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:51.062 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:58:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:51.062 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:58:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:51.063 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:58:51 np0005541913.localdomain sudo[304971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:51 np0005541913.localdomain sudo[304971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.3 (monmap changed)...
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.3 on np0005541913.localdomain
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541913.maexpe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:58:51 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/481852052' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:58:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:51.510 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:58:51 np0005541913.localdomain podman[305026]: 
Dec 02 09:58:51 np0005541913.localdomain podman[305026]: 2025-12-02 09:58:51.548233039 +0000 UTC m=+0.072060220 container create 7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_golick, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, RELEASE=main, release=1763362218, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 09:58:51 np0005541913.localdomain systemd[1]: Started libpod-conmon-7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595.scope.
Dec 02 09:58:51 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:51 np0005541913.localdomain podman[305026]: 2025-12-02 09:58:51.609743025 +0000 UTC m=+0.133570226 container init 7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_golick, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph)
Dec 02 09:58:51 np0005541913.localdomain podman[305026]: 2025-12-02 09:58:51.513044328 +0000 UTC m=+0.036871609 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:51 np0005541913.localdomain podman[305026]: 2025-12-02 09:58:51.620964075 +0000 UTC m=+0.144791286 container start 7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_golick, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container)
Dec 02 09:58:51 np0005541913.localdomain podman[305026]: 2025-12-02 09:58:51.621244072 +0000 UTC m=+0.145071273 container attach 7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_golick, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 02 09:58:51 np0005541913.localdomain charming_golick[305044]: 167 167
Dec 02 09:58:51 np0005541913.localdomain systemd[1]: libpod-7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595.scope: Deactivated successfully.
Dec 02 09:58:51 np0005541913.localdomain podman[305026]: 2025-12-02 09:58:51.624854579 +0000 UTC m=+0.148681800 container died 7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_golick, release=1763362218, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container)
Dec 02 09:58:51 np0005541913.localdomain podman[305049]: 2025-12-02 09:58:51.721909646 +0000 UTC m=+0.085378975 container remove 7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_golick, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, ceph=True)
Dec 02 09:58:51 np0005541913.localdomain systemd[1]: libpod-conmon-7f0ab745fe4f63d6d07ecbcdbd98db4d8c61bf630adbe4b627aa3675662d3595.scope: Deactivated successfully.
Dec 02 09:58:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:51.773 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:58:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:51.774 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:58:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8fd105a16b88c1101e745e78ce212ec91e56d84cd378b4e67ee076944015e6d0-merged.mount: Deactivated successfully.
Dec 02 09:58:51 np0005541913.localdomain sudo[304971]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:51 np0005541913.localdomain sudo[305066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:51 np0005541913.localdomain sudo[305066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:51 np0005541913.localdomain sudo[305066]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.024 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:58:52 np0005541913.localdomain sudo[305084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.028 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11645MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:58:52 np0005541913.localdomain sudo[305084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.028 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.029 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 0 B/s wr, 12 op/s
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541913.maexpe (monmap changed)...
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541913.maexpe on np0005541913.localdomain
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/481852052' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541913.mfesdm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.432 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.432 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.433 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:58:52 np0005541913.localdomain podman[305119]: 
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.468 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:58:52 np0005541913.localdomain podman[305119]: 2025-12-02 09:58:52.478510381 +0000 UTC m=+0.065790901 container create fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_spence, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Dec 02 09:58:52 np0005541913.localdomain systemd[1]: Started libpod-conmon-fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce.scope.
Dec 02 09:58:52 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:58:52 np0005541913.localdomain podman[305119]: 2025-12-02 09:58:52.537238983 +0000 UTC m=+0.124519493 container init fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_spence, RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container)
Dec 02 09:58:52 np0005541913.localdomain podman[305119]: 2025-12-02 09:58:52.444217324 +0000 UTC m=+0.031497914 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:58:52 np0005541913.localdomain podman[305119]: 2025-12-02 09:58:52.545867754 +0000 UTC m=+0.133148264 container start fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_spence, io.buildah.version=1.41.4, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, name=rhceph)
Dec 02 09:58:52 np0005541913.localdomain podman[305119]: 2025-12-02 09:58:52.545972827 +0000 UTC m=+0.133253337 container attach fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_spence, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 09:58:52 np0005541913.localdomain crazy_spence[305134]: 167 167
Dec 02 09:58:52 np0005541913.localdomain systemd[1]: libpod-fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce.scope: Deactivated successfully.
Dec 02 09:58:52 np0005541913.localdomain podman[305119]: 2025-12-02 09:58:52.549837501 +0000 UTC m=+0.137118061 container died fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_spence, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, RELEASE=main)
Dec 02 09:58:52 np0005541913.localdomain podman[305139]: 2025-12-02 09:58:52.615197189 +0000 UTC m=+0.059941284 container remove fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_spence, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 09:58:52 np0005541913.localdomain systemd[1]: libpod-conmon-fd3e0dd49df6381d5475dccb563881dc12cf94581bfbd88e1a6ce5cea5a4b8ce.scope: Deactivated successfully.
Dec 02 09:58:52 np0005541913.localdomain sudo[305084]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-2e9e9dea0bcca0663144b7219617296296099a694380acd88c4f301816f08291-merged.mount: Deactivated successfully.
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:58:52 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1246894869' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.898 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:58:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:52.906 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:58:53 np0005541913.localdomain ceph-mon[298296]: from='client.64127 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:58:53 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541913.mfesdm (monmap changed)...
Dec 02 09:58:53 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541913.mfesdm on np0005541913.localdomain
Dec 02 09:58:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005541914.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 02 09:58:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:53 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1246894869' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:58:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:53.146 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:53.307 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:58:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:53.309 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:58:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:53.310 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:58:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:54.144 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:54.148 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:54.149 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:54.150 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:58:54 np0005541913.localdomain ceph-mon[298296]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Dec 02 09:58:54 np0005541913.localdomain ceph-mon[298296]: Reconfiguring crash.np0005541914 (monmap changed)...
Dec 02 09:58:54 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon crash.np0005541914 on np0005541914.localdomain
Dec 02 09:58:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 02 09:58:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.1 (monmap changed)...
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.1 on np0005541914.localdomain
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: from='client.64130 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: Saving service mon spec with placement label:mon
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: Reconfiguring osd.4 (monmap changed)...
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon osd.4 on np0005541914.localdomain
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005541914.sqgqkj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 02 09:58:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:57 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mds.mds.np0005541914.sqgqkj (monmap changed)...
Dec 02 09:58:57 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mds.mds.np0005541914.sqgqkj on np0005541914.localdomain
Dec 02 09:58:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005541914.lljzmk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 02 09:58:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "mgr services"} : dispatch
Dec 02 09:58:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:57 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1267034607' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:58:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:58.184 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mgr.np0005541914.lljzmk (monmap changed)...
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mgr.np0005541914.lljzmk on np0005541914.localdomain
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2343414930' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2732574246' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:58:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3324502342' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:58:58 np0005541913.localdomain sudo[305175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 09:58:58 np0005541913.localdomain sudo[305175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:58 np0005541913.localdomain sudo[305175]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:58:59.147 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mon.np0005541914 (monmap changed)...
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mon.np0005541914 on np0005541914.localdomain
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:58:59 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:58:59 np0005541913.localdomain sudo[305193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 09:58:59 np0005541913.localdomain sudo[305193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:58:59 np0005541913.localdomain sudo[305193]: pam_unix(sudo:session): session closed for user root
Dec 02 09:58:59 np0005541913.localdomain sudo[305211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 09:58:59 np0005541913.localdomain sudo[305211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 09:59:00 np0005541913.localdomain podman[305245]: 
Dec 02 09:59:00 np0005541913.localdomain podman[305245]: 2025-12-02 09:59:00.217920647 +0000 UTC m=+0.072220994 container create d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_curran, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Dec 02 09:59:00 np0005541913.localdomain systemd[1]: Started libpod-conmon-d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e.scope.
Dec 02 09:59:00 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 09:59:00 np0005541913.localdomain podman[305245]: 2025-12-02 09:59:00.185568601 +0000 UTC m=+0.039869008 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 09:59:00 np0005541913.localdomain podman[305245]: 2025-12-02 09:59:00.285930927 +0000 UTC m=+0.140231274 container init d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_curran, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, architecture=x86_64)
Dec 02 09:59:00 np0005541913.localdomain podman[305245]: 2025-12-02 09:59:00.298145433 +0000 UTC m=+0.152445760 container start d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_curran, version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, release=1763362218, vcs-type=git, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 09:59:00 np0005541913.localdomain podman[305245]: 2025-12-02 09:59:00.29836741 +0000 UTC m=+0.152667797 container attach d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_curran, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, RELEASE=main, vcs-type=git, io.buildah.version=1.41.4)
Dec 02 09:59:00 np0005541913.localdomain practical_curran[305261]: 167 167
Dec 02 09:59:00 np0005541913.localdomain systemd[1]: libpod-d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e.scope: Deactivated successfully.
Dec 02 09:59:00 np0005541913.localdomain podman[305245]: 2025-12-02 09:59:00.302318365 +0000 UTC m=+0.156618742 container died d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_curran, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, RELEASE=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main)
Dec 02 09:59:00 np0005541913.localdomain podman[305266]: 2025-12-02 09:59:00.407749377 +0000 UTC m=+0.091341846 container remove d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_curran, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, version=7)
Dec 02 09:59:00 np0005541913.localdomain systemd[1]: libpod-conmon-d20cd8e7b21d40833ca3f831eaa8bdb5120a13999df4e5f1b12d7edb07be8b7e.scope: Deactivated successfully.
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: from='client.44508 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005541914", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mon.np0005541912 (monmap changed)...
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mon.np0005541912 on np0005541912.localdomain
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: Reconfiguring mon.np0005541913 (monmap changed)...
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: Reconfiguring daemon mon.np0005541913 on np0005541913.localdomain
Dec 02 09:59:00 np0005541913.localdomain sudo[305211]: pam_unix(sudo:session): session closed for user root
Dec 02 09:59:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:01 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b9f949ad2f04a27d9de6982423a72e234a2336dade60524293046a2d4dbe23ad-merged.mount: Deactivated successfully.
Dec 02 09:59:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:59:01 np0005541913.localdomain ceph-mon[298296]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:59:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 09:59:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:59:02 np0005541913.localdomain systemd[1]: tmp-crun.zpADST.mount: Deactivated successfully.
Dec 02 09:59:02 np0005541913.localdomain podman[305282]: 2025-12-02 09:59:02.461602424 +0000 UTC m=+0.099393571 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 02 09:59:02 np0005541913.localdomain podman[305282]: 2025-12-02 09:59:02.478037904 +0000 UTC m=+0.115829041 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 09:59:02 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:59:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:59:03.043 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:59:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:59:03.044 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:59:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 09:59:03.045 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:59:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:03.189 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:03 np0005541913.localdomain ceph-mon[298296]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:59:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:59:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:59:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:59:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:59:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:59:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:59:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:04.150 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/747153591' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 09:59:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/747153591' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 09:59:05 np0005541913.localdomain ceph-mon[298296]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:05 np0005541913.localdomain ceph-mon[298296]: mgrmap e37: np0005541912.qwddia(active, since 25s), standbys: np0005541914.lljzmk, np0005541913.mfesdm
Dec 02 09:59:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:06 np0005541913.localdomain podman[240799]: time="2025-12-02T09:59:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:59:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:59:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:59:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:59:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18728 "" "Go-http-client/1.1"
Dec 02 09:59:07 np0005541913.localdomain ceph-mon[298296]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:08.243 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:59:08 np0005541913.localdomain podman[305301]: 2025-12-02 09:59:08.457923068 +0000 UTC m=+0.089187978 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 02 09:59:08 np0005541913.localdomain podman[305301]: 2025-12-02 09:59:08.492217716 +0000 UTC m=+0.123482596 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:59:08 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:59:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:09.153 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:09 np0005541913.localdomain ceph-mon[298296]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:59:10 np0005541913.localdomain systemd[1]: tmp-crun.tcT7zL.mount: Deactivated successfully.
Dec 02 09:59:10 np0005541913.localdomain podman[305319]: 2025-12-02 09:59:10.439489433 +0000 UTC m=+0.082136010 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public)
Dec 02 09:59:10 np0005541913.localdomain podman[305319]: 2025-12-02 09:59:10.477333405 +0000 UTC m=+0.119979972 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 09:59:10 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:59:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:59:11 np0005541913.localdomain systemd[1]: tmp-crun.olU9q9.mount: Deactivated successfully.
Dec 02 09:59:11 np0005541913.localdomain podman[305338]: 2025-12-02 09:59:11.450762802 +0000 UTC m=+0.090186555 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 09:59:11 np0005541913.localdomain podman[305338]: 2025-12-02 09:59:11.458866318 +0000 UTC m=+0.098290141 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 09:59:11 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:59:11 np0005541913.localdomain ceph-mon[298296]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:13.246 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:13 np0005541913.localdomain ceph-mon[298296]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:14.173 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:15 np0005541913.localdomain ceph-mon[298296]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:16 np0005541913.localdomain systemd[299560]: Starting Mark boot as successful...
Dec 02 09:59:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:59:16 np0005541913.localdomain systemd[299560]: Finished Mark boot as successful.
Dec 02 09:59:16 np0005541913.localdomain systemd[1]: tmp-crun.rTwnU7.mount: Deactivated successfully.
Dec 02 09:59:16 np0005541913.localdomain podman[305361]: 2025-12-02 09:59:16.430797658 +0000 UTC m=+0.069779998 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3)
Dec 02 09:59:16 np0005541913.localdomain podman[305361]: 2025-12-02 09:59:16.445807061 +0000 UTC m=+0.084789411 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 09:59:16 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:59:17 np0005541913.localdomain ceph-mon[298296]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:18.286 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:19.176 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:59:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:59:19 np0005541913.localdomain podman[305382]: 2025-12-02 09:59:19.501478666 +0000 UTC m=+0.137149931 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 09:59:19 np0005541913.localdomain podman[305381]: 2025-12-02 09:59:19.477283489 +0000 UTC m=+0.116260692 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:59:19 np0005541913.localdomain podman[305382]: 2025-12-02 09:59:19.543089149 +0000 UTC m=+0.178760404 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 09:59:19 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:59:19 np0005541913.localdomain podman[305381]: 2025-12-02 09:59:19.561163414 +0000 UTC m=+0.200140597 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:59:19 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:59:19 np0005541913.localdomain ceph-mon[298296]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:21 np0005541913.localdomain ceph-mon[298296]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:23.288 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:23 np0005541913.localdomain ceph-mon[298296]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:24.179 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:25 np0005541913.localdomain ceph-mon[298296]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:27 np0005541913.localdomain ceph-mon[298296]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:28.354 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:29.182 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:29 np0005541913.localdomain ceph-mon[298296]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:31 np0005541913.localdomain ceph-mon[298296]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 09:59:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:33.358 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:33 np0005541913.localdomain systemd[1]: tmp-crun.XO446X.mount: Deactivated successfully.
Dec 02 09:59:33 np0005541913.localdomain podman[305431]: 2025-12-02 09:59:33.52266967 +0000 UTC m=+0.155823283 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 09:59:33 np0005541913.localdomain ceph-mon[298296]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:33 np0005541913.localdomain podman[305431]: 2025-12-02 09:59:33.566757362 +0000 UTC m=+0.199910995 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 09:59:33 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 09:59:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:59:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 09:59:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 09:59:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 09:59:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:59:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   09:59:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 09:59:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 09:59:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:34.185 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:35 np0005541913.localdomain ceph-mon[298296]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:36 np0005541913.localdomain podman[240799]: time="2025-12-02T09:59:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 09:59:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:59:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 09:59:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:09:59:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18723 "" "Go-http-client/1.1"
Dec 02 09:59:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 02 09:59:36 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2480513664' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:59:37 np0005541913.localdomain ceph-mon[298296]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:37 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/2480513664' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 02 09:59:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:38.429 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 09:59:39 np0005541913.localdomain podman[305450]: 2025-12-02 09:59:39.1713339 +0000 UTC m=+0.087933180 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 09:59:39 np0005541913.localdomain podman[305450]: 2025-12-02 09:59:39.181161653 +0000 UTC m=+0.097760923 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 02 09:59:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:39.191 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:39 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 09:59:39 np0005541913.localdomain ceph-mon[298296]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:40 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 09:59:41 np0005541913.localdomain podman[305467]: 2025-12-02 09:59:41.435773855 +0000 UTC m=+0.079846893 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, release=1755695350, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 09:59:41 np0005541913.localdomain podman[305467]: 2025-12-02 09:59:41.476094857 +0000 UTC m=+0.120167845 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Dec 02 09:59:41 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 09:59:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 09:59:41 np0005541913.localdomain systemd[1]: tmp-crun.87RCAL.mount: Deactivated successfully.
Dec 02 09:59:41 np0005541913.localdomain podman[305488]: 2025-12-02 09:59:41.623816731 +0000 UTC m=+0.102082690 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:59:41 np0005541913.localdomain podman[305488]: 2025-12-02 09:59:41.637267392 +0000 UTC m=+0.115533331 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 09:59:41 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 09:59:41 np0005541913.localdomain ceph-mon[298296]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:43.430 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:43.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:43 np0005541913.localdomain ceph-mon[298296]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:44.191 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:44 np0005541913.localdomain ceph-mon[298296]: from='client.64157 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:59:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:45 np0005541913.localdomain ceph-mon[298296]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 09:59:47 np0005541913.localdomain podman[305512]: 2025-12-02 09:59:47.447263591 +0000 UTC m=+0.087052957 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 09:59:47 np0005541913.localdomain podman[305512]: 2025-12-02 09:59:47.482873737 +0000 UTC m=+0.122663123 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 09:59:47 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 09:59:47 np0005541913.localdomain ceph-mon[298296]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:48.466 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:49.193 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:49.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:49.917 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:49.917 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 09:59:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:49.918 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 09:59:49 np0005541913.localdomain ceph-mon[298296]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 09:59:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 09:59:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:50.414 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 09:59:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:50.414 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 09:59:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:50.415 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 09:59:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:50.415 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 09:59:50 np0005541913.localdomain podman[305532]: 2025-12-02 09:59:50.443443723 +0000 UTC m=+0.084150829 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:59:50 np0005541913.localdomain podman[305532]: 2025-12-02 09:59:50.454976662 +0000 UTC m=+0.095683818 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 09:59:50 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 09:59:50 np0005541913.localdomain podman[305533]: 2025-12-02 09:59:50.542712096 +0000 UTC m=+0.180567186 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 09:59:50 np0005541913.localdomain podman[305533]: 2025-12-02 09:59:50.605037538 +0000 UTC m=+0.242892578 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 09:59:50 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 09:59:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:50.948 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.059 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.059 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.060 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.060 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 09:59:51 np0005541913.localdomain ceph-mon[298296]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:51 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/4215452679' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.901 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.902 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.903 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.903 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 09:59:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:51.904 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:59:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:59:52 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3862452935' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.372 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.436 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.437 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 09:59:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3862452935' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.652 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.653 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11690MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.654 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.654 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.807 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.808 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.809 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 09:59:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:52.842 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 09:59:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 09:59:53 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3342764562' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:59:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:53.293 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 09:59:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:53.300 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 09:59:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:53.406 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 09:59:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:53.408 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 09:59:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:53.408 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 09:59:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:53.468 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:53 np0005541913.localdomain ceph-mon[298296]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:53 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3342764562' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:59:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:54.196 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:54.409 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:54.409 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:54.410 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:54.410 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:54.824 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 09:59:55 np0005541913.localdomain ceph-mon[298296]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 09:59:57 np0005541913.localdomain ceph-mon[298296]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:57 np0005541913.localdomain ceph-mon[298296]: from='client.64175 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 02 09:59:57 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2437377944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:59:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:58.506 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2802866471' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:59:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/590004416' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 09:59:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 09:59:59.198 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 09:59:59 np0005541913.localdomain ceph-mon[298296]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 09:59:59 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3653965259' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:00:00 np0005541913.localdomain sudo[305627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:00:00 np0005541913.localdomain sudo[305627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:00 np0005541913.localdomain sudo[305627]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:00 np0005541913.localdomain sudo[305645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:00:00 np0005541913.localdomain sudo[305645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:00 np0005541913.localdomain ceph-mon[298296]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 02 10:00:00 np0005541913.localdomain ceph-mon[298296]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 02 10:00:00 np0005541913.localdomain ceph-mon[298296]:     stray daemon mgr.np0005541911.adcgiw on host np0005541911.localdomain not managed by cephadm
Dec 02 10:00:00 np0005541913.localdomain ceph-mon[298296]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 02 10:00:00 np0005541913.localdomain ceph-mon[298296]:     stray host np0005541911.localdomain has 1 stray daemons: ['mgr.np0005541911.adcgiw']
Dec 02 10:00:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:01 np0005541913.localdomain sudo[305645]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:01 np0005541913.localdomain sudo[305695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:00:01 np0005541913.localdomain sudo[305695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:01 np0005541913.localdomain sudo[305695]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:01 np0005541913.localdomain ceph-mon[298296]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:00:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:00:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 10:00:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:00:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:00:03.044 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:00:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:00:03.045 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:00:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:00:03.047 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:00:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:03.544 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:03 np0005541913.localdomain ceph-mon[298296]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/2832629924' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:00:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:00:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:00:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:00:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:00:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:00:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:00:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:00:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:04.200 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:00:04 np0005541913.localdomain podman[305713]: 2025-12-02 10:00:04.449126209 +0000 UTC m=+0.086420031 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 02 10:00:04 np0005541913.localdomain podman[305713]: 2025-12-02 10:00:04.458361108 +0000 UTC m=+0.095654899 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:00:04 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:00:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2838133861' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:00:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2838133861' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.491984) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669605492034, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2689, "num_deletes": 255, "total_data_size": 7858319, "memory_usage": 8109440, "flush_reason": "Manual Compaction"}
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669605525162, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 4761463, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14440, "largest_seqno": 17124, "table_properties": {"data_size": 4750663, "index_size": 6729, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26320, "raw_average_key_size": 22, "raw_value_size": 4727507, "raw_average_value_size": 3976, "num_data_blocks": 293, "num_entries": 1189, "num_filter_entries": 1189, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669502, "oldest_key_time": 1764669502, "file_creation_time": 1764669605, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 33246 microseconds, and 11601 cpu microseconds.
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.525227) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 4761463 bytes OK
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.525258) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.527263) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.527286) EVENT_LOG_v1 {"time_micros": 1764669605527280, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.527313) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 7845517, prev total WAL file size 7845517, number of live WAL files 2.
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.529661) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(4649KB)], [21(15MB)]
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669605529714, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 21088372, "oldest_snapshot_seqno": -1}
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 12081 keys, 18532366 bytes, temperature: kUnknown
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669605619148, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 18532366, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18460848, "index_size": 40249, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30213, "raw_key_size": 322973, "raw_average_key_size": 26, "raw_value_size": 18252231, "raw_average_value_size": 1510, "num_data_blocks": 1542, "num_entries": 12081, "num_filter_entries": 12081, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764669605, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.619542) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 18532366 bytes
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.621503) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.4 rd, 206.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.5, 15.6 +0.0 blob) out(17.7 +0.0 blob), read-write-amplify(8.3) write-amplify(3.9) OK, records in: 12628, records dropped: 547 output_compression: NoCompression
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.621537) EVENT_LOG_v1 {"time_micros": 1764669605621522, "job": 10, "event": "compaction_finished", "compaction_time_micros": 89574, "compaction_time_cpu_micros": 34056, "output_level": 6, "num_output_files": 1, "total_output_size": 18532366, "num_input_records": 12628, "num_output_records": 12081, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.529544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.621790) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.621797) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.621801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.621806) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:05.621809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.26660 172.18.0.106:0/2630977033' entity='mgr.np0005541912.qwddia' 
Dec 02 10:00:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:00:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:00:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:00:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:00:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:00:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18723 "" "Go-http-client/1.1"
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1313402171' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 e91: 6 total, 6 up, 6 in
Dec 02 10:00:06 np0005541913.localdomain sshd[303653]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 02 10:00:06 np0005541913.localdomain systemd[1]: session-71.scope: Deactivated successfully.
Dec 02 10:00:06 np0005541913.localdomain systemd[1]: session-71.scope: Consumed 10.596s CPU time.
Dec 02 10:00:06 np0005541913.localdomain systemd-logind[757]: Session 71 logged out. Waiting for processes to exit.
Dec 02 10:00:06 np0005541913.localdomain systemd-logind[757]: Removed session 71.
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: Activating manager daemon np0005541914.lljzmk
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.200:0/1313402171' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: osdmap e91: 6 total, 6 up, 6 in
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: mgrmap e38: np0005541914.lljzmk(active, starting, since 0.0398616s), standbys: np0005541913.mfesdm
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541912"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541913"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata", "id": "np0005541914"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mds metadata", "who": "mds.np0005541914.sqgqkj"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mds metadata", "who": "mds.np0005541913.maexpe"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mds metadata", "who": "mds.np0005541912.ghcwcm"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541914.lljzmk", "id": "np0005541914.lljzmk"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541913.mfesdm", "id": "np0005541913.mfesdm"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mds metadata"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mon metadata"} : dispatch
Dec 02 10:00:06 np0005541913.localdomain ceph-mon[298296]: Manager daemon np0005541914.lljzmk is now available
Dec 02 10:00:07 np0005541913.localdomain sshd[305730]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:00:07 np0005541913.localdomain sshd[305730]: Accepted publickey for ceph-admin from 192.168.122.108 port 46522 ssh2: RSA SHA256:/YNuT9SDdDKzRDkfkhY38xwkOxxhXakQO/p8xUCPPz0
Dec 02 10:00:07 np0005541913.localdomain systemd-logind[757]: New session 72 of user ceph-admin.
Dec 02 10:00:07 np0005541913.localdomain systemd[1]: Started Session 72 of User ceph-admin.
Dec 02 10:00:07 np0005541913.localdomain sshd[305730]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 02 10:00:07 np0005541913.localdomain sudo[305734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:00:07 np0005541913.localdomain sudo[305734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:07 np0005541913.localdomain sudo[305734]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:07 np0005541913.localdomain sudo[305752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 10:00:07 np0005541913.localdomain sudo[305752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541914.lljzmk/mirror_snapshot_schedule"} : dispatch
Dec 02 10:00:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541914.lljzmk/mirror_snapshot_schedule"} : dispatch
Dec 02 10:00:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541914.lljzmk/trash_purge_schedule"} : dispatch
Dec 02 10:00:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005541914.lljzmk/trash_purge_schedule"} : dispatch
Dec 02 10:00:07 np0005541913.localdomain ceph-mon[298296]: mgrmap e39: np0005541914.lljzmk(active, since 1.04514s), standbys: np0005541913.mfesdm
Dec 02 10:00:08 np0005541913.localdomain systemd[1]: tmp-crun.hfbLnO.mount: Deactivated successfully.
Dec 02 10:00:08 np0005541913.localdomain podman[305841]: 2025-12-02 10:00:08.198569905 +0000 UTC m=+0.101629849 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 02 10:00:08 np0005541913.localdomain podman[305841]: 2025-12-02 10:00:08.291334464 +0000 UTC m=+0.194394448 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, release=1763362218, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, architecture=x86_64, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7)
Dec 02 10:00:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:08.593 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:08 np0005541913.localdomain sudo[305752]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:08 np0005541913.localdomain sudo[305960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:00:08 np0005541913.localdomain sudo[305960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:08 np0005541913.localdomain sudo[305960]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:08 np0005541913.localdomain ceph-mon[298296]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:08 np0005541913.localdomain ceph-mon[298296]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 02 10:00:08 np0005541913.localdomain ceph-mon[298296]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 02 10:00:08 np0005541913.localdomain ceph-mon[298296]: Cluster is now healthy
Dec 02 10:00:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:08 np0005541913.localdomain sudo[305978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:00:08 np0005541913.localdomain sudo[305978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:09.202 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:00:09 np0005541913.localdomain podman[306011]: 2025-12-02 10:00:09.455739649 +0000 UTC m=+0.083372367 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Dec 02 10:00:09 np0005541913.localdomain podman[306011]: 2025-12-02 10:00:09.465221745 +0000 UTC m=+0.092854523 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:00:09 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:00:09 np0005541913.localdomain sudo[305978]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:09 np0005541913.localdomain sudo[306047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:00:09 np0005541913.localdomain sudo[306047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:09 np0005541913.localdomain sudo[306047]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:09 np0005541913.localdomain sudo[306065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 02 10:00:09 np0005541913.localdomain sudo[306065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:09 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:10:00:08] ENGINE Bus STARTING
Dec 02 10:00:09 np0005541913.localdomain ceph-mon[298296]: mgrmap e40: np0005541914.lljzmk(active, since 2s), standbys: np0005541913.mfesdm
Dec 02 10:00:09 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:10:00:09] ENGINE Serving on http://172.18.0.108:8765
Dec 02 10:00:09 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:09 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:10:00:09] ENGINE Serving on https://172.18.0.108:7150
Dec 02 10:00:09 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:10:00:09] ENGINE Bus STARTED
Dec 02 10:00:09 np0005541913.localdomain ceph-mon[298296]: [02/Dec/2025:10:00:09] ENGINE Client ('172.18.0.108', 52382) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 02 10:00:09 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:10 np0005541913.localdomain sudo[306065]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.558204) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669610558273, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 490, "num_deletes": 252, "total_data_size": 1628548, "memory_usage": 1641288, "flush_reason": "Manual Compaction"}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669610569275, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1063710, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17129, "largest_seqno": 17614, "table_properties": {"data_size": 1060733, "index_size": 960, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 8165, "raw_average_key_size": 21, "raw_value_size": 1054438, "raw_average_value_size": 2819, "num_data_blocks": 38, "num_entries": 374, "num_filter_entries": 374, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669605, "oldest_key_time": 1764669605, "file_creation_time": 1764669610, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 11234 microseconds, and 3800 cpu microseconds.
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.569333) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1063710 bytes OK
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.569468) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.571391) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.571425) EVENT_LOG_v1 {"time_micros": 1764669610571413, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.571454) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 1625430, prev total WAL file size 1625754, number of live WAL files 2.
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669610572358, "job": 11, "event": "table_file_deletion", "file_number": 23}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669610574985, "job": 11, "event": "table_file_deletion", "file_number": 21}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.575307) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373536' seq:72057594037927935, type:22 .. '6D6772737461740034303038' seq:0, type:0; will stop at (end)
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1038KB)], [24(17MB)]
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669610575414, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 19596076, "oldest_snapshot_seqno": -1}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11923 keys, 17247392 bytes, temperature: kUnknown
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669610672180, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 17247392, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17181470, "index_size": 35037, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29829, "raw_key_size": 320049, "raw_average_key_size": 26, "raw_value_size": 16980125, "raw_average_value_size": 1424, "num_data_blocks": 1326, "num_entries": 11923, "num_filter_entries": 11923, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764669610, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.672560) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 17247392 bytes
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.674631) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.3 rd, 178.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 17.7 +0.0 blob) out(16.4 +0.0 blob), read-write-amplify(34.6) write-amplify(16.2) OK, records in: 12455, records dropped: 532 output_compression: NoCompression
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.674663) EVENT_LOG_v1 {"time_micros": 1764669610674649, "job": 12, "event": "compaction_finished", "compaction_time_micros": 96873, "compaction_time_cpu_micros": 50530, "output_level": 6, "num_output_files": 1, "total_output_size": 17247392, "num_input_records": 12455, "num_output_records": 11923, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669610674955, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669610677361, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.575179) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.677403) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.677410) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.677413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.677416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:00:10.677419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:00:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:10 np0005541913.localdomain sudo[306101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 10:00:10 np0005541913.localdomain sudo[306101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:10 np0005541913.localdomain sudo[306101]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:10 np0005541913.localdomain sudo[306119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 10:00:10 np0005541913.localdomain sudo[306119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:10 np0005541913.localdomain sudo[306119]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:10 np0005541913.localdomain sudo[306137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 10:00:10 np0005541913.localdomain sudo[306137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:10 np0005541913.localdomain sudo[306137]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain sudo[306155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 10:00:11 np0005541913.localdomain sudo[306155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306155]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain sudo[306173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 10:00:11 np0005541913.localdomain sudo[306173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306173]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain sudo[306207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 10:00:11 np0005541913.localdomain sudo[306207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:00:11 np0005541913.localdomain sudo[306207]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain sudo[306225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new
Dec 02 10:00:11 np0005541913.localdomain sudo[306225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306225]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain sudo[306243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 02 10:00:11 np0005541913.localdomain sudo[306243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306243]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain sudo[306261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 10:00:11 np0005541913.localdomain sudo[306261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306261]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:00:11 np0005541913.localdomain sudo[306285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 10:00:11 np0005541913.localdomain sudo[306285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain systemd[1]: tmp-crun.Jz9151.mount: Deactivated successfully.
Dec 02 10:00:11 np0005541913.localdomain sudo[306285]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain podman[306279]: 2025-12-02 10:00:11.624194349 +0000 UTC m=+0.095884614 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal)
Dec 02 10:00:11 np0005541913.localdomain podman[306279]: 2025-12-02 10:00:11.641898335 +0000 UTC m=+0.113588600 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:00:11 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:00:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:00:11 np0005541913.localdomain sudo[306317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 10:00:11 np0005541913.localdomain sudo[306317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306317]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain sudo[306341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 10:00:11 np0005541913.localdomain sudo[306341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306341]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain podman[306334]: 2025-12-02 10:00:11.771669247 +0000 UTC m=+0.092030001 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:00:11 np0005541913.localdomain podman[306334]: 2025-12-02 10:00:11.783945477 +0000 UTC m=+0.104306241 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:00:11 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:00:11 np0005541913.localdomain sudo[306374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 10:00:11 np0005541913.localdomain sudo[306374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306374]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:11 np0005541913.localdomain sudo[306408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 10:00:11 np0005541913.localdomain sudo[306408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:11 np0005541913.localdomain sudo[306408]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new
Dec 02 10:00:12 np0005541913.localdomain sudo[306426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306426]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 10:00:12 np0005541913.localdomain sudo[306444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306444]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 02 10:00:12 np0005541913.localdomain sudo[306462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306462]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph
Dec 02 10:00:12 np0005541913.localdomain sudo[306480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306480]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 10:00:12 np0005541913.localdomain sudo[306498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306498]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.conf
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.conf
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.conf
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.conf
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: mgrmap e41: np0005541914.lljzmk(active, since 4s), standbys: np0005541913.mfesdm
Dec 02 10:00:12 np0005541913.localdomain ceph-mon[298296]: Standby manager daemon np0005541912.qwddia started
Dec 02 10:00:12 np0005541913.localdomain sudo[306516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 10:00:12 np0005541913.localdomain sudo[306516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306516]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 10:00:12 np0005541913.localdomain sudo[306534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306534]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 10:00:12 np0005541913.localdomain sudo[306568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306568]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new
Dec 02 10:00:12 np0005541913.localdomain sudo[306586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306586]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 02 10:00:12 np0005541913.localdomain sudo[306604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306604]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 10:00:12 np0005541913.localdomain sudo[306622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306622]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config
Dec 02 10:00:12 np0005541913.localdomain sudo[306640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306640]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 10:00:12 np0005541913.localdomain sudo[306658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306658]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:12 np0005541913.localdomain sudo[306676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074
Dec 02 10:00:12 np0005541913.localdomain sudo[306676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:12 np0005541913.localdomain sudo[306676]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:13 np0005541913.localdomain sudo[306694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 10:00:13 np0005541913.localdomain sudo[306694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:13 np0005541913.localdomain sudo[306694]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:13 np0005541913.localdomain sudo[306728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 10:00:13 np0005541913.localdomain sudo[306728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:13 np0005541913.localdomain sudo[306728]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:13 np0005541913.localdomain sudo[306746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new
Dec 02 10:00:13 np0005541913.localdomain sudo[306746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:13 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 10:00:13 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 10:00:13 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 02 10:00:13 np0005541913.localdomain ceph-mon[298296]: mgrmap e42: np0005541914.lljzmk(active, since 5s), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:00:13 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "mgr metadata", "who": "np0005541912.qwddia", "id": "np0005541912.qwddia"} : dispatch
Dec 02 10:00:13 np0005541913.localdomain sudo[306746]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:13 np0005541913.localdomain sudo[306764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-c7c8e171-a193-56fb-95fa-8879fcfa7074/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring.new /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 10:00:13 np0005541913.localdomain sudo[306764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:13 np0005541913.localdomain sudo[306764]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:13 np0005541913.localdomain sudo[306782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:00:13 np0005541913.localdomain sudo[306782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:13 np0005541913.localdomain sudo[306782]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:13.627 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:13 np0005541913.localdomain sudo[306800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:00:13 np0005541913.localdomain sudo[306800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:00:13 np0005541913.localdomain sudo[306800]: pam_unix(sudo:session): session closed for user root
Dec 02 10:00:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:14.205 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: Updating np0005541914.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: Updating np0005541912.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: Updating np0005541913.localdomain:/var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/config/ceph.client.admin.keyring
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:00:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.104 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.119 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.120 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b6b5a37-3a03-4761-8c60-9d7d8bd8f95b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.106208', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3a70d48-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.325281813, 'message_signature': '75a8f8e0ab12297ccd32ed35f456743ff8e540c0dde77c7d5b7a26bb266ede8a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.106208', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3a72120-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.325281813, 'message_signature': 'd6c2d03db7755622c3c6f2565f6f8f66be29ce5caf135cd2a1aa8f2a82c7a42d'}]}, 'timestamp': '2025-12-02 10:00:16.120880', '_unique_id': '667d3e2fdf5c4208afd7803c5b2aa8c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.122 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.127 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdfb64d9-9dff-4535-b673-f6bc1dd7acad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.123898', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3a84726-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': 'd7baa162f90ca88b0b1a0aec7466953aaee669de0293ef2807e5d1a8a183b4c8'}]}, 'timestamp': '2025-12-02 10:00:16.128419', '_unique_id': '89df3b085c0d4835b3d19ff089355d26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.129 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.159 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.159 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95c65201-1a38-4bf4-bef3-50aff3f97b52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.130898', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3ad160c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': 'dfb7526caa75aa8ab5096d5deb718bab05775c8f68a76c450272efdf05afc95e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.130898', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3ad27a0-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': 'd22c6ba064e1fc200f367de71c1f7a034fae18c3f1f528374ab1d970e0d0e22c'}]}, 'timestamp': '2025-12-02 10:00:16.160356', '_unique_id': 'e08b5d254bf148769640a6cf25bbe62d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.162 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8849388f-1751-4059-8f39-dce0ecd26c2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.162706', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3ad97b2-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': 'ff94a62108827785206381686b7e5d4d83552bce5cf4c8cf94c059f0c42437c1'}]}, 'timestamp': '2025-12-02 10:00:16.163257', '_unique_id': '692e36a6474c4c2eb35bc2b17f92e200'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.165 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77bf126b-caa6-422f-839c-cb5d2eeb6f72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.165470', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3ae021a-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': '35b945549208b309173ac431beb469180e005d3ecd60f370b67a6876e49f5932'}]}, 'timestamp': '2025-12-02 10:00:16.165994', '_unique_id': '68948cce943c412e858c5c17c6e6a2f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.166 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.168 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.169 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c783bc7-f075-40a7-8a8a-1594028124e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.168869', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3ae892e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': 'd03761be20919156bb8e4180b0beb24446c312d7c720d726c7e505cc5ae19683'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.168869', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3aea256-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': 'a3b26ffddff9c337dba32e6a4facf2935c8a18d1547ea3226dbaffb66b7e4915'}]}, 'timestamp': '2025-12-02 10:00:16.170129', '_unique_id': 'fb23a52902714366a1428828ea9346d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.172 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.173 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77f45bfa-190a-426f-847d-06170e5e58ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.172676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3af1a92-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.325281813, 'message_signature': '8f16ec67e976340b3dee0404910398dbfe074fd38c56a71a0ef01050ef092ab7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.172676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3af2cee-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.325281813, 'message_signature': '4265b989034e1e66b1dcb5bc0007985b94a93dd96ca77017d996b5e5b92b2b32'}]}, 'timestamp': '2025-12-02 10:00:16.173601', '_unique_id': '7941c4a1e7334b3b982a32a5aba386a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.175 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e619f95d-da3f-4763-8b83-d82aefd0f76a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.175905', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3af98aa-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': 'f8d2ccac784a6fd6902c17ea58eded5ae8c29e3befce4d12842d8ae7e3232b89'}]}, 'timestamp': '2025-12-02 10:00:16.176503', '_unique_id': 'e9f91d927a7f49b48a36e5f72656cfc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.178 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03f79bb5-011a-439c-bbdc-b0c4823e39a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.178759', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3b00952-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': '87deb06bdeb2cfc20e36168b83aa5a7c65d103cc09823df51455698515c980e0'}]}, 'timestamp': '2025-12-02 10:00:16.179270', '_unique_id': 'a441fc86ef97402ea97610159659a8d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.181 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.182 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c20efe2-f163-4f9b-8dcb-63245ceeb833', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.181515', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3b075fe-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.325281813, 'message_signature': 'a13a9ef0edb2546c2fd1bc4181093794cf23fb953261aab3a1cfc9023b48521c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.181515', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3b087f6-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.325281813, 'message_signature': 'd211d61dcd3637f4aa44540c0c94827b1636f11e2b8d7b4499dec71585ca7bd8'}]}, 'timestamp': '2025-12-02 10:00:16.182482', '_unique_id': '99fb58283e144af29cc9918818262c63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.184 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.184 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.185 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe47d1e6-f2ff-4f87-a919-945fe4823cd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.184872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3b0f68c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': '4cf597eefd6ea53755eda141f2a2aa8510accd2435c751468925e10b73766139'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.184872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3b106e0-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': '9ae455225eaa285467cfc858b19e904dfd32ae37df2a50ca4e7fdf6ba685e250'}]}, 'timestamp': '2025-12-02 10:00:16.185762', '_unique_id': '56bfb80bfa054187ad1b4fa49c8a5e06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '453991f9-19ff-43f7-81c0-33c01546933c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.188029', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3b171de-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': 'ffd4858640832c3c7ce1adcfe4c1e84b75db2726d9366b225e7451e176de779f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.188029', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3b183a4-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': '6f66a757264df726c4a95e565492b00e0edee7673a5ff7d96fc7e9f8e4c44499'}]}, 'timestamp': '2025-12-02 10:00:16.188920', '_unique_id': 'bbfa3f188214429daccc22ac979e2672'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.189 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.191 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d4f3f71-0ced-4bdb-9750-3d6b451a59af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.191146', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3b1ebe6-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': 'a1ccc66f86a60e8ea356e7bfc2a17baf2ab855f87b0ff949b70921bd09d0d244'}]}, 'timestamp': '2025-12-02 10:00:16.191651', '_unique_id': '5940a4e0be764d1b9c761ac75a3b585b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.193 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05bf63f6-fd8a-4fe3-bbad-f6e43f0d3dde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.193823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3b253f6-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': 'ebf836088d1638774bc18741a8c325e50857ed6b8833d457d917d1a3b941988d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.193823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3b265f8-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': 'd6204ebd362632ec993073058af6f553dbfca0866e582146115f392fe00470c2'}]}, 'timestamp': '2025-12-02 10:00:16.194752', '_unique_id': '0b300352f33c4b5ea4fcb173b1f78d0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.197 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6937baeb-c719-411a-88b7-6fe3c35ffdef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.196988', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3b2d010-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': 'a508d2140006d9dd1d5468b635085dd3066d42635a739140cea7f986e6929b7c'}]}, 'timestamp': '2025-12-02 10:00:16.197456', '_unique_id': '63e9d289d8fd4a75aa1b79eb8b80eeb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d4f811d-3523-40f6-9a2b-e30899bd3cd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:00:16.199659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b3b3385c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': 'daebb99083c8627ccb13bdc0727f3b686afb6502eb1b8bc6e1d857001d5de3a4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:00:16.199659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b3b348b0-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.350007386, 'message_signature': '39abba282f25c19c9afbb23bc1b20da398d91e41674a9db0ccafbeee2ad9f4e2'}]}, 'timestamp': '2025-12-02 10:00:16.200515', '_unique_id': '77a463a484e946a2b6d902754663f762'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.202 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.219 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae22b03e-30c7-468d-94bd-dbb47f1b6d96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:00:16.202882', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b3b62bde-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.438024119, 'message_signature': '24dbb8c3dd54b76c391c85badd8c7616e21bc21dd5dab676710674515318fd74'}]}, 'timestamp': '2025-12-02 10:00:16.219388', '_unique_id': 'd2e3643955ee46a88ea6c6eb61da5cd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfe1301b-d0a2-4ba3-88da-18eea969306e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.220944', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3b67468-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': '18cf9fa6e90893d3486488071f7cd27c75417efa36faa21b1eb7b3f4451bcd3d'}]}, 'timestamp': '2025-12-02 10:00:16.221238', '_unique_id': '509fe3f269744d26b2ce23fecc6d5e9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd06682d-f47c-42ff-aa57-442d47283ceb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.222979', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3b6c3f0-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': '59a838a37dc1038fcd385e1d10cbfe01abf05ebf753e94fa2ad086698045a9af'}]}, 'timestamp': '2025-12-02 10:00:16.223275', '_unique_id': '6522b5a688484d04b34b4d74878d3f99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.224 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.224 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 15220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8b70a51-8a82-4b0d-a366-15075f5a2be2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15220000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:00:16.224595', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b3b703c4-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.438024119, 'message_signature': '4e974438ac71afc62b2e63b66d14f8fdc43081a47beb00f6687efca9cd04da2b'}]}, 'timestamp': '2025-12-02 10:00:16.224898', '_unique_id': '4766a431b1514d84a8f28c0d03332b33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.226 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14afbf92-0fde-41c8-b50f-bc19de7573dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:00:16.226206', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'b3b741e0-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11778.342974778, 'message_signature': '932aa14a4d105dc5d991ca1831f7618690575360ea8c2d2294dd78b8fa88d450'}]}, 'timestamp': '2025-12-02 10:00:16.226497', '_unique_id': 'b1c8b157d67a4010a3e22055d2e9e9f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:00:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:00:16.227 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:00:16 np0005541913.localdomain ceph-mon[298296]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Dec 02 10:00:17 np0005541913.localdomain ceph-mon[298296]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Dec 02 10:00:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:00:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:00:18 np0005541913.localdomain podman[306818]: 2025-12-02 10:00:18.45385446 +0000 UTC m=+0.084394516 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:00:18 np0005541913.localdomain podman[306818]: 2025-12-02 10:00:18.46650764 +0000 UTC m=+0.097047686 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 10:00:18 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:00:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:18.665 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:19.209 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:19 np0005541913.localdomain ceph-mon[298296]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 02 10:00:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:00:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:00:21 np0005541913.localdomain podman[306838]: 2025-12-02 10:00:21.448377078 +0000 UTC m=+0.086509712 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:00:21 np0005541913.localdomain podman[306838]: 2025-12-02 10:00:21.460929575 +0000 UTC m=+0.099062199 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:00:21 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:00:21 np0005541913.localdomain podman[306839]: 2025-12-02 10:00:21.547816647 +0000 UTC m=+0.183320281 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller)
Dec 02 10:00:21 np0005541913.localdomain ceph-mon[298296]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 10:00:21 np0005541913.localdomain podman[306839]: 2025-12-02 10:00:21.610148169 +0000 UTC m=+0.245651773 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:00:21 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:00:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:23.718 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:23 np0005541913.localdomain ceph-mon[298296]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 10:00:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:24.214 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:25 np0005541913.localdomain ceph-mon[298296]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 02 10:00:27 np0005541913.localdomain ceph-mon[298296]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:28.758 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:29.215 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:29 np0005541913.localdomain ceph-mon[298296]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:31 np0005541913.localdomain ceph-mon[298296]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:33.789 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:33 np0005541913.localdomain ceph-mon[298296]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:00:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:00:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:00:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:00:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:00:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:00:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:00:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:00:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:34.218 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:00:35 np0005541913.localdomain podman[306886]: 2025-12-02 10:00:35.440556692 +0000 UTC m=+0.081833377 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:00:35 np0005541913.localdomain podman[306886]: 2025-12-02 10:00:35.451975238 +0000 UTC m=+0.093251913 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 02 10:00:35 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:00:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:35 np0005541913.localdomain ceph-mon[298296]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:00:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:00:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:00:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:00:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:00:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18721 "" "Go-http-client/1.1"
Dec 02 10:00:38 np0005541913.localdomain ceph-mon[298296]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:38.839 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:39.221 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:40 np0005541913.localdomain ceph-mon[298296]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:00:40 np0005541913.localdomain podman[306904]: 2025-12-02 10:00:40.440434431 +0000 UTC m=+0.080325787 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:00:40 np0005541913.localdomain podman[306904]: 2025-12-02 10:00:40.474975347 +0000 UTC m=+0.114866663 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:00:40 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:00:40 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:41 np0005541913.localdomain ceph-mon[298296]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:00:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:00:42 np0005541913.localdomain podman[306924]: 2025-12-02 10:00:42.443748339 +0000 UTC m=+0.086510353 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:00:42 np0005541913.localdomain podman[306924]: 2025-12-02 10:00:42.457055486 +0000 UTC m=+0.099817490 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:00:42 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:00:42 np0005541913.localdomain podman[306923]: 2025-12-02 10:00:42.550139574 +0000 UTC m=+0.195444516 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 10:00:42 np0005541913.localdomain podman[306923]: 2025-12-02 10:00:42.561853968 +0000 UTC m=+0.207158910 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., name=ubi9-minimal)
Dec 02 10:00:42 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:00:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:43.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:43.867 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:43 np0005541913.localdomain ceph-mon[298296]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:44.227 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:45 np0005541913.localdomain ceph-mon[298296]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:48 np0005541913.localdomain ceph-mon[298296]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:48.877 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:49.228 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:00:49 np0005541913.localdomain podman[306967]: 2025-12-02 10:00:49.435368627 +0000 UTC m=+0.078838577 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, config_id=multipathd)
Dec 02 10:00:49 np0005541913.localdomain podman[306967]: 2025-12-02 10:00:49.446323081 +0000 UTC m=+0.089792991 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:00:49 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:00:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:49.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:49.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:00:50 np0005541913.localdomain ceph-mon[298296]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:50.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:50.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:00:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:50.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:00:51 np0005541913.localdomain ceph-mon[298296]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:51.747 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:00:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:51.748 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:00:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:51.748 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:00:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:51.748 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.108 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.222 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.222 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.224 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:00:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:00:52 np0005541913.localdomain podman[306987]: 2025-12-02 10:00:52.423549144 +0000 UTC m=+0.061704087 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 02 10:00:52 np0005541913.localdomain systemd[1]: tmp-crun.9ecj9X.mount: Deactivated successfully.
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.494 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:00:52 np0005541913.localdomain podman[306986]: 2025-12-02 10:00:52.494844358 +0000 UTC m=+0.134100840 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.494 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.495 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.495 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.495 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:00:52 np0005541913.localdomain podman[306986]: 2025-12-02 10:00:52.502551495 +0000 UTC m=+0.141807997 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:00:52 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:00:52 np0005541913.localdomain podman[306987]: 2025-12-02 10:00:52.528574743 +0000 UTC m=+0.166729686 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:00:52 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:00:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:00:52 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2791761851' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.907 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:00:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2791761851' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.968 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:00:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:52.969 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.188 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.190 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11689MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.190 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.191 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.246 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.247 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.247 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.281 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:00:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:00:53 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2694848661' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.703 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.709 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.750 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.753 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.754 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:00:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:53.912 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:53 np0005541913.localdomain ceph-mon[298296]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:53 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2694848661' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:00:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:54.231 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:55.358 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:55.359 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:55.359 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:55.359 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:00:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:55.824 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:00:55 np0005541913.localdomain ceph-mon[298296]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:58 np0005541913.localdomain ceph-mon[298296]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:00:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:58.942 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:00:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:00:59.234 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.656126) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669660656245, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 955, "num_deletes": 256, "total_data_size": 1794692, "memory_usage": 1818592, "flush_reason": "Manual Compaction"}
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669660665488, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1174858, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17619, "largest_seqno": 18569, "table_properties": {"data_size": 1170557, "index_size": 1964, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10294, "raw_average_key_size": 20, "raw_value_size": 1161498, "raw_average_value_size": 2290, "num_data_blocks": 82, "num_entries": 507, "num_filter_entries": 507, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669610, "oldest_key_time": 1764669610, "file_creation_time": 1764669660, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 9407 microseconds, and 3693 cpu microseconds.
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.665550) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1174858 bytes OK
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.665576) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.667151) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.667168) EVENT_LOG_v1 {"time_micros": 1764669660667163, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.667192) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1789695, prev total WAL file size 1790019, number of live WAL files 2.
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.667794) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373637' seq:72057594037927935, type:22 .. '6C6F676D0034303139' seq:0, type:0; will stop at (end)
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1147KB)], [27(16MB)]
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669660667835, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18422250, "oldest_snapshot_seqno": -1}
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11893 keys, 18282442 bytes, temperature: kUnknown
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669660776250, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 18282442, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18215227, "index_size": 36394, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29765, "raw_key_size": 320650, "raw_average_key_size": 26, "raw_value_size": 18012802, "raw_average_value_size": 1514, "num_data_blocks": 1380, "num_entries": 11893, "num_filter_entries": 11893, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764669660, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.776726) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 18282442 bytes
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.778207) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 169.7 rd, 168.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 16.4 +0.0 blob) out(17.4 +0.0 blob), read-write-amplify(31.2) write-amplify(15.6) OK, records in: 12430, records dropped: 537 output_compression: NoCompression
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.778230) EVENT_LOG_v1 {"time_micros": 1764669660778218, "job": 14, "event": "compaction_finished", "compaction_time_micros": 108548, "compaction_time_cpu_micros": 35624, "output_level": 6, "num_output_files": 1, "total_output_size": 18282442, "num_input_records": 12430, "num_output_records": 11893, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669660778650, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669660781591, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.667700) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.781691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.781698) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.781701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.781703) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:01:00.781705) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:01:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:01 np0005541913.localdomain CROND[307078]: (root) CMD (run-parts /etc/cron.hourly)
Dec 02 10:01:01 np0005541913.localdomain run-parts[307081]: (/etc/cron.hourly) starting 0anacron
Dec 02 10:01:01 np0005541913.localdomain run-parts[307087]: (/etc/cron.hourly) finished 0anacron
Dec 02 10:01:01 np0005541913.localdomain CROND[307077]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 02 10:01:01 np0005541913.localdomain ceph-mon[298296]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:01:03.044 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:01:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:01:03.045 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:01:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:01:03.045 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:01:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:03.981 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:04 np0005541913.localdomain ceph-mon[298296]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:01:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:01:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:01:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:01:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:01:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:04.236 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/457788740' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:01:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/457788740' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:01:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:01:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:01:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:01:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:01:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:01:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18729 "" "Go-http-client/1.1"
Dec 02 10:01:06 np0005541913.localdomain ceph-mon[298296]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/27386934' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/198510701' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:01:06 np0005541913.localdomain podman[307088]: 2025-12-02 10:01:06.451131008 +0000 UTC m=+0.087718884 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:01:06 np0005541913.localdomain podman[307088]: 2025-12-02 10:01:06.463853209 +0000 UTC m=+0.100441085 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:01:06 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:01:08 np0005541913.localdomain ceph-mon[298296]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1910608220' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2730124847' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:09.016 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:09.240 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:10 np0005541913.localdomain ceph-mon[298296]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:01:11 np0005541913.localdomain podman[307107]: 2025-12-02 10:01:11.437812024 +0000 UTC m=+0.081693893 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 10:01:11 np0005541913.localdomain podman[307107]: 2025-12-02 10:01:11.472165506 +0000 UTC m=+0.116047365 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 02 10:01:11 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:01:11 np0005541913.localdomain ceph-mon[298296]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:01:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:01:13 np0005541913.localdomain podman[307127]: 2025-12-02 10:01:13.443925227 +0000 UTC m=+0.087939530 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Dec 02 10:01:13 np0005541913.localdomain podman[307127]: 2025-12-02 10:01:13.456125814 +0000 UTC m=+0.100140167 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Dec 02 10:01:13 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:01:13 np0005541913.localdomain podman[307128]: 2025-12-02 10:01:13.545808111 +0000 UTC m=+0.186977688 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:01:13 np0005541913.localdomain podman[307128]: 2025-12-02 10:01:13.578370255 +0000 UTC m=+0.219539842 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:01:13 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:01:13 np0005541913.localdomain ceph-mon[298296]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:13 np0005541913.localdomain sudo[307170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:01:13 np0005541913.localdomain sudo[307170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:01:13 np0005541913.localdomain sudo[307170]: pam_unix(sudo:session): session closed for user root
Dec 02 10:01:14 np0005541913.localdomain sudo[307188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 10:01:14 np0005541913.localdomain sudo[307188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:01:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:14.060 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:14.243 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:14 np0005541913.localdomain sudo[307188]: pam_unix(sudo:session): session closed for user root
Dec 02 10:01:14 np0005541913.localdomain sudo[307227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:01:14 np0005541913.localdomain sudo[307227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:01:14 np0005541913.localdomain sudo[307227]: pam_unix(sudo:session): session closed for user root
Dec 02 10:01:14 np0005541913.localdomain sudo[307245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:01:14 np0005541913.localdomain sudo[307245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:01:15 np0005541913.localdomain sudo[307245]: pam_unix(sudo:session): session closed for user root
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:01:15 np0005541913.localdomain sudo[307296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:01:15 np0005541913.localdomain sudo[307296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:01:15 np0005541913.localdomain sudo[307296]: pam_unix(sudo:session): session closed for user root
Dec 02 10:01:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:16 np0005541913.localdomain ceph-mon[298296]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:01:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:01:17 np0005541913.localdomain ceph-mon[298296]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:01:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:19.104 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:19.245 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:19 np0005541913.localdomain ceph-mon[298296]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:01:20 np0005541913.localdomain podman[307314]: 2025-12-02 10:01:20.503814795 +0000 UTC m=+0.127712187 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 02 10:01:20 np0005541913.localdomain podman[307314]: 2025-12-02 10:01:20.520176445 +0000 UTC m=+0.144073837 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:01:20 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:01:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:21 np0005541913.localdomain ceph-mon[298296]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:01:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:01:23 np0005541913.localdomain systemd[1]: tmp-crun.19rXuy.mount: Deactivated successfully.
Dec 02 10:01:23 np0005541913.localdomain podman[307333]: 2025-12-02 10:01:23.45201892 +0000 UTC m=+0.089768470 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:01:23 np0005541913.localdomain podman[307333]: 2025-12-02 10:01:23.486083064 +0000 UTC m=+0.123832644 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:01:23 np0005541913.localdomain systemd[1]: tmp-crun.hEZ3O8.mount: Deactivated successfully.
Dec 02 10:01:23 np0005541913.localdomain podman[307334]: 2025-12-02 10:01:23.499225276 +0000 UTC m=+0.134391008 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:01:23 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:01:23 np0005541913.localdomain podman[307334]: 2025-12-02 10:01:23.567139839 +0000 UTC m=+0.202305571 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 10:01:23 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:01:23 np0005541913.localdomain ceph-mon[298296]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:24.107 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:24.247 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:25 np0005541913.localdomain ceph-mon[298296]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:27.141 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:01:27.140 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:01:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:01:27.142 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:01:27 np0005541913.localdomain ceph-mon[298296]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:29.154 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:29.251 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:30 np0005541913.localdomain ceph-mon[298296]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:31 np0005541913.localdomain ceph-mon[298296]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:33 np0005541913.localdomain ceph-mon[298296]: pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:01:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:01:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:01:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:01:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:01:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:01:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:34.156 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:34.255 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:36 np0005541913.localdomain ceph-mon[298296]: pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:01:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:01:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:01:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:01:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:01:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18728 "" "Go-http-client/1.1"
Dec 02 10:01:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:01:36.143 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:01:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:01:37 np0005541913.localdomain podman[307383]: 2025-12-02 10:01:37.437302408 +0000 UTC m=+0.079478813 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Dec 02 10:01:37 np0005541913.localdomain podman[307383]: 2025-12-02 10:01:37.452080995 +0000 UTC m=+0.094257390 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 10:01:37 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:01:38 np0005541913.localdomain ceph-mon[298296]: pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:39.187 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:39.257 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:40 np0005541913.localdomain ceph-mon[298296]: pgmap v49: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Dec 02 10:01:40 np0005541913.localdomain ceph-mon[298296]: mgrmap e43: np0005541914.lljzmk(active, since 92s), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:01:40 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:41 np0005541913.localdomain ceph-mon[298296]: pgmap v50: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Dec 02 10:01:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:01:42 np0005541913.localdomain podman[307402]: 2025-12-02 10:01:42.441899215 +0000 UTC m=+0.078938720 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:01:42 np0005541913.localdomain podman[307402]: 2025-12-02 10:01:42.477095389 +0000 UTC m=+0.114134884 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:01:42 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:01:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:43.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:44 np0005541913.localdomain ceph-mon[298296]: pgmap v51: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Dec 02 10:01:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:44.227 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:44.259 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:01:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:01:44 np0005541913.localdomain systemd[1]: tmp-crun.BrzE66.mount: Deactivated successfully.
Dec 02 10:01:44 np0005541913.localdomain podman[307420]: 2025-12-02 10:01:44.452076466 +0000 UTC m=+0.094490857 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Dec 02 10:01:44 np0005541913.localdomain podman[307421]: 2025-12-02 10:01:44.500999509 +0000 UTC m=+0.135982660 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:01:44 np0005541913.localdomain podman[307420]: 2025-12-02 10:01:44.520174683 +0000 UTC m=+0.162589064 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, vcs-type=git, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:01:44 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:01:44 np0005541913.localdomain podman[307421]: 2025-12-02 10:01:44.536666256 +0000 UTC m=+0.171649397 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:01:44 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:01:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:46 np0005541913.localdomain ceph-mon[298296]: pgmap v52: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Dec 02 10:01:48 np0005541913.localdomain ceph-mon[298296]: pgmap v53: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Dec 02 10:01:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:49.261 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:01:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:49.263 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:01:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:49.263 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:01:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:49.263 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:01:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:49.264 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:49.264 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:01:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:49.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:49.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:01:50 np0005541913.localdomain ceph-mon[298296]: pgmap v54: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Dec 02 10:01:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:01:51 np0005541913.localdomain podman[307463]: 2025-12-02 10:01:51.437716245 +0000 UTC m=+0.081133208 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:01:51 np0005541913.localdomain podman[307463]: 2025-12-02 10:01:51.4740307 +0000 UTC m=+0.117447623 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd)
Dec 02 10:01:51 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:01:51 np0005541913.localdomain ceph-mon[298296]: pgmap v55: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:51.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:51.933 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:51.952 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:01:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:51.953 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:01:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:51.953 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:01:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:51.954 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:01:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:51.954 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:01:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:01:52 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3800101711' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:52.374 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:01:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3800101711' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.534 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.535 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:01:53 np0005541913.localdomain ceph-mon[298296]: pgmap v56: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.762 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.763 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11703MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.764 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.764 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.830 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.830 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.831 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:01:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:53.874 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.266 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.268 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.269 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.269 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.297 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.298 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:01:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:01:54 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4063615434' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.338 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:01:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.345 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:01:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.367 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.370 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:01:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:54.370 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:01:54 np0005541913.localdomain podman[307529]: 2025-12-02 10:01:54.446639899 +0000 UTC m=+0.085572968 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 10:01:54 np0005541913.localdomain systemd[1]: tmp-crun.S18yD0.mount: Deactivated successfully.
Dec 02 10:01:54 np0005541913.localdomain podman[307528]: 2025-12-02 10:01:54.530049356 +0000 UTC m=+0.171914574 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 10:01:54 np0005541913.localdomain podman[307529]: 2025-12-02 10:01:54.540327342 +0000 UTC m=+0.179260401 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 02 10:01:54 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:01:54 np0005541913.localdomain podman[307528]: 2025-12-02 10:01:54.567921773 +0000 UTC m=+0.209786961 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:01:54 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:01:54 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/4063615434' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.265 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.266 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.266 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.408 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.409 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.409 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.410 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:01:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.809 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.825 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.826 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:55.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e92 e92: 6 total, 6 up, 6 in
Dec 02 10:01:55 np0005541913.localdomain ceph-mon[298296]: pgmap v57: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:56.385 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:56.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:01:56 np0005541913.localdomain ceph-mon[298296]: osdmap e92: 6 total, 6 up, 6 in
Dec 02 10:01:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 e93: 6 total, 6 up, 6 in
Dec 02 10:01:57 np0005541913.localdomain ceph-mon[298296]: pgmap v59: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:01:59 np0005541913.localdomain ceph-mon[298296]: osdmap e93: 6 total, 6 up, 6 in
Dec 02 10:01:59 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/118532377' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:01:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:59.298 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:01:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:59.300 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:01:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:59.300 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:01:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:59.301 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:01:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:59.334 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:01:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:01:59.336 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:00 np0005541913.localdomain ceph-mon[298296]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 5.1 MiB/s wr, 41 op/s
Dec 02 10:02:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3891973243' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:02:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:02:00Z|00081|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Dec 02 10:02:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1269445713' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:02:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2528476716' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:02:02 np0005541913.localdomain ceph-mon[298296]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 5.1 MiB/s wr, 41 op/s
Dec 02 10:02:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:03.046 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:02:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:03.046 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:02:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:03.047 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:02:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:02:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:02:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:02:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:02:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:02:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:02:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:02:04 np0005541913.localdomain ceph-mon[298296]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Dec 02 10:02:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:04.337 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:04.339 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:04.340 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:02:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:04.340 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:04.373 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:04.374 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4056391954' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:02:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4056391954' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:02:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:02:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:02:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:02:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:02:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:02:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18736 "" "Go-http-client/1.1"
Dec 02 10:02:06 np0005541913.localdomain ceph-mon[298296]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 4.5 MiB/s wr, 42 op/s
Dec 02 10:02:08 np0005541913.localdomain ceph-mon[298296]: pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 4.1 MiB/s wr, 38 op/s
Dec 02 10:02:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:02:08 np0005541913.localdomain systemd[1]: tmp-crun.eaMzie.mount: Deactivated successfully.
Dec 02 10:02:08 np0005541913.localdomain podman[307574]: 2025-12-02 10:02:08.440300715 +0000 UTC m=+0.082506825 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute)
Dec 02 10:02:08 np0005541913.localdomain podman[307574]: 2025-12-02 10:02:08.47812027 +0000 UTC m=+0.120326380 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:02:08 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:02:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:09.375 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:09.378 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:09.378 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:02:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:09.378 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:09.404 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:09.404 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:10 np0005541913.localdomain ceph-mon[298296]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 3.0 KiB/s rd, 465 B/s wr, 4 op/s
Dec 02 10:02:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:11 np0005541913.localdomain ceph-mon[298296]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 2.7 KiB/s rd, 426 B/s wr, 3 op/s
Dec 02 10:02:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:02:13 np0005541913.localdomain podman[307593]: 2025-12-02 10:02:13.440327169 +0000 UTC m=+0.082094255 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:02:13 np0005541913.localdomain podman[307593]: 2025-12-02 10:02:13.446419262 +0000 UTC m=+0.088185948 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:02:13 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:02:13 np0005541913.localdomain ceph-mon[298296]: pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 2.7 KiB/s rd, 426 B/s wr, 3 op/s
Dec 02 10:02:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:14.406 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:14.440 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:14.440 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:02:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:14.441 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:14.442 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:14.443 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:02:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:02:15 np0005541913.localdomain systemd[1]: tmp-crun.kUMfUb.mount: Deactivated successfully.
Dec 02 10:02:15 np0005541913.localdomain systemd[299560]: Created slice User Background Tasks Slice.
Dec 02 10:02:15 np0005541913.localdomain podman[307611]: 2025-12-02 10:02:15.443060092 +0000 UTC m=+0.086181164 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container)
Dec 02 10:02:15 np0005541913.localdomain systemd[299560]: Starting Cleanup of User's Temporary Files and Directories...
Dec 02 10:02:15 np0005541913.localdomain systemd[299560]: Finished Cleanup of User's Temporary Files and Directories.
Dec 02 10:02:15 np0005541913.localdomain podman[307611]: 2025-12-02 10:02:15.480669171 +0000 UTC m=+0.123790233 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container)
Dec 02 10:02:15 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:02:15 np0005541913.localdomain podman[307612]: 2025-12-02 10:02:15.499382563 +0000 UTC m=+0.137813439 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:02:15 np0005541913.localdomain podman[307612]: 2025-12-02 10:02:15.51195543 +0000 UTC m=+0.150386346 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:02:15 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:02:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:15 np0005541913.localdomain sudo[307654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:02:15 np0005541913.localdomain sudo[307654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:02:15 np0005541913.localdomain sudo[307654]: pam_unix(sudo:session): session closed for user root
Dec 02 10:02:15 np0005541913.localdomain sudo[307672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:02:15 np0005541913.localdomain sudo[307672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:02:15 np0005541913.localdomain ceph-mon[298296]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.104 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.131 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.132 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e09cd9d6-f618-4677-840e-0c564a65a7fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.105587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb2f7006-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': 'a255f9f12c5a0d5d5d6d8e70dc0dd9a27e22e48adef1d4576cc16e604253d5fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.105587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb2f84f6-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': 'fcc4db09da525c7969000476afa1dfb0e5f872695be940a07b39d99fb7d685c8'}]}, 'timestamp': '2025-12-02 10:02:16.132963', '_unique_id': '35abaf336e7949859ac710d9d3b00035'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.134 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.145 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.146 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48d5b941-7d93-463b-b3d3-6c763461bfe1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.135967', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb319cdc-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.35517945, 'message_signature': 'a2c27be467b98a2f4f1cb8c9ca9b0c3e14f01f1fa1a838574bdf22a6a989ebe6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.135967', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb31b294-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.35517945, 'message_signature': '150d9b4dc378a76604de3c0e8ee65f8b98a75e76e1a5306a09d6d8a506d072c6'}]}, 'timestamp': '2025-12-02 10:02:16.147244', '_unique_id': '05c587ffa9f34bf899599c5ebd14e0d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.150 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.150 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31c81874-3a72-48c7-8b58-4d003ec1ee75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.150188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb32389a-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.35517945, 'message_signature': 'bd34cc6d0f01e0e390ea760fa6f8f05a91e211757d92fd00edb1d4ad35a85b15'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.150188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb324be6-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.35517945, 'message_signature': '1ab8f7ded5d2906493500f3200aef7bf927abd96e5af9174b9b83b16adbbb2a5'}]}, 'timestamp': '2025-12-02 10:02:16.151159', '_unique_id': '0187e113435144d583b785f07c2c5dc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.156 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd8b5d80-ab12-40c5-9e9d-71344ebe4d0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.153568', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb333c7c-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': 'b42488375a3cf2cfc4f6eb342a8c5af0a8fa3cce02470045268914d7fe9a8dd2'}]}, 'timestamp': '2025-12-02 10:02:16.157358', '_unique_id': '9c01dfae69f64c2485c432d55bfd3898'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.158 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.159 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a5fa7ab-2824-44fe-bd86-e6030b7383be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.159846', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb33b18e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': 'bbf5d605c34ae673371f2dc8a36b4b4d6af4f46e692bdc569490e4d7dc064e2e'}]}, 'timestamp': '2025-12-02 10:02:16.160347', '_unique_id': '5614f0ddd9e8443a95c7d6667d900482'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.162 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89d4e847-9cbf-42fd-9c44-86801242fee7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.162673', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb3421c8-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': 'f983b72d0bde01fede380afae2cca0f6a69dcce1d624cd861595dd3e342124c8'}]}, 'timestamp': '2025-12-02 10:02:16.163293', '_unique_id': '8f5ebf26a7294791adf5d390e56bed2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.164 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.165 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16990482-79a0-4839-af96-e22ba7e6d4ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.165732', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb34973e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': '97b23688c201819248b40f15ad130e6df85ffe6a798227ae57c69e6d5611cce1'}]}, 'timestamp': '2025-12-02 10:02:16.166244', '_unique_id': '114cc659dbfd4430aaa3faf08fa70f87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.167 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.168 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.169 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95654a20-8098-4207-af30-94ffe5579822', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.168685', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb350ade-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.35517945, 'message_signature': '0cd9d5fdd38605f0ea6442e328b36fc6d255cc8bfe3f449f5e30de1f3531e726'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.168685', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb351c5e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.35517945, 'message_signature': 'c564dc5cf7c7618cfe82cca23340fa5df2bfb42a59df9badb415f2125a0f665b'}]}, 'timestamp': '2025-12-02 10:02:16.169600', '_unique_id': 'f9c4fef0d4b440cfaa3b01cf0d779fd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.172 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.172 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8df9bc96-6126-4fce-b675-889bd2d3a6bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.172160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb359242-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': '67813b00a37f99c2d3dacfdd5ae7e06f8e7941979290978469d0e6675d36f6c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.172160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb35a502-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': 'b594c9b897b79d4d33a90e1ab3c079fd49fd0eaacd59c944c31c614337bbacd4'}]}, 'timestamp': '2025-12-02 10:02:16.173146', '_unique_id': 'b7af9c92e3c4470bb4ae73c0688fab82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.175 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.190 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 15810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbc15d08-a227-4c1d-8d88-f10d4e8f25ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15810000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:02:16.175559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'fb386fda-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.409762965, 'message_signature': '65d7aa6fafc56e8d4405eb0f90e0ec83111049f9111d096d3e22dd45e3589cc2'}]}, 'timestamp': '2025-12-02 10:02:16.191474', '_unique_id': '6ec5cc9e618f4b988659019f9ade9774'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14dfe020-cab9-4234-8e1c-3a5bd09b4029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.194211', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb38ef1e-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': 'aa4310bcd396e46247434653c86713a4e29cc08a9bc41daee9678d2b61a7faa6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.194211', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb3903e6-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': 'e48601afaf4e7abc9e9103d480d79be0eaa35d601202d0dc79f55f86439c5e92'}]}, 'timestamp': '2025-12-02 10:02:16.195253', '_unique_id': 'ceea1c433d82438da46af64cd11eaee8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.197 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7b00a3e-7f15-4a82-8582-1dd50d4ceb68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.197811', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb397c36-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': '9c987d7d56f671c730eb7300e9f496bfc013a9ea0bece8fda058150e0db542bb'}]}, 'timestamp': '2025-12-02 10:02:16.198298', '_unique_id': '10a88d4c45ac44638eee5c25c34400d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.200 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.201 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.201 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'add1ca88-7341-4e29-a884-bbc96ca6b6e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.201191', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb3a02b4-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': 'e2639c7d2d5b93af0c00faaf7968a77853de251ea1a35ea4d0afc49450e53933'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.201191', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb3a18f8-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': 'fa60656bdb22c9081dc711b7cfd899a8ab1834da4cb28387c64220f9ebfb51fe'}]}, 'timestamp': '2025-12-02 10:02:16.202270', '_unique_id': '882db7b3046e49dfae26b8ebbb73fc7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.204 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.204 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4a7f6b9-5ab7-4c39-a612-de4a749dedfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.204889', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb3a90b2-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': '94724d8d1d4f2971d476c905f865114fdb4df940b6baf6577a421be09e1d46f8'}]}, 'timestamp': '2025-12-02 10:02:16.205360', '_unique_id': '93b7bd9ea36b4e2c9ea034638a0b7193'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.206 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.207 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '410da770-f039-468d-912b-bc930278c44e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.207765', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb3b0420-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': 'b15b88eeff625dde81bf44136fab72a4e8dde484277fd205ecbf025b48369b5e'}]}, 'timestamp': '2025-12-02 10:02:16.208318', '_unique_id': 'c91350ba56ae49d48e9bfa0c9f5a7b56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.209 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.210 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.210 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43b97fb2-44ad-4d72-b531-e8e06167b43c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:02:16.210721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fb3b740a-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.409762965, 'message_signature': '3e7fd3dabd6bfe7610a782d810676958d1a204f24a7e53a7db89e08a7fe8fbc2'}]}, 'timestamp': '2025-12-02 10:02:16.211166', '_unique_id': '556521fdfb3d458c8e6a7114d4166b43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.213 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6dbb797-4663-4dfd-ab5e-2f87f52aa2a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.213338', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb3bda12-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': '601c0a785ff034b4a3cd06d58fcb480508c1155efb575ed55a1dd494849ba054'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.213338', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb3beb88-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': '5631e2e7e913bbd4027e30ddcb462fea579bc6a5284c3ccd387ea973857f0e72'}]}, 'timestamp': '2025-12-02 10:02:16.214207', '_unique_id': 'a7edec14717d4d668a2135a1dc8b4588'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.216 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '055ab8dd-3e8d-4a9e-851d-7d7407b70911', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.216583', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb3c5a96-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': '0631d6dc7d10a3f22e9d5d95994cfffce26aa67e73a5849a1c9564bd03c2dbc0'}]}, 'timestamp': '2025-12-02 10:02:16.217079', '_unique_id': '6ea71566d682449ab0534212fd8a452f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '785c7af2-9ddb-43e0-9a06-5873d5551a2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.220234', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb3ceed4-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': 'edecdcced86536e860068dca218419ca5ce8a3acbd4c131f076007b6819b3160'}]}, 'timestamp': '2025-12-02 10:02:16.221016', '_unique_id': 'b0d70c97d35b4b0db4e3dc06918d16b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2eab6219-524b-42ad-b907-f22d5bebe898', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:02:16.223699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fb3d6c24-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': '71aa76c78f2fcb2e54552428264edd8a0624fed02dfa91d67a6f50e864c7e1f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:02:16.223699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fb3d76d8-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.324695962, 'message_signature': 'c15da7fe6216236c638e222f7321c91b12436c8e54e21978fcdf702c3585b22c'}]}, 'timestamp': '2025-12-02 10:02:16.224282', '_unique_id': '86eb41953eca439da6581be9f15642da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.224 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4658116c-2d11-4c2c-b1f1-cc1ff6674e09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:02:16.225778', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'fb3dbd78-cf65-11f0-a0da-fa163e3f40cc', 'monotonic_time': 11898.37270087, 'message_signature': '7d634b636f180a9cf526b5148fe046bd39f4a0790de9352441a5ce1d2ab52721'}]}, 'timestamp': '2025-12-02 10:02:16.226087', '_unique_id': 'e31c3f4c6cce46f59f39125d0dcf7a99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.227 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:02:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:02:16.227 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:02:16 np0005541913.localdomain sudo[307672]: pam_unix(sudo:session): session closed for user root
Dec 02 10:02:16 np0005541913.localdomain sudo[307723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:02:16 np0005541913.localdomain sudo[307723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:02:16 np0005541913.localdomain sudo[307723]: pam_unix(sudo:session): session closed for user root
Dec 02 10:02:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:02:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:02:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:02:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:02:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:02:17 np0005541913.localdomain ceph-mon[298296]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:19.443 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:19.446 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:19.446 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:02:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:19.446 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:19.481 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:19.482 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:20 np0005541913.localdomain ceph-mon[298296]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:21 np0005541913.localdomain ceph-mon[298296]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:02:22 np0005541913.localdomain podman[307741]: 2025-12-02 10:02:22.450791611 +0000 UTC m=+0.086651137 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:02:22 np0005541913.localdomain podman[307741]: 2025-12-02 10:02:22.488284617 +0000 UTC m=+0.124144163 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:02:22 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:02:23 np0005541913.localdomain ceph-mon[298296]: pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:24.483 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:24.485 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:24.485 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:02:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:24.485 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:24.523 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:24.523 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:02:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:02:25 np0005541913.localdomain systemd[1]: tmp-crun.5sdv63.mount: Deactivated successfully.
Dec 02 10:02:25 np0005541913.localdomain podman[307760]: 2025-12-02 10:02:25.458909602 +0000 UTC m=+0.097407255 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:02:25 np0005541913.localdomain podman[307760]: 2025-12-02 10:02:25.498195947 +0000 UTC m=+0.136693660 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:02:25 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:02:25 np0005541913.localdomain podman[307761]: 2025-12-02 10:02:25.503361555 +0000 UTC m=+0.138444236 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:02:25 np0005541913.localdomain podman[307761]: 2025-12-02 10:02:25.584179144 +0000 UTC m=+0.219261785 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:02:25 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:02:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:26 np0005541913.localdomain ceph-mon[298296]: pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:28 np0005541913.localdomain ceph-mon[298296]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:28.872 281858 DEBUG oslo_concurrency.processutils [None req-23272247-f378-467d-94da-4db8e7b85b30 a61e36d3a8ec4e5abb065f4dc9d19030 bacffdfceba742b2a0f3443d4df622d9 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:02:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:28.892 281858 DEBUG oslo_concurrency.processutils [None req-23272247-f378-467d-94da-4db8e7b85b30 a61e36d3a8ec4e5abb065f4dc9d19030 bacffdfceba742b2a0f3443d4df622d9 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:02:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:29.038 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:29 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:29.038 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:02:29 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:29.040 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:02:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:29.570 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:30 np0005541913.localdomain ceph-mon[298296]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:31 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:31.042 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:02:31 np0005541913.localdomain ceph-mon[298296]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:33 np0005541913.localdomain ceph-mon[298296]: pgmap v78: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:02:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:02:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:02:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:02:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:02:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:02:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:02:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:02:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:34.572 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:34.575 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:35 np0005541913.localdomain ceph-mon[298296]: pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:02:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:02:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:02:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:02:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:02:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18739 "" "Go-http-client/1.1"
Dec 02 10:02:38 np0005541913.localdomain ceph-mon[298296]: pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:38.956 263406 INFO oslo.privsep.daemon [None req-581de5e2-9df3-4050-9744-5ce3f59e019f - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpf9vaplw0/privsep.sock']
Dec 02 10:02:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:02:39 np0005541913.localdomain podman[307813]: 2025-12-02 10:02:39.163413996 +0000 UTC m=+0.083261505 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:02:39 np0005541913.localdomain podman[307813]: 2025-12-02 10:02:39.173681892 +0000 UTC m=+0.093529391 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 02 10:02:39 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:02:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:39.547 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:39.554 263406 INFO oslo.privsep.daemon [None req-581de5e2-9df3-4050-9744-5ce3f59e019f - - - - - -] Spawned new privsep daemon via rootwrap
Dec 02 10:02:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:39.449 307833 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 10:02:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:39.454 307833 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 10:02:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:39.458 307833 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 02 10:02:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:39.458 307833 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307833
Dec 02 10:02:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:39.574 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:40 np0005541913.localdomain ceph-mon[298296]: pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:40.108 263406 INFO oslo.privsep.daemon [None req-581de5e2-9df3-4050-9744-5ce3f59e019f - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpop6amzyq/privsep.sock']
Dec 02 10:02:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:40.781 263406 INFO oslo.privsep.daemon [None req-581de5e2-9df3-4050-9744-5ce3f59e019f - - - - - -] Spawned new privsep daemon via rootwrap
Dec 02 10:02:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:40.671 307842 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 10:02:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:40.676 307842 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 10:02:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:40.680 307842 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 02 10:02:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:40.680 307842 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307842
Dec 02 10:02:40 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:41 np0005541913.localdomain ceph-mon[298296]: pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:41.845 263406 INFO oslo.privsep.daemon [None req-581de5e2-9df3-4050-9744-5ce3f59e019f - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpzogwehhu/privsep.sock']
Dec 02 10:02:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:42.456 263406 INFO oslo.privsep.daemon [None req-581de5e2-9df3-4050-9744-5ce3f59e019f - - - - - -] Spawned new privsep daemon via rootwrap
Dec 02 10:02:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:42.367 307854 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 10:02:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:42.372 307854 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 10:02:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:42.375 307854 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 02 10:02:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:42.376 307854 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307854
Dec 02 10:02:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:42.592 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:43.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:43.856 263406 INFO neutron.agent.linux.ip_lib [None req-581de5e2-9df3-4050-9744-5ce3f59e019f - - - - - -] Device tapfbe9f539-2c cannot be used as it has no MAC address
Dec 02 10:02:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:43.932 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:43 np0005541913.localdomain kernel: device tapfbe9f539-2c entered promiscuous mode
Dec 02 10:02:43 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669763.9440] manager: (tapfbe9f539-2c): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 02 10:02:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:02:43Z|00082|binding|INFO|Claiming lport fbe9f539-2caa-4225-b0aa-ee0756eec0f0 for this chassis.
Dec 02 10:02:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:43.945 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:02:43Z|00083|binding|INFO|fbe9f539-2caa-4225-b0aa-ee0756eec0f0: Claiming unknown
Dec 02 10:02:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:02:43Z|00084|binding|INFO|Setting lport fbe9f539-2caa-4225-b0aa-ee0756eec0f0 ovn-installed in OVS
Dec 02 10:02:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:43.954 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:43 np0005541913.localdomain systemd-udevd[307869]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:02:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:43.956 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:43 np0005541913.localdomain virtnodedevd[230136]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 02 10:02:43 np0005541913.localdomain virtnodedevd[230136]: hostname: np0005541913.localdomain
Dec 02 10:02:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapfbe9f539-2c: No such device
Dec 02 10:02:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapfbe9f539-2c: No such device
Dec 02 10:02:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:43.982 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:02:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapfbe9f539-2c: No such device
Dec 02 10:02:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapfbe9f539-2c: No such device
Dec 02 10:02:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapfbe9f539-2c: No such device
Dec 02 10:02:43 np0005541913.localdomain ceph-mon[298296]: pgmap v83: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapfbe9f539-2c: No such device
Dec 02 10:02:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapfbe9f539-2c: No such device
Dec 02 10:02:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapfbe9f539-2c: No such device
Dec 02 10:02:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:44.017 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:44.047 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:44 np0005541913.localdomain podman[307878]: 2025-12-02 10:02:44.1187379 +0000 UTC m=+0.120710900 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 02 10:02:44 np0005541913.localdomain podman[307878]: 2025-12-02 10:02:44.149524576 +0000 UTC m=+0.151497526 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:02:44 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:02:44 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:02:44Z|00085|binding|INFO|Setting lport fbe9f539-2caa-4225-b0aa-ee0756eec0f0 up in Southbound
Dec 02 10:02:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:44.261 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccbafb2e3c343b2aab51714734bddce', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c5273a4-e474-4c2c-a95a-a522e1a174bd, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=fbe9f539-2caa-4225-b0aa-ee0756eec0f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:02:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:44.264 160221 INFO neutron.agent.ovn.metadata.agent [-] Port fbe9f539-2caa-4225-b0aa-ee0756eec0f0 in datapath 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f bound to our chassis
Dec 02 10:02:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:44.268 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port b22990f2-0db4-407c-a5b6-65e7991152d1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:02:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:44.268 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:02:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:02:44.270 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6d383b1e-7432-428b-b66d-b227847827ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:02:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:44.615 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:45 np0005541913.localdomain podman[307959]: 
Dec 02 10:02:45 np0005541913.localdomain podman[307959]: 2025-12-02 10:02:45.279431906 +0000 UTC m=+0.083835790 container create 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:02:45 np0005541913.localdomain systemd[1]: Started libpod-conmon-2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2.scope.
Dec 02 10:02:45 np0005541913.localdomain podman[307959]: 2025-12-02 10:02:45.240516082 +0000 UTC m=+0.044920026 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:02:45 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:02:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93ad5d2b9af04d633613c8f460d48e56923a84b4e7f2b732ec5f908e2b44d433/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:02:45 np0005541913.localdomain podman[307959]: 2025-12-02 10:02:45.356396792 +0000 UTC m=+0.160800706 container init 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:02:45 np0005541913.localdomain podman[307959]: 2025-12-02 10:02:45.36602938 +0000 UTC m=+0.170433284 container start 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:02:45 np0005541913.localdomain dnsmasq[307978]: started, version 2.85 cachesize 150
Dec 02 10:02:45 np0005541913.localdomain dnsmasq[307978]: DNS service limited to local subnets
Dec 02 10:02:45 np0005541913.localdomain dnsmasq[307978]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:02:45 np0005541913.localdomain dnsmasq[307978]: warning: no upstream servers configured
Dec 02 10:02:45 np0005541913.localdomain dnsmasq-dhcp[307978]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:02:45 np0005541913.localdomain dnsmasq[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/addn_hosts - 0 addresses
Dec 02 10:02:45 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/host
Dec 02 10:02:45 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/opts
Dec 02 10:02:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:45.436 263406 INFO neutron.agent.dhcp.agent [None req-8bb7a97c-e972-4f91-946e-ce71eb4abfe8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:02:40Z, description=, device_id=f7309812-362b-4bd1-84da-e909158b6cbe, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b58a30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b589d0>], id=7878efd7-90f1-41dd-b669-949b08145e13, ip_allocation=immediate, mac_address=fa:16:3e:fd:f0:c0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:02:35Z, description=, dns_domain=, id=62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-307256986-network, port_security_enabled=True, project_id=cccbafb2e3c343b2aab51714734bddce, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28433, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=182, status=ACTIVE, subnets=['9bd66995-30b3-4c53-b58b-ce3f8d5848fa'], tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, updated_at=2025-12-02T10:02:37Z, vlan_transparent=None, network_id=62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, port_security_enabled=False, project_id=cccbafb2e3c343b2aab51714734bddce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=190, status=DOWN, tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, updated_at=2025-12-02T10:02:40Z on network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f
Dec 02 10:02:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:45.552 263406 INFO neutron.agent.dhcp.agent [None req-97c34d3b-174e-4349-83a3-aca6766d33fe - - - - - -] DHCP configuration for ports {'ea045be8-e121-4ff5-bb82-2a757b7ce736'} is completed
Dec 02 10:02:45 np0005541913.localdomain dnsmasq[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/addn_hosts - 1 addresses
Dec 02 10:02:45 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/host
Dec 02 10:02:45 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/opts
Dec 02 10:02:45 np0005541913.localdomain podman[307996]: 2025-12-02 10:02:45.645504 +0000 UTC m=+0.053117386 container kill 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:02:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:45.762 263406 INFO neutron.agent.dhcp.agent [None req-f8296964-90c5-493b-9fe6-7a4ac869da28 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:02:40Z, description=, device_id=f7309812-362b-4bd1-84da-e909158b6cbe, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b36880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b36670>], id=7878efd7-90f1-41dd-b669-949b08145e13, ip_allocation=immediate, mac_address=fa:16:3e:fd:f0:c0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:02:35Z, description=, dns_domain=, id=62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-307256986-network, port_security_enabled=True, project_id=cccbafb2e3c343b2aab51714734bddce, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28433, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=182, status=ACTIVE, subnets=['9bd66995-30b3-4c53-b58b-ce3f8d5848fa'], tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, updated_at=2025-12-02T10:02:37Z, vlan_transparent=None, network_id=62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, port_security_enabled=False, project_id=cccbafb2e3c343b2aab51714734bddce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=190, status=DOWN, tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, updated_at=2025-12-02T10:02:40Z on network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f
Dec 02 10:02:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:45.905 263406 INFO neutron.agent.dhcp.agent [None req-7c9cfc7d-febc-40dc-846c-5a189c8e6784 - - - - - -] DHCP configuration for ports {'7878efd7-90f1-41dd-b669-949b08145e13'} is completed
Dec 02 10:02:45 np0005541913.localdomain dnsmasq[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/addn_hosts - 1 addresses
Dec 02 10:02:45 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/host
Dec 02 10:02:45 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/opts
Dec 02 10:02:45 np0005541913.localdomain podman[308033]: 2025-12-02 10:02:45.996036716 +0000 UTC m=+0.057620777 container kill 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:02:46 np0005541913.localdomain ceph-mon[298296]: pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:02:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:02:46 np0005541913.localdomain podman[308052]: 2025-12-02 10:02:46.197197585 +0000 UTC m=+0.085038993 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc.)
Dec 02 10:02:46 np0005541913.localdomain podman[308052]: 2025-12-02 10:02:46.211214201 +0000 UTC m=+0.099055609 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 10:02:46 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:02:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:02:46.255 263406 INFO neutron.agent.dhcp.agent [None req-6dfbdc49-02de-4abd-aace-58ed827e1bbc - - - - - -] DHCP configuration for ports {'7878efd7-90f1-41dd-b669-949b08145e13'} is completed
Dec 02 10:02:46 np0005541913.localdomain podman[308054]: 2025-12-02 10:02:46.304659488 +0000 UTC m=+0.189926398 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:02:46 np0005541913.localdomain podman[308054]: 2025-12-02 10:02:46.316049234 +0000 UTC m=+0.201316124 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:02:46 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3473161244' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.122984) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669768123046, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1557, "num_deletes": 251, "total_data_size": 2356118, "memory_usage": 2499776, "flush_reason": "Manual Compaction"}
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669768134351, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 1534273, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18574, "largest_seqno": 20126, "table_properties": {"data_size": 1528251, "index_size": 3300, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13615, "raw_average_key_size": 20, "raw_value_size": 1515825, "raw_average_value_size": 2321, "num_data_blocks": 142, "num_entries": 653, "num_filter_entries": 653, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669660, "oldest_key_time": 1764669660, "file_creation_time": 1764669768, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 11391 microseconds, and 4530 cpu microseconds.
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.134397) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 1534273 bytes OK
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.134421) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.136803) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.136824) EVENT_LOG_v1 {"time_micros": 1764669768136817, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.136845) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 2348808, prev total WAL file size 2348808, number of live WAL files 2.
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.137570) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(1498KB)], [30(17MB)]
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669768137758, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 19816715, "oldest_snapshot_seqno": -1}
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 12014 keys, 17167464 bytes, temperature: kUnknown
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669768230034, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 17167464, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17099933, "index_size": 36390, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30085, "raw_key_size": 323655, "raw_average_key_size": 26, "raw_value_size": 16895879, "raw_average_value_size": 1406, "num_data_blocks": 1378, "num_entries": 12014, "num_filter_entries": 12014, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764669768, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.230367) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 17167464 bytes
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.232745) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.5 rd, 185.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 17.4 +0.0 blob) out(16.4 +0.0 blob), read-write-amplify(24.1) write-amplify(11.2) OK, records in: 12546, records dropped: 532 output_compression: NoCompression
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.232774) EVENT_LOG_v1 {"time_micros": 1764669768232761, "job": 16, "event": "compaction_finished", "compaction_time_micros": 92369, "compaction_time_cpu_micros": 44721, "output_level": 6, "num_output_files": 1, "total_output_size": 17167464, "num_input_records": 12546, "num_output_records": 12014, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669768233115, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669768235701, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.137494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.235806) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.235815) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.235818) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.235821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:02:48 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:02:48.235824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:02:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:49.618 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4886-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:49.620 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:49.620 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:02:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:49.620 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:49.647 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:49.647 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:50 np0005541913.localdomain ceph-mon[298296]: pgmap v86: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:51.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:51.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:02:52 np0005541913.localdomain ceph-mon[298296]: pgmap v87: 177 pgs: 177 active+clean; 145 MiB data, 706 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:02:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:52.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:52.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:02:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:52.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:02:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:02:52 np0005541913.localdomain podman[308100]: 2025-12-02 10:02:52.975692662 +0000 UTC m=+0.112327946 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:02:52 np0005541913.localdomain podman[308100]: 2025-12-02 10:02:52.99313321 +0000 UTC m=+0.129768474 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:02:53 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:02:54 np0005541913.localdomain ceph-mon[298296]: pgmap v88: 177 pgs: 177 active+clean; 152 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 105 KiB/s wr, 13 op/s
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.075 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.076 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.076 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.076 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.650 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.692 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.692 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.692 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.693 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:02:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:54.696 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:55 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2378891610' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:02:55 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3529410076' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.244 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.270 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.271 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.271 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.272 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.293 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.294 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.294 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.295 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.295 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:02:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:02:55 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/660414268' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.792 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:02:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.864 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:02:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:55.864 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.093 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.095 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11388MB free_disk=41.833717346191406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.095 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.096 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:02:56 np0005541913.localdomain ceph-mon[298296]: pgmap v89: 177 pgs: 177 active+clean; 152 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 105 KiB/s wr, 13 op/s
Dec 02 10:02:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/660414268' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:02:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:02:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:02:56 np0005541913.localdomain podman[308141]: 2025-12-02 10:02:56.446295843 +0000 UTC m=+0.082980197 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:02:56 np0005541913.localdomain podman[308141]: 2025-12-02 10:02:56.454981857 +0000 UTC m=+0.091666211 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:02:56 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:02:56 np0005541913.localdomain podman[308142]: 2025-12-02 10:02:56.504126035 +0000 UTC m=+0.139043273 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Dec 02 10:02:56 np0005541913.localdomain podman[308142]: 2025-12-02 10:02:56.551104796 +0000 UTC m=+0.186022104 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Dec 02 10:02:56 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.621 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.632 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.633 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.633 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:02:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:56.878 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.141 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.142 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.162 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.187 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_ABM,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.231 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:02:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e94 e94: 6 total, 6 up, 6 in
Dec 02 10:02:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:02:57 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2247793407' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.698 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.705 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.729 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.732 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.732 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.733 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:57.734 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 02 10:02:58 np0005541913.localdomain ceph-mon[298296]: pgmap v90: 177 pgs: 177 active+clean; 152 MiB data, 706 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 105 KiB/s wr, 13 op/s
Dec 02 10:02:58 np0005541913.localdomain ceph-mon[298296]: osdmap e94: 6 total, 6 up, 6 in
Dec 02 10:02:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2247793407' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:02:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:59.305 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:59.306 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:59.306 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:59.307 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:02:59 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e95 e95: 6 total, 6 up, 6 in
Dec 02 10:02:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:02:59.724 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:00 np0005541913.localdomain ceph-mon[298296]: pgmap v92: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 2.1 MiB/s wr, 79 op/s
Dec 02 10:03:00 np0005541913.localdomain ceph-mon[298296]: osdmap e95: 6 total, 6 up, 6 in
Dec 02 10:03:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1870214121' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:01.723 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:01 np0005541913.localdomain ceph-mon[298296]: pgmap v94: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 467 KiB/s rd, 2.5 MiB/s wr, 79 op/s
Dec 02 10:03:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:01.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:01.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 02 10:03:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:01.851 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 02 10:03:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:01.852 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/388746291' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1138207709' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:03.047 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:03.047 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:03.048 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:03 np0005541913.localdomain ceph-mon[298296]: pgmap v95: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 2.5 MiB/s wr, 190 op/s
Dec 02 10:03:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:03:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:03:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:03:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:03:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:03:04 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:03:04.330 2 INFO neutron.agent.securitygroups_rpc [None req-5c06dfad-89c5-4abc-a7de-de583f339085 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Security group member updated ['5c93e274-85ac-42d3-b949-bdb62e6b8c39']
Dec 02 10:03:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:04.421 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:03:03Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a59ca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a59ee0>], id=31de197b-ef56-4d2a-9fa2-293715a60004, ip_allocation=immediate, mac_address=fa:16:3e:8f:bb:bd, name=tempest-parent-17247491, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:02:35Z, description=, dns_domain=, id=62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-307256986-network, port_security_enabled=True, project_id=cccbafb2e3c343b2aab51714734bddce, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28433, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=182, status=ACTIVE, subnets=['9bd66995-30b3-4c53-b58b-ce3f8d5848fa'], tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, updated_at=2025-12-02T10:02:37Z, vlan_transparent=None, network_id=62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, port_security_enabled=True, project_id=cccbafb2e3c343b2aab51714734bddce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5c93e274-85ac-42d3-b949-bdb62e6b8c39'], standard_attr_id=324, status=DOWN, tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, updated_at=2025-12-02T10:03:03Z on network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f
Dec 02 10:03:04 np0005541913.localdomain dnsmasq[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/addn_hosts - 2 addresses
Dec 02 10:03:04 np0005541913.localdomain podman[308227]: 2025-12-02 10:03:04.646719008 +0000 UTC m=+0.060998718 container kill 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 02 10:03:04 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/host
Dec 02 10:03:04 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/opts
Dec 02 10:03:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:04.751 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1475536354' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:03:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1475536354' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:03:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:04.881 263406 INFO neutron.agent.dhcp.agent [None req-f663a97d-1aec-4c39-9524-3a24e53a4d1b - - - - - -] DHCP configuration for ports {'31de197b-ef56-4d2a-9fa2-293715a60004'} is completed
Dec 02 10:03:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e96 e96: 6 total, 6 up, 6 in
Dec 02 10:03:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:05 np0005541913.localdomain ceph-mon[298296]: pgmap v96: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 2.5 MiB/s wr, 190 op/s
Dec 02 10:03:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/4080154938' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:05 np0005541913.localdomain ceph-mon[298296]: osdmap e96: 6 total, 6 up, 6 in
Dec 02 10:03:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:03:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:03:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:03:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156107 "" "Go-http-client/1.1"
Dec 02 10:03:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:03:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19228 "" "Go-http-client/1.1"
Dec 02 10:03:07 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:03:07.116 2 INFO neutron.agent.securitygroups_rpc [None req-897aec69-e9e3-465e-bb92-a062d09dda9e 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Security group member updated ['5c93e274-85ac-42d3-b949-bdb62e6b8c39']
Dec 02 10:03:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:07.313 263406 INFO neutron.agent.linux.ip_lib [None req-40676a43-f972-46a9-93c8-453d0ba44b2e - - - - - -] Device tap07dfafb4-09 cannot be used as it has no MAC address
Dec 02 10:03:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:07.335 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:07 np0005541913.localdomain kernel: device tap07dfafb4-09 entered promiscuous mode
Dec 02 10:03:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:07Z|00086|binding|INFO|Claiming lport 07dfafb4-0984-469d-a49c-9faf3746b302 for this chassis.
Dec 02 10:03:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:07Z|00087|binding|INFO|07dfafb4-0984-469d-a49c-9faf3746b302: Claiming unknown
Dec 02 10:03:07 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669787.3468] manager: (tap07dfafb4-09): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Dec 02 10:03:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:07.346 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:07 np0005541913.localdomain systemd-udevd[308260]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:03:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:07.357 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-3673812c-f461-4e86-831f-b7a7821f4bda', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3673812c-f461-4e86-831f-b7a7821f4bda', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccbafb2e3c343b2aab51714734bddce', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ebc33b-05e4-4907-9bc1-7e563b7692f1, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=07dfafb4-0984-469d-a49c-9faf3746b302) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:03:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:07Z|00088|binding|INFO|Setting lport 07dfafb4-0984-469d-a49c-9faf3746b302 ovn-installed in OVS
Dec 02 10:03:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:07Z|00089|binding|INFO|Setting lport 07dfafb4-0984-469d-a49c-9faf3746b302 up in Southbound
Dec 02 10:03:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:07.360 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 07dfafb4-0984-469d-a49c-9faf3746b302 in datapath 3673812c-f461-4e86-831f-b7a7821f4bda bound to our chassis
Dec 02 10:03:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:07.363 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 50e76764-b6f4-47d9-9fe0-99e7b5813c75 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:03:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:07.363 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3673812c-f461-4e86-831f-b7a7821f4bda, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:03:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:07.364 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:07.364 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[4c52e8d4-16f2-46e2-8184-1f9d26aae144]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:07.398 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:07.432 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:07.461 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:07 np0005541913.localdomain ceph-mon[298296]: pgmap v98: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 2.5 MiB/s rd, 1.7 KiB/s wr, 111 op/s
Dec 02 10:03:08 np0005541913.localdomain podman[308316]: 2025-12-02 10:03:08.266040531 +0000 UTC m=+0.077841850 container create 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:03:08 np0005541913.localdomain systemd[1]: Started libpod-conmon-1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2.scope.
Dec 02 10:03:08 np0005541913.localdomain podman[308316]: 2025-12-02 10:03:08.219998855 +0000 UTC m=+0.031800174 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:03:08 np0005541913.localdomain systemd[1]: tmp-crun.pKIA95.mount: Deactivated successfully.
Dec 02 10:03:08 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:03:08 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b24a22bfd2247520c320aa8b36a4cd59aff7c93df00851a3bdf42877c37d8eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:03:08 np0005541913.localdomain podman[308316]: 2025-12-02 10:03:08.350642001 +0000 UTC m=+0.162443310 container init 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:03:08 np0005541913.localdomain podman[308316]: 2025-12-02 10:03:08.359392406 +0000 UTC m=+0.171193715 container start 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:03:08 np0005541913.localdomain dnsmasq[308334]: started, version 2.85 cachesize 150
Dec 02 10:03:08 np0005541913.localdomain dnsmasq[308334]: DNS service limited to local subnets
Dec 02 10:03:08 np0005541913.localdomain dnsmasq[308334]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:03:08 np0005541913.localdomain dnsmasq[308334]: warning: no upstream servers configured
Dec 02 10:03:08 np0005541913.localdomain dnsmasq-dhcp[308334]: DHCP, static leases only on 19.80.0.0, lease time 1d
Dec 02 10:03:08 np0005541913.localdomain dnsmasq[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/addn_hosts - 0 addresses
Dec 02 10:03:08 np0005541913.localdomain dnsmasq-dhcp[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/host
Dec 02 10:03:08 np0005541913.localdomain dnsmasq-dhcp[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/opts
Dec 02 10:03:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:08.424 263406 INFO neutron.agent.dhcp.agent [None req-3cee9568-4648-4e84-b0aa-c395d14e194f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:03:06Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a05610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b639d0>], id=40590dd1-9250-4409-a2d0-cd4f4774bfc8, ip_allocation=immediate, mac_address=fa:16:3e:51:01:78, name=tempest-subport-1284966936, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:03:04Z, description=, dns_domain=, id=3673812c-f461-4e86-831f-b7a7821f4bda, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-1030391115, port_security_enabled=True, project_id=cccbafb2e3c343b2aab51714734bddce, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36642, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=325, status=ACTIVE, subnets=['f4f4111e-f6c8-4f21-b1f7-bb1c4d497d49'], tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, updated_at=2025-12-02T10:03:05Z, vlan_transparent=None, network_id=3673812c-f461-4e86-831f-b7a7821f4bda, port_security_enabled=True, project_id=cccbafb2e3c343b2aab51714734bddce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5c93e274-85ac-42d3-b949-bdb62e6b8c39'], standard_attr_id=347, status=DOWN, tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, updated_at=2025-12-02T10:03:06Z on network 3673812c-f461-4e86-831f-b7a7821f4bda
Dec 02 10:03:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:08.542 263406 INFO neutron.agent.linux.ip_lib [None req-c3bd8ac9-86de-482c-b6ff-e04635e7f4ea - - - - - -] Device tapc4946b01-03 cannot be used as it has no MAC address
Dec 02 10:03:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:08.562 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:08 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 02 10:03:08 np0005541913.localdomain kernel: device tapc4946b01-03 entered promiscuous mode
Dec 02 10:03:08 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669788.5683] manager: (tapc4946b01-03): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Dec 02 10:03:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:08Z|00090|binding|INFO|Claiming lport c4946b01-0395-4a62-9a39-4286d5803bca for this chassis.
Dec 02 10:03:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:08Z|00091|binding|INFO|c4946b01-0395-4a62-9a39-4286d5803bca: Claiming unknown
Dec 02 10:03:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:08.570 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:08.575 263406 INFO neutron.agent.dhcp.agent [None req-35bec98c-b180-4377-82b8-1b4b7ca641aa - - - - - -] DHCP configuration for ports {'ba8757f7-1076-4bc0-8968-1084ffa48766'} is completed
Dec 02 10:03:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:08Z|00092|binding|INFO|Setting lport c4946b01-0395-4a62-9a39-4286d5803bca ovn-installed in OVS
Dec 02 10:03:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:08Z|00093|binding|INFO|Setting lport c4946b01-0395-4a62-9a39-4286d5803bca up in Southbound
Dec 02 10:03:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:08.583 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd048f19ff5fc47dc88162ef5f9cebe8b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51e42abf-8647-4013-9c62-778191c64ad0, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c4946b01-0395-4a62-9a39-4286d5803bca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:03:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:08.584 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:08.584 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c4946b01-0395-4a62-9a39-4286d5803bca in datapath 13bbad22-ab61-4b1f-849e-c651aa8f3297 bound to our chassis
Dec 02 10:03:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:08.586 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 13bbad22-ab61-4b1f-849e-c651aa8f3297 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:03:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:08.587 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[c67a5572-7a25-4d3f-9500-76a90bd85cb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:08.605 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:08.628 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:08 np0005541913.localdomain dnsmasq[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/addn_hosts - 1 addresses
Dec 02 10:03:08 np0005541913.localdomain podman[308363]: 2025-12-02 10:03:08.650675123 +0000 UTC m=+0.042769659 container kill 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:03:08 np0005541913.localdomain dnsmasq-dhcp[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/host
Dec 02 10:03:08 np0005541913.localdomain dnsmasq-dhcp[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/opts
Dec 02 10:03:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:08.653 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:09 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:09.182 263406 INFO neutron.agent.dhcp.agent [None req-393758a9-a6e3-41ab-a514-43b6b1d6d252 - - - - - -] DHCP configuration for ports {'40590dd1-9250-4409-a2d0-cd4f4774bfc8'} is completed
Dec 02 10:03:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:03:09 np0005541913.localdomain systemd[1]: tmp-crun.Zidxs0.mount: Deactivated successfully.
Dec 02 10:03:09 np0005541913.localdomain podman[308414]: 2025-12-02 10:03:09.391672397 +0000 UTC m=+0.089552814 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:03:09 np0005541913.localdomain podman[308414]: 2025-12-02 10:03:09.402835396 +0000 UTC m=+0.100715813 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:03:09 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:03:09 np0005541913.localdomain podman[308455]: 2025-12-02 10:03:09.539245527 +0000 UTC m=+0.065499328 container create 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:03:09 np0005541913.localdomain systemd[1]: Started libpod-conmon-77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7.scope.
Dec 02 10:03:09 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:03:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896dba9b1a38f0638159f863e9536c69068bcbb89b8facb5a357e5a5dc8cf960/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:03:09 np0005541913.localdomain podman[308455]: 2025-12-02 10:03:09.601567379 +0000 UTC m=+0.127821180 container init 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:03:09 np0005541913.localdomain podman[308455]: 2025-12-02 10:03:09.504564676 +0000 UTC m=+0.030818497 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:03:09 np0005541913.localdomain podman[308455]: 2025-12-02 10:03:09.609678407 +0000 UTC m=+0.135932208 container start 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:03:09 np0005541913.localdomain dnsmasq[308473]: started, version 2.85 cachesize 150
Dec 02 10:03:09 np0005541913.localdomain dnsmasq[308473]: DNS service limited to local subnets
Dec 02 10:03:09 np0005541913.localdomain dnsmasq[308473]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:03:09 np0005541913.localdomain dnsmasq[308473]: warning: no upstream servers configured
Dec 02 10:03:09 np0005541913.localdomain dnsmasq-dhcp[308473]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:03:09 np0005541913.localdomain dnsmasq[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/addn_hosts - 0 addresses
Dec 02 10:03:09 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/host
Dec 02 10:03:09 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/opts
Dec 02 10:03:09 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:09.728 263406 INFO neutron.agent.dhcp.agent [None req-2097c0e3-0c94-445c-a2fd-9bfd4cf2c44a - - - - - -] DHCP configuration for ports {'202be55f-4a2f-4e8a-884e-d4a72a4d525d'} is completed
Dec 02 10:03:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:09.781 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:09.940 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:10 np0005541913.localdomain ceph-mon[298296]: pgmap v99: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 1.5 KiB/s wr, 93 op/s
Dec 02 10:03:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:11 np0005541913.localdomain ceph-mon[298296]: pgmap v100: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 1.4 KiB/s wr, 89 op/s
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.712 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Acquiring lock "63092ab0-9432-4c74-933e-e9d5428e6162" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.712 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.732 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.807 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.808 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.816 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.817 281858 INFO nova.compute.claims [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Claim successful on node np0005541913.localdomain
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.821 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:12.923 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:03:13 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2025890447' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.379 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.386 281858 DEBUG nova.compute.provider_tree [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.401 281858 DEBUG nova.scheduler.client.report [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.423 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.424 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.466 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.467 281858 DEBUG nova.network.neutron [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.478 281858 INFO nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.494 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.577 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.578 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.579 281858 INFO nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Creating image(s)
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.613 281858 DEBUG nova.storage.rbd_utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] rbd image 63092ab0-9432-4c74-933e-e9d5428e6162_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.646 281858 DEBUG nova.storage.rbd_utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] rbd image 63092ab0-9432-4c74-933e-e9d5428e6162_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.683 281858 DEBUG nova.storage.rbd_utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] rbd image 63092ab0-9432-4c74-933e-e9d5428e6162_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.687 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Acquiring lock "43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.688 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.701 281858 WARNING oslo_policy.policy [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.701 281858 WARNING oslo_policy.policy [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.703 281858 DEBUG nova.policy [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f523e6d03743daa3ff6f5bc7122d00', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cccbafb2e3c343b2aab51714734bddce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 02 10:03:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:13.738 281858 DEBUG nova.virt.libvirt.imagebackend [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Image locations are: [{'url': 'rbd://c7c8e171-a193-56fb-95fa-8879fcfa7074/images/d85e840d-fa56-497b-b5bd-b49584d3e97a/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://c7c8e171-a193-56fb-95fa-8879fcfa7074/images/d85e840d-fa56-497b-b5bd-b49584d3e97a/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Dec 02 10:03:14 np0005541913.localdomain ceph-mon[298296]: pgmap v101: 177 pgs: 177 active+clean; 225 MiB data, 868 MiB used, 41 GiB / 42 GiB avail; 351 KiB/s rd, 2.6 MiB/s wr, 71 op/s
Dec 02 10:03:14 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2025890447' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:14.234 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005541913.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:03:03Z, description=, device_id=63092ab0-9432-4c74-933e-e9d5428e6162, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a61d60>], dns_domain=, dns_name=tempest-liveautoblockmigrationv225test-server-861747463, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a612e0>], id=31de197b-ef56-4d2a-9fa2-293715a60004, ip_allocation=immediate, mac_address=fa:16:3e:8f:bb:bd, name=tempest-parent-17247491, network_id=62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, port_security_enabled=True, project_id=cccbafb2e3c343b2aab51714734bddce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['5c93e274-85ac-42d3-b949-bdb62e6b8c39'], standard_attr_id=324, status=DOWN, tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a13400>], trunk_id=5b1dd84a-69f3-4e17-8604-49965c03b89c, updated_at=2025-12-02T10:03:13Z on network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f
Dec 02 10:03:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:03:14 np0005541913.localdomain podman[308564]: 2025-12-02 10:03:14.456758347 +0000 UTC m=+0.090018297 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 02 10:03:14 np0005541913.localdomain podman[308564]: 2025-12-02 10:03:14.486911586 +0000 UTC m=+0.120171536 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:03:14 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.511 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:14 np0005541913.localdomain dnsmasq[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/addn_hosts - 2 addresses
Dec 02 10:03:14 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/host
Dec 02 10:03:14 np0005541913.localdomain podman[308573]: 2025-12-02 10:03:14.521384241 +0000 UTC m=+0.122960900 container kill 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:03:14 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/opts
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.589 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc.part --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.590 281858 DEBUG nova.virt.images [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] d85e840d-fa56-497b-b5bd-b49584d3e97a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.591 281858 DEBUG nova.privsep.utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.591 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc.part /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.674 281858 DEBUG nova.network.neutron [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Successfully updated port: 31de197b-ef56-4d2a-9fa2-293715a60004 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.694 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Acquiring lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.694 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Acquired lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.695 281858 DEBUG nova.network.neutron [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 02 10:03:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:14.769 263406 INFO neutron.agent.dhcp.agent [None req-693c441f-0fd4-4c3e-9741-627c215fdfa7 - - - - - -] DHCP configuration for ports {'31de197b-ef56-4d2a-9fa2-293715a60004'} is completed
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.781 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc.part /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc.converted" returned: 0 in 0.190s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.784 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.814 281858 DEBUG nova.network.neutron [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.817 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.819 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.856 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc.converted --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.857 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.881 281858 DEBUG nova.storage.rbd_utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] rbd image 63092ab0-9432-4c74-933e-e9d5428e6162_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:03:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:14.884 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc 63092ab0-9432-4c74-933e-e9d5428e6162_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:14.944 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:03:13Z, description=, device_id=c633bc2a-d8d8-4d52-951c-727821eef4f5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089c4490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089c40a0>], id=e956d78f-d33b-49fb-a452-eaed9391e7d2, ip_allocation=immediate, mac_address=fa:16:3e:54:ce:16, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:03:06Z, description=, dns_domain=, id=13bbad22-ab61-4b1f-849e-c651aa8f3297, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-1859087569-network, port_security_enabled=True, project_id=d048f19ff5fc47dc88162ef5f9cebe8b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25848, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=342, status=ACTIVE, subnets=['a62c0502-5155-4c20-aaad-4cc8bce976da'], tags=[], tenant_id=d048f19ff5fc47dc88162ef5f9cebe8b, updated_at=2025-12-02T10:03:07Z, vlan_transparent=None, network_id=13bbad22-ab61-4b1f-849e-c651aa8f3297, port_security_enabled=False, project_id=d048f19ff5fc47dc88162ef5f9cebe8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=411, status=DOWN, tags=[], tenant_id=d048f19ff5fc47dc88162ef5f9cebe8b, updated_at=2025-12-02T10:03:14Z on network 13bbad22-ab61-4b1f-849e-c651aa8f3297
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.187 281858 DEBUG nova.network.neutron [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Updating instance_info_cache with network_info: [{"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.265 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Releasing lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.266 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Instance network_info: |[{"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 02 10:03:15 np0005541913.localdomain dnsmasq[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/addn_hosts - 1 addresses
Dec 02 10:03:15 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/host
Dec 02 10:03:15 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/opts
Dec 02 10:03:15 np0005541913.localdomain systemd[1]: tmp-crun.ZoLt0o.mount: Deactivated successfully.
Dec 02 10:03:15 np0005541913.localdomain podman[308672]: 2025-12-02 10:03:15.356786889 +0000 UTC m=+0.070513383 container kill 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.356 281858 DEBUG nova.compute.manager [req-f549a86e-f608-422e-9010-2a1262a2e085 req-c8977d39-15cb-4501-941e-9b4a06d961ec dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-changed-31de197b-ef56-4d2a-9fa2-293715a60004 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.357 281858 DEBUG nova.compute.manager [req-f549a86e-f608-422e-9010-2a1262a2e085 req-c8977d39-15cb-4501-941e-9b4a06d961ec dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Refreshing instance network info cache due to event network-changed-31de197b-ef56-4d2a-9fa2-293715a60004. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.357 281858 DEBUG oslo_concurrency.lockutils [req-f549a86e-f608-422e-9010-2a1262a2e085 req-c8977d39-15cb-4501-941e-9b4a06d961ec dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.358 281858 DEBUG oslo_concurrency.lockutils [req-f549a86e-f608-422e-9010-2a1262a2e085 req-c8977d39-15cb-4501-941e-9b4a06d961ec dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquired lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.358 281858 DEBUG nova.network.neutron [req-f549a86e-f608-422e-9010-2a1262a2e085 req-c8977d39-15cb-4501-941e-9b4a06d961ec dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Refreshing network info cache for port 31de197b-ef56-4d2a-9fa2-293715a60004 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.611 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/43cc3eae4d6ab33a15526950b68aad5ba6c1c8fc 63092ab0-9432-4c74-933e-e9d5428e6162_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.727s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.713 281858 DEBUG nova.storage.rbd_utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] resizing rbd image 63092ab0-9432-4c74-933e-e9d5428e6162_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Dec 02 10:03:15 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:15.850 263406 INFO neutron.agent.dhcp.agent [None req-a4b435d7-c0b1-455d-8992-3323068c46ee - - - - - -] DHCP configuration for ports {'e956d78f-d33b-49fb-a452-eaed9391e7d2'} is completed
Dec 02 10:03:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.885 281858 DEBUG nova.objects.instance [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lazy-loading 'migration_context' on Instance uuid 63092ab0-9432-4c74-933e-e9d5428e6162 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.908 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.909 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Ensure instance console log exists: /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.909 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.910 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.910 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.914 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Start _get_guest_xml network_info=[{"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T10:01:53Z,direct_url=<?>,disk_format='qcow2',id=d85e840d-fa56-497b-b5bd-b49584d3e97a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e2d97696ab6749899bb8ba5ce29a3de2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T10:01:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'size': 0, 'encryption_options': None, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': 'd85e840d-fa56-497b-b5bd-b49584d3e97a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.920 281858 WARNING nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.923 281858 DEBUG nova.virt.libvirt.host [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Searching host: 'np0005541913.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.924 281858 DEBUG nova.virt.libvirt.host [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.926 281858 DEBUG nova.virt.libvirt.host [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Searching host: 'np0005541913.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.926 281858 DEBUG nova.virt.libvirt.host [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.927 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.927 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T10:01:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82beb986-6d20-42dc-b738-1cef87dee30f',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T10:01:53Z,direct_url=<?>,disk_format='qcow2',id=d85e840d-fa56-497b-b5bd-b49584d3e97a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e2d97696ab6749899bb8ba5ce29a3de2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T10:01:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.928 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.929 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.929 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.929 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.930 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.930 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.931 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.931 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.931 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.932 281858 DEBUG nova.virt.hardware [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 02 10:03:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:15.937 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:16 np0005541913.localdomain ceph-mon[298296]: pgmap v102: 177 pgs: 177 active+clean; 225 MiB data, 868 MiB used, 41 GiB / 42 GiB avail; 351 KiB/s rd, 2.6 MiB/s wr, 71 op/s
Dec 02 10:03:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:03:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:03:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:03:16 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3290176110' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.440 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:16 np0005541913.localdomain podman[308787]: 2025-12-02 10:03:16.451332761 +0000 UTC m=+0.090765097 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:03:16 np0005541913.localdomain podman[308787]: 2025-12-02 10:03:16.467099183 +0000 UTC m=+0.106531509 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:03:16 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.484 281858 DEBUG nova.storage.rbd_utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] rbd image 63092ab0-9432-4c74-933e-e9d5428e6162_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.489 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.508 281858 DEBUG nova.network.neutron [req-f549a86e-f608-422e-9010-2a1262a2e085 req-c8977d39-15cb-4501-941e-9b4a06d961ec dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Updated VIF entry in instance network info cache for port 31de197b-ef56-4d2a-9fa2-293715a60004. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.509 281858 DEBUG nova.network.neutron [req-f549a86e-f608-422e-9010-2a1262a2e085 req-c8977d39-15cb-4501-941e-9b4a06d961ec dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Updating instance_info_cache with network_info: [{"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:03:16 np0005541913.localdomain podman[308786]: 2025-12-02 10:03:16.422755053 +0000 UTC m=+0.066380081 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.531 281858 DEBUG oslo_concurrency.lockutils [req-f549a86e-f608-422e-9010-2a1262a2e085 req-c8977d39-15cb-4501-941e-9b4a06d961ec dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Releasing lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:03:16 np0005541913.localdomain podman[308786]: 2025-12-02 10:03:16.556012709 +0000 UTC m=+0.199637737 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7)
Dec 02 10:03:16 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:03:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:03:16 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4171051508' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.915 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.918 281858 DEBUG nova.virt.libvirt.vif [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-02T10:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-861747463',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005541913.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-861747463',id=7,image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005541913.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccbafb2e3c343b2aab51714734bddce',ramdisk_id='',reservation_id='r-sf2jj0i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-5814605',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-5814605-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T10:03:13Z,user_data=None,user_id='60f523e6d03743daa3ff6f5bc7122d00',uuid=63092ab0-9432-4c74-933e-e9d5428e6162,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.919 281858 DEBUG nova.network.os_vif_util [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Converting VIF {"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.920 281858 DEBUG nova.network.os_vif_util [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:bb:bd,bridge_name='br-int',has_traffic_filtering=True,id=31de197b-ef56-4d2a-9fa2-293715a60004,network=Network(62df5f27-c8d9-4d79-9ad6-2f32e63bf47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap31de197b-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 10:03:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:16.923 281858 DEBUG nova.objects.instance [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lazy-loading 'pci_devices' on Instance uuid 63092ab0-9432-4c74-933e-e9d5428e6162 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:03:17 np0005541913.localdomain sudo[308870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:03:17 np0005541913.localdomain sudo[308870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:03:17 np0005541913.localdomain sudo[308870]: pam_unix(sudo:session): session closed for user root
Dec 02 10:03:17 np0005541913.localdomain sudo[308888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:03:17 np0005541913.localdomain sudo[308888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:03:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3290176110' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:03:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/4171051508' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:03:17 np0005541913.localdomain sudo[308888]: pam_unix(sudo:session): session closed for user root
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.885 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] End _get_guest_xml xml=<domain type="kvm">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <uuid>63092ab0-9432-4c74-933e-e9d5428e6162</uuid>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <name>instance-00000007</name>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <memory>131072</memory>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <vcpu>1</vcpu>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <metadata>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-861747463</nova:name>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <nova:creationTime>2025-12-02 10:03:15</nova:creationTime>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <nova:flavor name="m1.nano">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <nova:memory>128</nova:memory>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <nova:disk>1</nova:disk>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <nova:swap>0</nova:swap>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <nova:vcpus>1</nova:vcpus>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       </nova:flavor>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <nova:owner>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <nova:user uuid="60f523e6d03743daa3ff6f5bc7122d00">tempest-LiveAutoBlockMigrationV225Test-5814605-project-member</nova:user>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <nova:project uuid="cccbafb2e3c343b2aab51714734bddce">tempest-LiveAutoBlockMigrationV225Test-5814605</nova:project>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       </nova:owner>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <nova:root type="image" uuid="d85e840d-fa56-497b-b5bd-b49584d3e97a"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <nova:ports>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <nova:port uuid="31de197b-ef56-4d2a-9fa2-293715a60004">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         </nova:port>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       </nova:ports>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </nova:instance>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   </metadata>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <sysinfo type="smbios">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <system>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <entry name="manufacturer">RDO</entry>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <entry name="product">OpenStack Compute</entry>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <entry name="serial">63092ab0-9432-4c74-933e-e9d5428e6162</entry>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <entry name="uuid">63092ab0-9432-4c74-933e-e9d5428e6162</entry>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <entry name="family">Virtual Machine</entry>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </system>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   </sysinfo>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <os>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <boot dev="hd"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <smbios mode="sysinfo"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   </os>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <features>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <acpi/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <apic/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <vmcoreinfo/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   </features>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <clock offset="utc">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <timer name="hpet" present="no"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   </clock>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <cpu mode="host-model" match="exact">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   </cpu>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   <devices>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <disk type="network" device="disk">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <driver type="raw" cache="none"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <source protocol="rbd" name="vms/63092ab0-9432-4c74-933e-e9d5428e6162_disk">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.103" port="6789"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.104" port="6789"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.105" port="6789"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       </source>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <auth username="openstack">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <secret type="ceph" uuid="c7c8e171-a193-56fb-95fa-8879fcfa7074"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       </auth>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <target dev="vda" bus="virtio"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <disk type="network" device="cdrom">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <driver type="raw" cache="none"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <source protocol="rbd" name="vms/63092ab0-9432-4c74-933e-e9d5428e6162_disk.config">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.103" port="6789"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.104" port="6789"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.105" port="6789"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       </source>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <auth username="openstack">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:         <secret type="ceph" uuid="c7c8e171-a193-56fb-95fa-8879fcfa7074"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       </auth>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <target dev="sda" bus="sata"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <interface type="ethernet">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <mac address="fa:16:3e:8f:bb:bd"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <model type="virtio"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <mtu size="1442"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <target dev="tap31de197b-ef"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </interface>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <serial type="pty">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <log file="/var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162/console.log" append="off"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </serial>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <video>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <model type="virtio"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </video>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <input type="tablet" bus="usb"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <rng model="virtio">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <backend model="random">/dev/urandom</backend>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </rng>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <controller type="usb" index="0"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     <memballoon model="virtio">
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:       <stats period="10"/>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:     </memballoon>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:   </devices>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: </domain>
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.890 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Preparing to wait for external event network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.890 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Acquiring lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.891 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.892 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.894 281858 DEBUG nova.virt.libvirt.vif [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-02T10:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-861747463',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005541913.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-861747463',id=7,image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005541913.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccbafb2e3c343b2aab51714734bddce',ramdisk_id='',reservation_id='r-sf2jj0i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-5814605',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-5814605-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T10:03:13Z,user_data=None,user_id='60f523e6d03743daa3ff6f5bc7122d00',uuid=63092ab0-9432-4c74-933e-e9d5428e6162,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.894 281858 DEBUG nova.network.os_vif_util [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Converting VIF {"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.896 281858 DEBUG nova.network.os_vif_util [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:bb:bd,bridge_name='br-int',has_traffic_filtering=True,id=31de197b-ef56-4d2a-9fa2-293715a60004,network=Network(62df5f27-c8d9-4d79-9ad6-2f32e63bf47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap31de197b-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.897 281858 DEBUG os_vif [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:bb:bd,bridge_name='br-int',has_traffic_filtering=True,id=31de197b-ef56-4d2a-9fa2-293715a60004,network=Network(62df5f27-c8d9-4d79-9ad6-2f32e63bf47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap31de197b-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.898 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.899 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.900 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.905 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.906 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31de197b-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.907 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31de197b-ef, col_values=(('external_ids', {'iface-id': '31de197b-ef56-4d2a-9fa2-293715a60004', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:bb:bd', 'vm-uuid': '63092ab0-9432-4c74-933e-e9d5428e6162'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.915 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.918 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:17.919 281858 INFO os_vif [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:bb:bd,bridge_name='br-int',has_traffic_filtering=True,id=31de197b-ef56-4d2a-9fa2-293715a60004,network=Network(62df5f27-c8d9-4d79-9ad6-2f32e63bf47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap31de197b-ef')
Dec 02 10:03:18 np0005541913.localdomain sudo[308939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:03:18 np0005541913.localdomain sudo[308939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:03:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:18.086 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 02 10:03:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:18.087 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 02 10:03:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:18.087 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] No VIF found with MAC fa:16:3e:8f:bb:bd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 02 10:03:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:18.088 281858 INFO nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Using config drive
Dec 02 10:03:18 np0005541913.localdomain sudo[308939]: pam_unix(sudo:session): session closed for user root
Dec 02 10:03:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:18.131 281858 DEBUG nova.storage.rbd_utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] rbd image 63092ab0-9432-4c74-933e-e9d5428e6162_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:03:18 np0005541913.localdomain ceph-mon[298296]: pgmap v103: 177 pgs: 177 active+clean; 225 MiB data, 868 MiB used, 41 GiB / 42 GiB avail; 314 KiB/s rd, 2.3 MiB/s wr, 64 op/s
Dec 02 10:03:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:03:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:03:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:03:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.012 281858 INFO nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Creating config drive at /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162/disk.config
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.019 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpta2cs2cy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.149 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpta2cs2cy" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.201 281858 DEBUG nova.storage.rbd_utils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] rbd image 63092ab0-9432-4c74-933e-e9d5428e6162_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.206 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162/disk.config 63092ab0-9432-4c74-933e-e9d5428e6162_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.433 281858 DEBUG oslo_concurrency.processutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162/disk.config 63092ab0-9432-4c74-933e-e9d5428e6162_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.434 281858 INFO nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Deleting local config drive /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162/disk.config because it was imported into RBD.
Dec 02 10:03:19 np0005541913.localdomain systemd[1]: Started libvirt secret daemon.
Dec 02 10:03:19 np0005541913.localdomain kernel: device tap31de197b-ef entered promiscuous mode
Dec 02 10:03:19 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669799.5414] manager: (tap31de197b-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Dec 02 10:03:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:19Z|00094|binding|INFO|Claiming lport 31de197b-ef56-4d2a-9fa2-293715a60004 for this chassis.
Dec 02 10:03:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:19Z|00095|binding|INFO|31de197b-ef56-4d2a-9fa2-293715a60004: Claiming fa:16:3e:8f:bb:bd 10.100.0.4
Dec 02 10:03:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:19Z|00096|binding|INFO|Claiming lport 40590dd1-9250-4409-a2d0-cd4f4774bfc8 for this chassis.
Dec 02 10:03:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:19Z|00097|binding|INFO|40590dd1-9250-4409-a2d0-cd4f4774bfc8: Claiming fa:16:3e:51:01:78 19.80.0.123
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.546 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:19 np0005541913.localdomain systemd-udevd[309044]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:03:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:19Z|00098|binding|INFO|Setting lport 31de197b-ef56-4d2a-9fa2-293715a60004 ovn-installed in OVS
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.560 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.562 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:19 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669799.5701] device (tap31de197b-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 02 10:03:19 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669799.5708] device (tap31de197b-ef): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 02 10:03:19 np0005541913.localdomain systemd-machined[84262]: New machine qemu-3-instance-00000007.
Dec 02 10:03:19 np0005541913.localdomain systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.857 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.925 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764669799.9243534, 63092ab0-9432-4c74-933e-e9d5428e6162 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:03:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:19.925 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] VM Started (Lifecycle Event)
Dec 02 10:03:20 np0005541913.localdomain ceph-mon[298296]: pgmap v104: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 3.9 MiB/s wr, 94 op/s
Dec 02 10:03:20 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:20Z|00099|binding|INFO|Setting lport 31de197b-ef56-4d2a-9fa2-293715a60004 up in Southbound
Dec 02 10:03:20 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:20Z|00100|binding|INFO|Setting lport 40590dd1-9250-4409-a2d0-cd4f4774bfc8 up in Southbound
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.500 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:bb:bd 10.100.0.4'], port_security=['fa:16:3e:8f:bb:bd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-17247491', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '63092ab0-9432-4c74-933e-e9d5428e6162', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-17247491', 'neutron:project_id': 'cccbafb2e3c343b2aab51714734bddce', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5c93e274-85ac-42d3-b949-bdb62e6b8c39', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c5273a4-e474-4c2c-a95a-a522e1a174bd, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=31de197b-ef56-4d2a-9fa2-293715a60004) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.503 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:01:78 19.80.0.123'], port_security=['fa:16:3e:51:01:78 19.80.0.123'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['31de197b-ef56-4d2a-9fa2-293715a60004'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1284966936', 'neutron:cidrs': '19.80.0.123/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3673812c-f461-4e86-831f-b7a7821f4bda', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1284966936', 'neutron:project_id': 'cccbafb2e3c343b2aab51714734bddce', 'neutron:revision_number': '3', 'neutron:security_group_ids': '5c93e274-85ac-42d3-b949-bdb62e6b8c39', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=23ebc33b-05e4-4907-9bc1-7e563b7692f1, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=40590dd1-9250-4409-a2d0-cd4f4774bfc8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.505 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 31de197b-ef56-4d2a-9fa2-293715a60004 in datapath 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f bound to our chassis
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.508 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port b22990f2-0db4-407c-a5b6-65e7991152d1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.508 160221 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.519 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4cebe5-de32-4d37-bf99-7236371f5ec9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.520 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62df5f27-c1 in ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.523 160340 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62df5f27-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.523 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[806350cd-6559-4eae-8a97-f295bbf61dff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.524 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ceef0be1-80cb-4d64-b23e-b72d9dfda56e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.536 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[d1fc5f82-7ac1-4149-b45f-2bbfaad4e413]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.551 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[c6acb99b-6351-4904-8f14-c12fda713ac3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.579 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[67019f57-b575-4bc7-a1c9-b5abf52447b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.586 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[42f90b64-6ffb-48c0-bb98-298adbaff0ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669800.5875] manager: (tap62df5f27-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.620 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa63576-fc78-401d-8323-c4e776e26501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.625 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[34b01d2b-6976-4352-bfe2-8980825fc81f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap62df5f27-c1: link becomes ready
Dec 02 10:03:20 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap62df5f27-c0: link becomes ready
Dec 02 10:03:20 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669800.6540] device (tap62df5f27-c0): carrier: link connected
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.660 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[bc6b2f52-874e-4887-99cb-8e729ab32ae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.677 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5b56ad1c-c3fc-4754-aa3a-22cc146f291a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62df5f27-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:73:df:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1196280, 'reachable_time': 26916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309123, 'error': None, 'target': 'ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.696 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[b83b9017-f167-4b73-a4bd-6f55145bc071]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:df9c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1196280, 'tstamp': 1196280}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309124, 'error': None, 'target': 'ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.707 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.713 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764669799.9246593, 63092ab0-9432-4c74-933e-e9d5428e6162 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.714 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] VM Paused (Lifecycle Event)
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.715 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6960b323-9a0a-4040-88b3-4950e676f025]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62df5f27-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:73:df:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1196280, 'reachable_time': 26916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309125, 'error': None, 'target': 'ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.745 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[747ed6c1-8065-4515-929a-3729ff486855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.800 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[f3450662-2baf-4490-ab5c-3b87cd5a1191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.802 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62df5f27-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.802 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.803 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62df5f27-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.803 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:03:20 np0005541913.localdomain kernel: device tap62df5f27-c0 entered promiscuous mode
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.806 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.809 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.810 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62df5f27-c0, col_values=(('external_ids', {'iface-id': 'ea045be8-e121-4ff5-bb82-2a757b7ce736'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:20 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:20Z|00101|binding|INFO|Releasing lport ea045be8-e121-4ff5-bb82-2a757b7ce736 from this chassis (sb_readonly=0)
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.812 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.815 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.822 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.823 160221 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.824 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[bd666bd3-a5c9-4d25-9cdb-e6e09ed04a0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.825 160221 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: global
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     log         /dev/log local0 debug
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     log-tag     haproxy-metadata-proxy-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     user        root
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     group       root
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     maxconn     1024
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     pidfile     /var/lib/neutron/external/pids/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f.pid.haproxy
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     daemon
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: defaults
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     log global
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     mode http
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     option httplog
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     option dontlognull
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     option http-server-close
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     option forwardfor
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     retries                 3
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-request    30s
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout connect         30s
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout client          32s
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout server          32s
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-keep-alive 30s
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: listen listener
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     bind 169.254.169.254:80
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:     http-request add-header X-OVN-Network-ID 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 02 10:03:20 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:20.826 160221 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'env', 'PROCESS_TAG=haproxy-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 02 10:03:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:20.870 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.078 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:21.346 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:03:13Z, description=, device_id=c633bc2a-d8d8-4d52-951c-727821eef4f5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a0f5b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a0fd90>], id=e956d78f-d33b-49fb-a452-eaed9391e7d2, ip_allocation=immediate, mac_address=fa:16:3e:54:ce:16, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:03:06Z, description=, dns_domain=, id=13bbad22-ab61-4b1f-849e-c651aa8f3297, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-1859087569-network, port_security_enabled=True, project_id=d048f19ff5fc47dc88162ef5f9cebe8b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25848, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=342, status=ACTIVE, subnets=['a62c0502-5155-4c20-aaad-4cc8bce976da'], tags=[], tenant_id=d048f19ff5fc47dc88162ef5f9cebe8b, updated_at=2025-12-02T10:03:07Z, vlan_transparent=None, network_id=13bbad22-ab61-4b1f-849e-c651aa8f3297, port_security_enabled=False, project_id=d048f19ff5fc47dc88162ef5f9cebe8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=411, status=DOWN, tags=[], tenant_id=d048f19ff5fc47dc88162ef5f9cebe8b, updated_at=2025-12-02T10:03:14Z on network 13bbad22-ab61-4b1f-849e-c651aa8f3297
Dec 02 10:03:21 np0005541913.localdomain podman[309158]: 
Dec 02 10:03:21 np0005541913.localdomain podman[309158]: 2025-12-02 10:03:21.371991104 +0000 UTC m=+0.098970437 container create fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:03:21 np0005541913.localdomain systemd[1]: Started libpod-conmon-fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03.scope.
Dec 02 10:03:21 np0005541913.localdomain podman[309158]: 2025-12-02 10:03:21.321864508 +0000 UTC m=+0.048844322 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 10:03:21 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:03:21 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88e44ad48351df4bc5cf9273b4853724ba68f5d6925b7196bceece1b80907f57/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:03:21 np0005541913.localdomain podman[309158]: 2025-12-02 10:03:21.474353671 +0000 UTC m=+0.201333014 container init fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:03:21 np0005541913.localdomain podman[309158]: 2025-12-02 10:03:21.48478618 +0000 UTC m=+0.211765523 container start fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:03:21 np0005541913.localdomain neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f[309172]: [NOTICE]   (309185) : New worker (309190) forked
Dec 02 10:03:21 np0005541913.localdomain neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f[309172]: [NOTICE]   (309185) : Loading success.
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.546 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 40590dd1-9250-4409-a2d0-cd4f4774bfc8 in datapath 3673812c-f461-4e86-831f-b7a7821f4bda unbound from our chassis
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.551 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 50e76764-b6f4-47d9-9fe0-99e7b5813c75 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.551 160221 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3673812c-f461-4e86-831f-b7a7821f4bda
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.559 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[408c3dd7-de6c-45de-9396-ba0244090ed2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.560 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3673812c-f1 in ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.562 160340 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3673812c-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.562 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[17b97fa5-5515-4206-994c-5fc3cfa05ce6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.563 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[72a13be5-0734-4b3a-bdc8-a7209988e446]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.574 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[033a0d74-28f9-4b94-aae9-130414101301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.587 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[9b566ed3-a7a9-4694-a311-3497afd34049]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.603 281858 DEBUG nova.compute.manager [req-d4c037f9-67dc-4a02-94d2-2aaebd6fcf63 req-10a287b8-19a5-41a5-af53-a531ec46647f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.603 281858 DEBUG oslo_concurrency.lockutils [req-d4c037f9-67dc-4a02-94d2-2aaebd6fcf63 req-10a287b8-19a5-41a5-af53-a531ec46647f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.606 281858 DEBUG oslo_concurrency.lockutils [req-d4c037f9-67dc-4a02-94d2-2aaebd6fcf63 req-10a287b8-19a5-41a5-af53-a531ec46647f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.606 281858 DEBUG oslo_concurrency.lockutils [req-d4c037f9-67dc-4a02-94d2-2aaebd6fcf63 req-10a287b8-19a5-41a5-af53-a531ec46647f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.607 281858 DEBUG nova.compute.manager [req-d4c037f9-67dc-4a02-94d2-2aaebd6fcf63 req-10a287b8-19a5-41a5-af53-a531ec46647f dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Processing event network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.607 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.612 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.615 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764669801.61529, 63092ab0-9432-4c74-933e-e9d5428e6162 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.615 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] VM Resumed (Lifecycle Event)
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.617 281858 INFO nova.virt.libvirt.driver [-] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Instance spawned successfully.
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.617 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.617 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9a748b-9912-4f33-afe1-4d3dc23bf593]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.623 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[dce52d7c-abc6-4b52-ac3e-9ef0ffd3a82a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669801.6249] manager: (tap3673812c-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Dec 02 10:03:21 np0005541913.localdomain systemd-udevd[309110]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:03:21 np0005541913.localdomain podman[309203]: 2025-12-02 10:03:21.640099449 +0000 UTC m=+0.079835964 container kill 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:03:21 np0005541913.localdomain dnsmasq[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/addn_hosts - 1 addresses
Dec 02 10:03:21 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/host
Dec 02 10:03:21 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/opts
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.648 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.654 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.655 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.655 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.656 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.656 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.657 281858 DEBUG nova.virt.libvirt.driver [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.661 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.662 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[59a0beef-f567-4298-80b5-d5576f8e438b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.666 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[cc51efb9-65f2-4b62-8ccd-f5a36a16d235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669801.6895] device (tap3673812c-f0): carrier: link connected
Dec 02 10:03:21 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap3673812c-f1: link becomes ready
Dec 02 10:03:21 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap3673812c-f0: link becomes ready
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.690 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.695 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[57a60cc0-7558-4482-b792-54a4b40de722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.714 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[fed0bc11-6d8f-4fdc-a3a4-6f997f8eb4fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3673812c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e1:13:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1196383, 'reachable_time': 44383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309229, 'error': None, 'target': 'ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.719 281858 INFO nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Took 8.14 seconds to spawn the instance on the hypervisor.
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.720 281858 DEBUG nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.730 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[619de326-974d-4695-adcc-664020a969e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee1:13c5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1196383, 'tstamp': 1196383}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309231, 'error': None, 'target': 'ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.743 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[984ac655-9156-4709-bc86-dd8060ca3e0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3673812c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e1:13:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1196383, 'reachable_time': 44383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309234, 'error': None, 'target': 'ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.770 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[447801d1-36ce-4f7a-9fd1-981cbc8a7008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ceph-mon[298296]: pgmap v105: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 3.9 MiB/s wr, 93 op/s
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.788 281858 INFO nova.compute.manager [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Took 9.01 seconds to build instance.
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.818 281858 DEBUG oslo_concurrency.lockutils [None req-3fb4799c-0c3c-4bc7-b583-6f253e51fc78 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.822 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[d63ed11a-57f3-4a53-923c-fe48a0d59675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.823 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3673812c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.823 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.824 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3673812c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.826 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:21 np0005541913.localdomain kernel: device tap3673812c-f0 entered promiscuous mode
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.830 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3673812c-f0, col_values=(('external_ids', {'iface-id': 'ba8757f7-1076-4bc0-8968-1084ffa48766'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:21 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:21Z|00102|binding|INFO|Releasing lport ba8757f7-1076-4bc0-8968-1084ffa48766 from this chassis (sb_readonly=0)
Dec 02 10:03:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:21.842 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.843 160221 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3673812c-f461-4e86-831f-b7a7821f4bda.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3673812c-f461-4e86-831f-b7a7821f4bda.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.844 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[f3888fd0-2042-4227-a9b8-3b53bd19d5aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.845 160221 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: global
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     log         /dev/log local0 debug
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     log-tag     haproxy-metadata-proxy-3673812c-f461-4e86-831f-b7a7821f4bda
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     user        root
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     group       root
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     maxconn     1024
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     pidfile     /var/lib/neutron/external/pids/3673812c-f461-4e86-831f-b7a7821f4bda.pid.haproxy
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     daemon
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: defaults
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     log global
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     mode http
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     option httplog
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     option dontlognull
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     option http-server-close
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     option forwardfor
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     retries                 3
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-request    30s
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout connect         30s
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout client          32s
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout server          32s
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-keep-alive 30s
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: listen listener
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     bind 169.254.169.254:80
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:     http-request add-header X-OVN-Network-ID 3673812c-f461-4e86-831f-b7a7821f4bda
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 02 10:03:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:21.846 160221 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda', 'env', 'PROCESS_TAG=haproxy-3673812c-f461-4e86-831f-b7a7821f4bda', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3673812c-f461-4e86-831f-b7a7821f4bda.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 02 10:03:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:21.874 263406 INFO neutron.agent.dhcp.agent [None req-4f4b2c56-0164-4e20-86ef-6834063a9380 - - - - - -] DHCP configuration for ports {'e956d78f-d33b-49fb-a452-eaed9391e7d2'} is completed
Dec 02 10:03:22 np0005541913.localdomain podman[309269]: 
Dec 02 10:03:22 np0005541913.localdomain podman[309269]: 2025-12-02 10:03:22.219804635 +0000 UTC m=+0.075112997 container create 5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:03:22 np0005541913.localdomain systemd[1]: Started libpod-conmon-5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9.scope.
Dec 02 10:03:22 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:03:22 np0005541913.localdomain podman[309269]: 2025-12-02 10:03:22.183074889 +0000 UTC m=+0.038383281 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 10:03:22 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a803b833a5ff5c579638b33681dc94f55bac1f087c0b32e1bc859addffd561/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:03:22 np0005541913.localdomain podman[309269]: 2025-12-02 10:03:22.299069842 +0000 UTC m=+0.154378214 container init 5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:03:22 np0005541913.localdomain podman[309269]: 2025-12-02 10:03:22.315754609 +0000 UTC m=+0.171062981 container start 5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:03:22 np0005541913.localdomain neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda[309283]: [NOTICE]   (309287) : New worker (309289) forked
Dec 02 10:03:22 np0005541913.localdomain neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda[309283]: [NOTICE]   (309287) : Loading success.
Dec 02 10:03:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:22.911 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:23 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:03:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:03:23 np0005541913.localdomain podman[309298]: 2025-12-02 10:03:23.495829366 +0000 UTC m=+0.131160411 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 10:03:23 np0005541913.localdomain podman[309298]: 2025-12-02 10:03:23.536406345 +0000 UTC m=+0.171737440 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 10:03:23 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:03:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:23.748 281858 DEBUG nova.compute.manager [req-15163fd7-a138-43dd-9fd2-bf93b56ede75 req-b57662d3-f756-462f-a04f-bca3192e809a dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 10:03:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:23.749 281858 DEBUG oslo_concurrency.lockutils [req-15163fd7-a138-43dd-9fd2-bf93b56ede75 req-b57662d3-f756-462f-a04f-bca3192e809a dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:23.749 281858 DEBUG oslo_concurrency.lockutils [req-15163fd7-a138-43dd-9fd2-bf93b56ede75 req-b57662d3-f756-462f-a04f-bca3192e809a dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:23.750 281858 DEBUG oslo_concurrency.lockutils [req-15163fd7-a138-43dd-9fd2-bf93b56ede75 req-b57662d3-f756-462f-a04f-bca3192e809a dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:23.751 281858 DEBUG nova.compute.manager [req-15163fd7-a138-43dd-9fd2-bf93b56ede75 req-b57662d3-f756-462f-a04f-bca3192e809a dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] No waiting events found dispatching network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 10:03:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:23.751 281858 WARNING nova.compute.manager [req-15163fd7-a138-43dd-9fd2-bf93b56ede75 req-b57662d3-f756-462f-a04f-bca3192e809a dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received unexpected event network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 for instance with vm_state active and task_state None.
Dec 02 10:03:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e97 e97: 6 total, 6 up, 6 in
Dec 02 10:03:24 np0005541913.localdomain ceph-mon[298296]: pgmap v106: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 3.9 MiB/s wr, 121 op/s
Dec 02 10:03:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:24.076 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:24.898 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:25 np0005541913.localdomain ceph-mon[298296]: osdmap e97: 6 total, 6 up, 6 in
Dec 02 10:03:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:26 np0005541913.localdomain ceph-mon[298296]: pgmap v108: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.2 MiB/s wr, 74 op/s
Dec 02 10:03:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e98 e98: 6 total, 6 up, 6 in
Dec 02 10:03:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e99 e99: 6 total, 6 up, 6 in
Dec 02 10:03:27 np0005541913.localdomain ceph-mon[298296]: osdmap e98: 6 total, 6 up, 6 in
Dec 02 10:03:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:03:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:03:27 np0005541913.localdomain podman[309317]: 2025-12-02 10:03:27.450497178 +0000 UTC m=+0.090560251 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:03:27 np0005541913.localdomain podman[309317]: 2025-12-02 10:03:27.458468632 +0000 UTC m=+0.098531745 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:03:27 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:03:27 np0005541913.localdomain podman[309318]: 2025-12-02 10:03:27.506146111 +0000 UTC m=+0.144024746 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:03:27 np0005541913.localdomain podman[309318]: 2025-12-02 10:03:27.543022921 +0000 UTC m=+0.180901586 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:03:27 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:03:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:27.913 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:28 np0005541913.localdomain ceph-mon[298296]: pgmap v110: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 640 KiB/s rd, 42 KiB/s wr, 41 op/s
Dec 02 10:03:28 np0005541913.localdomain ceph-mon[298296]: osdmap e99: 6 total, 6 up, 6 in
Dec 02 10:03:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:29.790 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Check if temp file /var/lib/nova/instances/tmp6m2ihysk exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Dec 02 10:03:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:29.791 281858 DEBUG nova.compute.manager [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6m2ihysk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='63092ab0-9432-4c74-933e-e9d5428e6162',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Dec 02 10:03:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:29.902 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:30 np0005541913.localdomain ceph-mon[298296]: pgmap v112: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 7.8 MiB/s wr, 251 op/s
Dec 02 10:03:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e100 e100: 6 total, 6 up, 6 in
Dec 02 10:03:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:31 np0005541913.localdomain ceph-mon[298296]: osdmap e100: 6 total, 6 up, 6 in
Dec 02 10:03:31 np0005541913.localdomain ceph-mon[298296]: pgmap v114: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 7.8 MiB/s wr, 251 op/s
Dec 02 10:03:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:32.665 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:03:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:32.666 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:32.668 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:03:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:32.915 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:34 np0005541913.localdomain ceph-mon[298296]: pgmap v115: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 9.6 MiB/s rd, 6.9 MiB/s wr, 240 op/s
Dec 02 10:03:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:03:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:03:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:03:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:03:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:03:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:03:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:03:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:03:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:34.953 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:35 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1135731250' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:35.087 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:03:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:35.087 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:03:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:35.096 281858 INFO nova.compute.rpcapi [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Dec 02 10:03:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:35.096 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:03:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:35.296 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:03:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:03:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:03:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162127 "" "Go-http-client/1.1"
Dec 02 10:03:36 np0005541913.localdomain ceph-mon[298296]: pgmap v116: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 8.1 MiB/s rd, 5.8 MiB/s wr, 202 op/s
Dec 02 10:03:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:03:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 21144 "" "Go-http-client/1.1"
Dec 02 10:03:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:36Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8f:bb:bd 10.100.0.4
Dec 02 10:03:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:36Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8f:bb:bd 10.100.0.4
Dec 02 10:03:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:37Z|00103|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:03:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:37Z|00104|binding|INFO|Releasing lport ea045be8-e121-4ff5-bb82-2a757b7ce736 from this chassis (sb_readonly=0)
Dec 02 10:03:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:37Z|00105|binding|INFO|Releasing lport ba8757f7-1076-4bc0-8968-1084ffa48766 from this chassis (sb_readonly=0)
Dec 02 10:03:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:37.788 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:37.917 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:38 np0005541913.localdomain ceph-mon[298296]: pgmap v117: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 6.7 MiB/s rd, 4.8 MiB/s wr, 167 op/s
Dec 02 10:03:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:38.671 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:39 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/84271393' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:39.955 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.128 281858 DEBUG nova.compute.manager [req-82284151-90b9-4cda-944e-fe6cb016570e req-f0e02f47-5e48-434e-bdec-c75cdac73624 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-vif-unplugged-31de197b-ef56-4d2a-9fa2-293715a60004 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.129 281858 DEBUG oslo_concurrency.lockutils [req-82284151-90b9-4cda-944e-fe6cb016570e req-f0e02f47-5e48-434e-bdec-c75cdac73624 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.129 281858 DEBUG oslo_concurrency.lockutils [req-82284151-90b9-4cda-944e-fe6cb016570e req-f0e02f47-5e48-434e-bdec-c75cdac73624 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.130 281858 DEBUG oslo_concurrency.lockutils [req-82284151-90b9-4cda-944e-fe6cb016570e req-f0e02f47-5e48-434e-bdec-c75cdac73624 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.130 281858 DEBUG nova.compute.manager [req-82284151-90b9-4cda-944e-fe6cb016570e req-f0e02f47-5e48-434e-bdec-c75cdac73624 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] No waiting events found dispatching network-vif-unplugged-31de197b-ef56-4d2a-9fa2-293715a60004 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.131 281858 DEBUG nova.compute.manager [req-82284151-90b9-4cda-944e-fe6cb016570e req-f0e02f47-5e48-434e-bdec-c75cdac73624 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-vif-unplugged-31de197b-ef56-4d2a-9fa2-293715a60004 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 02 10:03:40 np0005541913.localdomain ceph-mon[298296]: pgmap v118: 177 pgs: 177 active+clean; 304 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 518 KiB/s rd, 2.6 MiB/s wr, 122 op/s
Dec 02 10:03:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:03:40 np0005541913.localdomain podman[309365]: 2025-12-02 10:03:40.465454069 +0000 UTC m=+0.098555796 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 10:03:40 np0005541913.localdomain podman[309365]: 2025-12-02 10:03:40.477354448 +0000 UTC m=+0.110456205 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:03:40 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:03:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:40.546 263406 INFO neutron.agent.linux.ip_lib [None req-d56a0314-fd8f-4bd2-ae84-88824e1313ec - - - - - -] Device tapae9b1151-59 cannot be used as it has no MAC address
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.571 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:40 np0005541913.localdomain kernel: device tapae9b1151-59 entered promiscuous mode
Dec 02 10:03:40 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669820.5774] manager: (tapae9b1151-59): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Dec 02 10:03:40 np0005541913.localdomain systemd-udevd[309395]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.579 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:40Z|00106|binding|INFO|Claiming lport ae9b1151-5912-406f-ae7b-9db37b471685 for this chassis.
Dec 02 10:03:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:40Z|00107|binding|INFO|ae9b1151-5912-406f-ae7b-9db37b471685: Claiming unknown
Dec 02 10:03:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:40.590 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-97ae066a-ecdb-4d1f-a021-787e342a02a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-97ae066a-ecdb-4d1f-a021-787e342a02a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc1edab5ae5d43f08b967b5bf594f8b5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5764aa57-a87d-4e3f-89b1-49a48ee4f883, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=ae9b1151-5912-406f-ae7b-9db37b471685) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:03:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:40.591 160221 INFO neutron.agent.ovn.metadata.agent [-] Port ae9b1151-5912-406f-ae7b-9db37b471685 in datapath 97ae066a-ecdb-4d1f-a021-787e342a02a4 bound to our chassis
Dec 02 10:03:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:40.592 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 97ae066a-ecdb-4d1f-a021-787e342a02a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:03:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:40.593 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[af19e651-a41c-4bed-b631-d522ab2033a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.613 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:40Z|00108|binding|INFO|Setting lport ae9b1151-5912-406f-ae7b-9db37b471685 ovn-installed in OVS
Dec 02 10:03:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:40Z|00109|binding|INFO|Setting lport ae9b1151-5912-406f-ae7b-9db37b471685 up in Southbound
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.618 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.647 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:40.673 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:40 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:41.103 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:41 np0005541913.localdomain podman[309451]: 
Dec 02 10:03:41 np0005541913.localdomain podman[309451]: 2025-12-02 10:03:41.554948535 +0000 UTC m=+0.069391673 container create 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:03:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d.scope.
Dec 02 10:03:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:03:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f69386878a877368468586813b3dbb1937ee49b0390efbec5dd7e4f609902381/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:03:41 np0005541913.localdomain podman[309451]: 2025-12-02 10:03:41.521309272 +0000 UTC m=+0.035752450 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:03:41 np0005541913.localdomain podman[309451]: 2025-12-02 10:03:41.629590977 +0000 UTC m=+0.144034195 container init 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:03:41 np0005541913.localdomain podman[309451]: 2025-12-02 10:03:41.639825282 +0000 UTC m=+0.154268490 container start 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:03:41 np0005541913.localdomain dnsmasq[309469]: started, version 2.85 cachesize 150
Dec 02 10:03:41 np0005541913.localdomain dnsmasq[309469]: DNS service limited to local subnets
Dec 02 10:03:41 np0005541913.localdomain dnsmasq[309469]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:03:41 np0005541913.localdomain dnsmasq[309469]: warning: no upstream servers configured
Dec 02 10:03:41 np0005541913.localdomain dnsmasq-dhcp[309469]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:03:41 np0005541913.localdomain dnsmasq[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/addn_hosts - 0 addresses
Dec 02 10:03:41 np0005541913.localdomain dnsmasq-dhcp[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/host
Dec 02 10:03:41 np0005541913.localdomain dnsmasq-dhcp[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/opts
Dec 02 10:03:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:41.762 263406 INFO neutron.agent.dhcp.agent [None req-4c7df868-1e59-42ac-b35b-a3680cf97ca5 - - - - - -] DHCP configuration for ports {'a5732653-8ec3-490d-92c0-40205764cb6c'} is completed
Dec 02 10:03:41 np0005541913.localdomain ceph-mon[298296]: pgmap v119: 177 pgs: 177 active+clean; 304 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 509 KiB/s rd, 2.5 MiB/s wr, 120 op/s
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.072 281858 INFO nova.compute.manager [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Took 6.98 seconds for pre_live_migration on destination host np0005541914.localdomain.
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.072 281858 DEBUG nova.compute.manager [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.098 281858 DEBUG nova.compute.manager [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6m2ihysk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='63092ab0-9432-4c74-933e-e9d5428e6162',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(899d75d5-bebe-4551-8a0f-b0309584472e),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.102 281858 DEBUG nova.objects.instance [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lazy-loading 'migration_context' on Instance uuid 63092ab0-9432-4c74-933e-e9d5428e6162 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.104 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.106 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.106 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.125 281858 DEBUG nova.virt.libvirt.vif [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-02T10:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-861747463',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005541913.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-861747463',id=7,image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T10:03:21Z,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005541913.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccbafb2e3c343b2aab51714734bddce',ramdisk_id='',reservation_id='r-sf2jj0i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-5814605',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-5814605-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T10:03:21Z,user_data=None,user_id='60f523e6d03743daa3ff6f5bc7122d00',uuid=63092ab0-9432-4c74-933e-e9d5428e6162,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.126 281858 DEBUG nova.network.os_vif_util [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Converting VIF {"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.127 281858 DEBUG nova.network.os_vif_util [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:bb:bd,bridge_name='br-int',has_traffic_filtering=True,id=31de197b-ef56-4d2a-9fa2-293715a60004,network=Network(62df5f27-c8d9-4d79-9ad6-2f32e63bf47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap31de197b-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.128 281858 DEBUG nova.virt.libvirt.migration [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Updating guest XML with vif config: <interface type="ethernet">
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]:   <mac address="fa:16:3e:8f:bb:bd"/>
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]:   <model type="virtio"/>
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]:   <driver name="vhost" rx_queue_size="512"/>
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]:   <mtu size="1442"/>
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]:   <target dev="tap31de197b-ef"/>
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: </interface>
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.129 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.173 281858 DEBUG nova.compute.manager [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.174 281858 DEBUG oslo_concurrency.lockutils [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.174 281858 DEBUG oslo_concurrency.lockutils [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.175 281858 DEBUG oslo_concurrency.lockutils [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.175 281858 DEBUG nova.compute.manager [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] No waiting events found dispatching network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.176 281858 WARNING nova.compute.manager [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received unexpected event network-vif-plugged-31de197b-ef56-4d2a-9fa2-293715a60004 for instance with vm_state active and task_state migrating.
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.176 281858 DEBUG nova.compute.manager [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-changed-31de197b-ef56-4d2a-9fa2-293715a60004 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.177 281858 DEBUG nova.compute.manager [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Refreshing instance network info cache due to event network-changed-31de197b-ef56-4d2a-9fa2-293715a60004. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.177 281858 DEBUG oslo_concurrency.lockutils [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.178 281858 DEBUG oslo_concurrency.lockutils [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquired lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.178 281858 DEBUG nova.network.neutron [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Refreshing network info cache for port 31de197b-ef56-4d2a-9fa2-293715a60004 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 02 10:03:42 np0005541913.localdomain systemd[1]: tmp-crun.COZjxR.mount: Deactivated successfully.
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.609 281858 DEBUG nova.virt.libvirt.migration [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.610 281858 INFO nova.virt.libvirt.migration [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.714 281858 INFO nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 02 10:03:42 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2234913475' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:03:42 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/91978558' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.879 281858 DEBUG nova.network.neutron [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Updated VIF entry in instance network info cache for port 31de197b-ef56-4d2a-9fa2-293715a60004. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.880 281858 DEBUG nova.network.neutron [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Updating instance_info_cache with network_info: [{"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "np0005541914.localdomain"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.902 281858 DEBUG oslo_concurrency.lockutils [req-6f4549b1-77b1-4e2e-936e-d8b6b9611067 req-eba3de69-6cf9-4e7e-b7d3-2c46efce0508 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Releasing lock "refresh_cache-63092ab0-9432-4c74-933e-e9d5428e6162" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:03:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:42.920 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.218 281858 DEBUG nova.virt.libvirt.migration [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.219 281858 DEBUG nova.virt.libvirt.migration [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.724 281858 DEBUG nova.virt.libvirt.migration [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.725 281858 DEBUG nova.virt.libvirt.migration [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.786 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764669823.7865458, 63092ab0-9432-4c74-933e-e9d5428e6162 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.787 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] VM Paused (Lifecycle Event)
Dec 02 10:03:43 np0005541913.localdomain ceph-mon[298296]: pgmap v120: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 4.3 MiB/s rd, 6.0 MiB/s wr, 178 op/s
Dec 02 10:03:43 np0005541913.localdomain kernel: device tap31de197b-ef left promiscuous mode
Dec 02 10:03:43 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669823.9418] device (tap31de197b-ef): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Dec 02 10:03:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:43Z|00110|binding|INFO|Releasing lport 31de197b-ef56-4d2a-9fa2-293715a60004 from this chassis (sb_readonly=0)
Dec 02 10:03:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:43Z|00111|binding|INFO|Setting lport 31de197b-ef56-4d2a-9fa2-293715a60004 down in Southbound
Dec 02 10:03:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:43Z|00112|binding|INFO|Releasing lport 40590dd1-9250-4409-a2d0-cd4f4774bfc8 from this chassis (sb_readonly=0)
Dec 02 10:03:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:43Z|00113|binding|INFO|Setting lport 40590dd1-9250-4409-a2d0-cd4f4774bfc8 down in Southbound
Dec 02 10:03:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:43Z|00114|binding|INFO|Removing iface tap31de197b-ef ovn-installed in OVS
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.955 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.959 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:43.978 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:43 np0005541913.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 02 10:03:43 np0005541913.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 15.208s CPU time.
Dec 02 10:03:44 np0005541913.localdomain systemd-machined[84262]: Machine qemu-3-instance-00000007 terminated.
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.013 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:03:44 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:44Z|00115|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:03:44 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:44Z|00116|binding|INFO|Releasing lport ea045be8-e121-4ff5-bb82-2a757b7ce736 from this chassis (sb_readonly=0)
Dec 02 10:03:44 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:44Z|00117|binding|INFO|Releasing lport ba8757f7-1076-4bc0-8968-1084ffa48766 from this chassis (sb_readonly=0)
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.017 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:bb:bd 10.100.0.4'], port_security=['fa:16:3e:8f:bb:bd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain,np0005541914.localdomain', 'activation-strategy': 'rarp', 'additional-chassis-activated': '515e0717-8baa-40e6-ac30-5fb148626504'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-17247491', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '63092ab0-9432-4c74-933e-e9d5428e6162', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-17247491', 'neutron:project_id': 'cccbafb2e3c343b2aab51714734bddce', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5c93e274-85ac-42d3-b949-bdb62e6b8c39', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c5273a4-e474-4c2c-a95a-a522e1a174bd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=31de197b-ef56-4d2a-9fa2-293715a60004) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.019 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:01:78 19.80.0.123'], port_security=['fa:16:3e:51:01:78 19.80.0.123'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['31de197b-ef56-4d2a-9fa2-293715a60004'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1284966936', 'neutron:cidrs': '19.80.0.123/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3673812c-f461-4e86-831f-b7a7821f4bda', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1284966936', 'neutron:project_id': 'cccbafb2e3c343b2aab51714734bddce', 'neutron:revision_number': '3', 'neutron:security_group_ids': '5c93e274-85ac-42d3-b949-bdb62e6b8c39', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=23ebc33b-05e4-4907-9bc1-7e563b7692f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=40590dd1-9250-4409-a2d0-cd4f4774bfc8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.021 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 31de197b-ef56-4d2a-9fa2-293715a60004 in datapath 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f unbound from our chassis
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.023 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port b22990f2-0db4-407c-a5b6-65e7991152d1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.023 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.024 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ecad38a3-d08d-4dd4-877e-1d9f5c2cec85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.025 160221 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f namespace which is not needed anymore
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.059 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain virtqemud[203664]: cannot parse process status data
Dec 02 10:03:44 np0005541913.localdomain virtqemud[203664]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/63092ab0-9432-4c74-933e-e9d5428e6162_disk: No such file or directory
Dec 02 10:03:44 np0005541913.localdomain virtqemud[203664]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/63092ab0-9432-4c74-933e-e9d5428e6162_disk: No such file or directory
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.103 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.109 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.131 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.133 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.134 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f[309172]: [NOTICE]   (309185) : haproxy version is 2.8.14-c23fe91
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f[309172]: [NOTICE]   (309185) : path to executable is /usr/sbin/haproxy
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f[309172]: [WARNING]  (309185) : Exiting Master process...
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f[309172]: [WARNING]  (309185) : Exiting Master process...
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.230 281858 DEBUG nova.virt.libvirt.guest [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '63092ab0-9432-4c74-933e-e9d5428e6162' (instance-00000007) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.232 281858 INFO nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Migration operation has completed
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.232 281858 INFO nova.compute.manager [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] _post_live_migration() is started..
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f[309172]: [ALERT]    (309185) : Current worker (309190) exited with code 143 (Terminated)
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f[309172]: [WARNING]  (309185) : All workers exited. Exiting... (0)
Dec 02 10:03:44 np0005541913.localdomain systemd[1]: libpod-fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03.scope: Deactivated successfully.
Dec 02 10:03:44 np0005541913.localdomain podman[309507]: 2025-12-02 10:03:44.240000298 +0000 UTC m=+0.081774095 container died fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:03:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03-userdata-shm.mount: Deactivated successfully.
Dec 02 10:03:44 np0005541913.localdomain podman[309507]: 2025-12-02 10:03:44.343482375 +0000 UTC m=+0.185256142 container cleanup fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:03:44 np0005541913.localdomain podman[309521]: 2025-12-02 10:03:44.355576189 +0000 UTC m=+0.105442960 container cleanup fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:03:44 np0005541913.localdomain systemd[1]: libpod-conmon-fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03.scope: Deactivated successfully.
Dec 02 10:03:44 np0005541913.localdomain podman[309538]: 2025-12-02 10:03:44.438987407 +0000 UTC m=+0.076277807 container remove fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.443 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[1b76e555-aa6f-4314-bc20-0d09aff10df4]: (4, ('Tue Dec  2 10:03:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f (fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03)\nfd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03\nTue Dec  2 10:03:44 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f (fd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03)\nfd5e78e90e94a2686f4ae163d9f6dd4308c2dd49de37db63e43a9097c2945e03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.445 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[2ab0960b-e91f-48cd-9c4c-a9041e8030c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.446 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62df5f27-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.450 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain kernel: device tap62df5f27-c0 left promiscuous mode
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.460 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.463 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc5e43e-6305-4199-85b6-70517aa35c0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.481 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6274bafd-4027-4203-aa7a-b1ae288b84fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.482 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[86a316e6-d5a4-465c-b358-d07116563b34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.501 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[aac0fc88-39c8-4502-b8ff-9b244f28504d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1196272, 'reachable_time': 25541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309558, 'error': None, 'target': 'ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.504 160371 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.504 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[ee81ca3b-eb9c-465e-96d5-d4f02f31194e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.505 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 40590dd1-9250-4409-a2d0-cd4f4774bfc8 in datapath 3673812c-f461-4e86-831f-b7a7821f4bda unbound from our chassis
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.511 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 50e76764-b6f4-47d9-9fe0-99e7b5813c75 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.512 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3673812c-f461-4e86-831f-b7a7821f4bda, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.513 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8dd957-b8b6-46e9-a62a-91c95eb94abf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.513 160221 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda namespace which is not needed anymore
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda[309283]: [NOTICE]   (309287) : haproxy version is 2.8.14-c23fe91
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda[309283]: [NOTICE]   (309287) : path to executable is /usr/sbin/haproxy
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda[309283]: [WARNING]  (309287) : Exiting Master process...
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda[309283]: [WARNING]  (309287) : Exiting Master process...
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda[309283]: [ALERT]    (309287) : Current worker (309289) exited with code 143 (Terminated)
Dec 02 10:03:44 np0005541913.localdomain neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda[309283]: [WARNING]  (309287) : All workers exited. Exiting... (0)
Dec 02 10:03:44 np0005541913.localdomain systemd[1]: libpod-5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9.scope: Deactivated successfully.
Dec 02 10:03:44 np0005541913.localdomain podman[309576]: 2025-12-02 10:03:44.70627115 +0000 UTC m=+0.077480640 container died 5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:03:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.720 281858 DEBUG nova.compute.manager [req-da1fcdb2-c18f-4ab1-a3bf-a1162073ecf2 req-ab0eabc9-9457-445f-97d2-62861d27e746 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-vif-unplugged-31de197b-ef56-4d2a-9fa2-293715a60004 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.721 281858 DEBUG oslo_concurrency.lockutils [req-da1fcdb2-c18f-4ab1-a3bf-a1162073ecf2 req-ab0eabc9-9457-445f-97d2-62861d27e746 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.722 281858 DEBUG oslo_concurrency.lockutils [req-da1fcdb2-c18f-4ab1-a3bf-a1162073ecf2 req-ab0eabc9-9457-445f-97d2-62861d27e746 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.722 281858 DEBUG oslo_concurrency.lockutils [req-da1fcdb2-c18f-4ab1-a3bf-a1162073ecf2 req-ab0eabc9-9457-445f-97d2-62861d27e746 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.722 281858 DEBUG nova.compute.manager [req-da1fcdb2-c18f-4ab1-a3bf-a1162073ecf2 req-ab0eabc9-9457-445f-97d2-62861d27e746 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] No waiting events found dispatching network-vif-unplugged-31de197b-ef56-4d2a-9fa2-293715a60004 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.723 281858 DEBUG nova.compute.manager [req-da1fcdb2-c18f-4ab1-a3bf-a1162073ecf2 req-ab0eabc9-9457-445f-97d2-62861d27e746 dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Received event network-vif-unplugged-31de197b-ef56-4d2a-9fa2-293715a60004 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 02 10:03:44 np0005541913.localdomain podman[309576]: 2025-12-02 10:03:44.738833394 +0000 UTC m=+0.110042814 container cleanup 5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:03:44 np0005541913.localdomain podman[309610]: 2025-12-02 10:03:44.829320052 +0000 UTC m=+0.067842892 container remove 5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.833 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[2415a355-e0ce-459d-8e0d-0f4ffb4beaa4]: (4, ('Tue Dec  2 10:03:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda (5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9)\n5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9\nTue Dec  2 10:03:44 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda (5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9)\n5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.835 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6a9070-27cd-44e3-a152-cd22912d64ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.837 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3673812c-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.863 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:44 np0005541913.localdomain systemd[1]: libpod-conmon-5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9.scope: Deactivated successfully.
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.882 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain kernel: device tap3673812c-f0 left promiscuous mode
Dec 02 10:03:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e101 e101: 6 total, 6 up, 6 in
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.892 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.896 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[4dcaea61-7b56-4a01-b59f-ca2c5cc9655d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.919 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[cacb33c3-9535-4be7-9564-8ee955daa90e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.920 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[3e8bf485-9ac6-4eab-aa10-dc2b3a29da80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain podman[309592]: 2025-12-02 10:03:44.809646905 +0000 UTC m=+0.083161863 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.937 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[e488d0e0-c916-4881-b96a-d55e1af92e75]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1196376, 'reachable_time': 23116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309640, 'error': None, 'target': 'ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.939 160371 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3673812c-f461-4e86-831f-b7a7821f4bda deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 02 10:03:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:03:44.939 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[faf5c316-2475-49ed-96d2-f908a797a235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.953 281858 DEBUG nova.network.neutron [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Activated binding for port 31de197b-ef56-4d2a-9fa2-293715a60004 and host np0005541914.localdomain migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.954 281858 DEBUG nova.compute.manager [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.955 281858 DEBUG nova.virt.libvirt.vif [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-02T10:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-861747463',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005541913.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-861747463',id=7,image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T10:03:21Z,launched_on='np0005541913.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005541913.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccbafb2e3c343b2aab51714734bddce',ramdisk_id='',reservation_id='r-sf2jj0i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-5814605',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-5814605-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T10:03:23Z,user_data=None,user_id='60f523e6d03743daa3ff6f5bc7122d00',uuid=63092ab0-9432-4c74-933e-e9d5428e6162,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.955 281858 DEBUG nova.network.os_vif_util [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Converting VIF {"id": "31de197b-ef56-4d2a-9fa2-293715a60004", "address": "fa:16:3e:8f:bb:bd", "network": {"id": "62df5f27-c8d9-4d79-9ad6-2f32e63bf47f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-307256986-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "cccbafb2e3c343b2aab51714734bddce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31de197b-ef", "ovs_interfaceid": "31de197b-ef56-4d2a-9fa2-293715a60004", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.956 281858 DEBUG nova.network.os_vif_util [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:bb:bd,bridge_name='br-int',has_traffic_filtering=True,id=31de197b-ef56-4d2a-9fa2-293715a60004,network=Network(62df5f27-c8d9-4d79-9ad6-2f32e63bf47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap31de197b-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.956 281858 DEBUG os_vif [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:bb:bd,bridge_name='br-int',has_traffic_filtering=True,id=31de197b-ef56-4d2a-9fa2-293715a60004,network=Network(62df5f27-c8d9-4d79-9ad6-2f32e63bf47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap31de197b-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.958 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain podman[309592]: 2025-12-02 10:03:44.959146236 +0000 UTC m=+0.232661304 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.960 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31de197b-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.961 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.962 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.966 281858 INFO os_vif [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:bb:bd,bridge_name='br-int',has_traffic_filtering=True,id=31de197b-ef56-4d2a-9fa2-293715a60004,network=Network(62df5f27-c8d9-4d79-9ad6-2f32e63bf47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap31de197b-ef')
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.966 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.966 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.967 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.967 281858 DEBUG nova.compute.manager [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.967 281858 INFO nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Deleting instance files /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162_del
Dec 02 10:03:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:44.968 281858 INFO nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Deletion of /var/lib/nova/instances/63092ab0-9432-4c74-933e-e9d5428e6162_del complete
Dec 02 10:03:44 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:03:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-73a803b833a5ff5c579638b33681dc94f55bac1f087c0b32e1bc859addffd561-merged.mount: Deactivated successfully.
Dec 02 10:03:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5886c88cd844d5d769261d8f712fc0c863050130a274673349a42b1db3f379f9-userdata-shm.mount: Deactivated successfully.
Dec 02 10:03:45 np0005541913.localdomain systemd[1]: run-netns-ovnmeta\x2d3673812c\x2df461\x2d4e86\x2d831f\x2db7a7821f4bda.mount: Deactivated successfully.
Dec 02 10:03:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-88e44ad48351df4bc5cf9273b4853724ba68f5d6925b7196bceece1b80907f57-merged.mount: Deactivated successfully.
Dec 02 10:03:45 np0005541913.localdomain systemd[1]: run-netns-ovnmeta\x2d62df5f27\x2dc8d9\x2d4d79\x2d9ad6\x2d2f32e63bf47f.mount: Deactivated successfully.
Dec 02 10:03:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:45.612 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:45 np0005541913.localdomain ceph-mon[298296]: osdmap e101: 6 total, 6 up, 6 in
Dec 02 10:03:45 np0005541913.localdomain ceph-mon[298296]: pgmap v122: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 7.2 MiB/s wr, 203 op/s
Dec 02 10:03:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:45.949 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:03:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:03:47 np0005541913.localdomain podman[309641]: 2025-12-02 10:03:47.447792627 +0000 UTC m=+0.089614698 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Dec 02 10:03:47 np0005541913.localdomain systemd[1]: tmp-crun.8nzOeJ.mount: Deactivated successfully.
Dec 02 10:03:47 np0005541913.localdomain podman[309642]: 2025-12-02 10:03:47.50842044 +0000 UTC m=+0.147371994 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:03:47 np0005541913.localdomain podman[309641]: 2025-12-02 10:03:47.517728298 +0000 UTC m=+0.159550419 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 10:03:47 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:03:47 np0005541913.localdomain podman[309642]: 2025-12-02 10:03:47.569958936 +0000 UTC m=+0.208910460 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:03:47 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:03:48 np0005541913.localdomain ceph-mon[298296]: pgmap v123: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 7.2 MiB/s wr, 203 op/s
Dec 02 10:03:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:48.437 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:03:47Z, description=, device_id=88a5a4f4-0c8e-40f7-81a0-9e11da229be3, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a61310>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a61e20>], id=899a6997-58ef-4285-95b3-238237010220, ip_allocation=immediate, mac_address=fa:16:3e:3f:26:a6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:03:36Z, description=, dns_domain=, id=97ae066a-ecdb-4d1f-a021-787e342a02a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-624382811-network, port_security_enabled=True, project_id=dc1edab5ae5d43f08b967b5bf594f8b5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52168, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=452, status=ACTIVE, subnets=['1815dd23-acbf-4703-8ca8-599f5aab162a'], tags=[], tenant_id=dc1edab5ae5d43f08b967b5bf594f8b5, updated_at=2025-12-02T10:03:39Z, vlan_transparent=None, network_id=97ae066a-ecdb-4d1f-a021-787e342a02a4, port_security_enabled=False, project_id=dc1edab5ae5d43f08b967b5bf594f8b5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=490, status=DOWN, tags=[], tenant_id=dc1edab5ae5d43f08b967b5bf594f8b5, updated_at=2025-12-02T10:03:47Z on network 97ae066a-ecdb-4d1f-a021-787e342a02a4
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.507 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Acquiring lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.507 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.507 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lock "63092ab0-9432-4c74-933e-e9d5428e6162-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.530 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.531 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.531 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.531 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.532 281858 DEBUG oslo_concurrency.processutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:48 np0005541913.localdomain dnsmasq[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/addn_hosts - 1 addresses
Dec 02 10:03:48 np0005541913.localdomain podman[309703]: 2025-12-02 10:03:48.698422176 +0000 UTC m=+0.061828455 container kill 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:03:48 np0005541913.localdomain dnsmasq-dhcp[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/host
Dec 02 10:03:48 np0005541913.localdomain dnsmasq-dhcp[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/opts
Dec 02 10:03:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:48Z|00118|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:03:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:48.873 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 10:03:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 10:03:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 10:03:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:48.977 263406 INFO neutron.agent.dhcp.agent [None req-16ee785d-a3c7-43fa-afef-65c35f7f671e - - - - - -] DHCP configuration for ports {'899a6997-58ef-4285-95b3-238237010220'} is completed
Dec 02 10:03:49 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:03:49 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1033273436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.095 281858 DEBUG oslo_concurrency.processutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.563s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.182 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.183 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.387 281858 WARNING nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.389 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11321MB free_disk=41.563968658447266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.390 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.390 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.444 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Migration for instance 63092ab0-9432-4c74-933e-e9d5428e6162 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.477 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.505 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.506 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Migration 899d75d5-bebe-4551-8a0f-b0309584472e is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.506 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.507 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.563 281858 DEBUG oslo_concurrency.processutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:49.961 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:49 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:03:49 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3031907273' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:50.004 281858 DEBUG oslo_concurrency.processutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:50.011 281858 DEBUG nova.compute.provider_tree [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:03:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:50.035 281858 DEBUG nova.scheduler.client.report [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:03:50 np0005541913.localdomain ceph-mon[298296]: pgmap v124: 177 pgs: 177 active+clean; 304 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 214 op/s
Dec 02 10:03:50 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1033273436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:50 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3031907273' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:50.066 281858 DEBUG nova.compute.resource_tracker [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:03:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:50.067 281858 DEBUG oslo_concurrency.lockutils [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:50.075 281858 INFO nova.compute.manager [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Migrating instance to np0005541914.localdomain finished successfully.
Dec 02 10:03:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:50.174 281858 INFO nova.scheduler.client.report [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] Deleted allocation for migration 899d75d5-bebe-4551-8a0f-b0309584472e
Dec 02 10:03:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:50.174 281858 DEBUG nova.virt.libvirt.driver [None req-a1f7258b-8365-4eb2-997c-eb7bece0a428 0f34e0319cfd4e2680d0e40bb8d8500f dfb2b4e8d0aa49b0b34376cadc0ea911 - - default default] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Dec 02 10:03:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e102 e102: 6 total, 6 up, 6 in
Dec 02 10:03:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:51.199 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:03:47Z, description=, device_id=88a5a4f4-0c8e-40f7-81a0-9e11da229be3, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a05850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a05a60>], id=899a6997-58ef-4285-95b3-238237010220, ip_allocation=immediate, mac_address=fa:16:3e:3f:26:a6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:03:36Z, description=, dns_domain=, id=97ae066a-ecdb-4d1f-a021-787e342a02a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-624382811-network, port_security_enabled=True, project_id=dc1edab5ae5d43f08b967b5bf594f8b5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52168, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=452, status=ACTIVE, subnets=['1815dd23-acbf-4703-8ca8-599f5aab162a'], tags=[], tenant_id=dc1edab5ae5d43f08b967b5bf594f8b5, updated_at=2025-12-02T10:03:39Z, vlan_transparent=None, network_id=97ae066a-ecdb-4d1f-a021-787e342a02a4, port_security_enabled=False, project_id=dc1edab5ae5d43f08b967b5bf594f8b5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=490, status=DOWN, tags=[], tenant_id=dc1edab5ae5d43f08b967b5bf594f8b5, updated_at=2025-12-02T10:03:47Z on network 97ae066a-ecdb-4d1f-a021-787e342a02a4
Dec 02 10:03:51 np0005541913.localdomain dnsmasq[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/addn_hosts - 1 addresses
Dec 02 10:03:51 np0005541913.localdomain dnsmasq-dhcp[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/host
Dec 02 10:03:51 np0005541913.localdomain dnsmasq-dhcp[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/opts
Dec 02 10:03:51 np0005541913.localdomain podman[309783]: 2025-12-02 10:03:51.449818354 +0000 UTC m=+0.072656875 container kill 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:03:51 np0005541913.localdomain systemd[1]: tmp-crun.hH2vOD.mount: Deactivated successfully.
Dec 02 10:03:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:51.653 263406 INFO neutron.agent.dhcp.agent [None req-13cd270d-62df-4f51-a2c9-1aec4c2acaaf - - - - - -] DHCP configuration for ports {'899a6997-58ef-4285-95b3-238237010220'} is completed
Dec 02 10:03:51 np0005541913.localdomain ceph-mon[298296]: osdmap e102: 6 total, 6 up, 6 in
Dec 02 10:03:51 np0005541913.localdomain ceph-mon[298296]: pgmap v126: 177 pgs: 177 active+clean; 304 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 20 KiB/s wr, 152 op/s
Dec 02 10:03:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:52.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:52.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:03:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:52.830 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:03:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:53.032 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:03:03Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a000d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00a60>], id=31de197b-ef56-4d2a-9fa2-293715a60004, ip_allocation=immediate, mac_address=fa:16:3e:8f:bb:bd, name=tempest-parent-17247491, network_id=62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, port_security_enabled=True, project_id=cccbafb2e3c343b2aab51714734bddce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=14, security_groups=['5c93e274-85ac-42d3-b949-bdb62e6b8c39'], standard_attr_id=324, status=DOWN, tags=[], tenant_id=cccbafb2e3c343b2aab51714734bddce, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00a30>], trunk_id=5b1dd84a-69f3-4e17-8604-49965c03b89c, updated_at=2025-12-02T10:03:51Z on network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f
Dec 02 10:03:53 np0005541913.localdomain dnsmasq[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/addn_hosts - 2 addresses
Dec 02 10:03:53 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/host
Dec 02 10:03:53 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/opts
Dec 02 10:03:53 np0005541913.localdomain podman[309821]: 2025-12-02 10:03:53.282696968 +0000 UTC m=+0.066556352 container kill 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:03:53 np0005541913.localdomain systemd[1]: tmp-crun.491ZLY.mount: Deactivated successfully.
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.345 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.346 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.346 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.347 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:03:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:03:53.480 263406 INFO neutron.agent.dhcp.agent [None req-bd4c171a-2e18-4988-bba5-0bc844afee44 - - - - - -] DHCP configuration for ports {'31de197b-ef56-4d2a-9fa2-293715a60004'} is completed
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.908 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.927 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.927 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.928 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.929 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.930 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.952 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.952 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.953 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.953 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:03:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:53.954 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:54 np0005541913.localdomain ceph-mon[298296]: pgmap v127: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 22 KiB/s wr, 192 op/s
Dec 02 10:03:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:03:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:03:54 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2587233418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.424 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:54 np0005541913.localdomain podman[309864]: 2025-12-02 10:03:54.461679429 +0000 UTC m=+0.098629920 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:03:54 np0005541913.localdomain podman[309864]: 2025-12-02 10:03:54.476201697 +0000 UTC m=+0.113152238 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:03:54 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.500 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.501 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.743 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.745 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11311MB free_disk=41.7004280090332GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.746 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.747 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.830 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.831 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.831 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.882 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:03:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:54.965 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:55 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2587233418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:03:55 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2649201936' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:55.397 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:03:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:55.403 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:03:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:55.498 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:03:55 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:03:55.498 2 INFO neutron.agent.securitygroups_rpc [None req-e52c4e8f-c1be-4de8-b00c-43719449fd5b 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Security group member updated ['5c93e274-85ac-42d3-b949-bdb62e6b8c39']
Dec 02 10:03:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:55.501 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:03:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:55.501 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:03:55 np0005541913.localdomain dnsmasq[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/addn_hosts - 0 addresses
Dec 02 10:03:55 np0005541913.localdomain dnsmasq-dhcp[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/host
Dec 02 10:03:55 np0005541913.localdomain dnsmasq-dhcp[308334]: read /var/lib/neutron/dhcp/3673812c-f461-4e86-831f-b7a7821f4bda/opts
Dec 02 10:03:55 np0005541913.localdomain podman[309925]: 2025-12-02 10:03:55.802035047 +0000 UTC m=+0.069359766 container kill 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:03:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:03:56 np0005541913.localdomain ceph-mon[298296]: pgmap v128: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 18 KiB/s wr, 155 op/s
Dec 02 10:03:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2649201936' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:03:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:57.401 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:57.490 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:57.912 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:58 np0005541913.localdomain ceph-mon[298296]: pgmap v129: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 18 KiB/s wr, 155 op/s
Dec 02 10:03:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:03:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:03:58 np0005541913.localdomain podman[309947]: 2025-12-02 10:03:58.446853875 +0000 UTC m=+0.087089882 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 10:03:58 np0005541913.localdomain podman[309947]: 2025-12-02 10:03:58.486060504 +0000 UTC m=+0.126296561 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:03:58 np0005541913.localdomain systemd[1]: tmp-crun.uyxf4D.mount: Deactivated successfully.
Dec 02 10:03:58 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:03:58 np0005541913.localdomain podman[309948]: 2025-12-02 10:03:58.517011982 +0000 UTC m=+0.152281836 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 10:03:58 np0005541913.localdomain podman[309948]: 2025-12-02 10:03:58.598134412 +0000 UTC m=+0.233404246 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:03:58 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:03:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:58.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:59.120 281858 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764669824.1196544, 63092ab0-9432-4c74-933e-e9d5428e6162 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:03:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:59.121 281858 INFO nova.compute.manager [-] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] VM Stopped (Lifecycle Event)
Dec 02 10:03:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:59.235 281858 DEBUG nova.compute.manager [None req-c26b12f1-728d-4822-85f2-643a4a363367 - - - - - -] [instance: 63092ab0-9432-4c74-933e-e9d5428e6162] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:03:59 np0005541913.localdomain dnsmasq[308334]: exiting on receipt of SIGTERM
Dec 02 10:03:59 np0005541913.localdomain podman[310015]: 2025-12-02 10:03:59.658993532 +0000 UTC m=+0.055577198 container kill 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:03:59 np0005541913.localdomain systemd[1]: libpod-1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2.scope: Deactivated successfully.
Dec 02 10:03:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:59Z|00119|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:03:59 np0005541913.localdomain podman[310030]: 2025-12-02 10:03:59.735461609 +0000 UTC m=+0.065489574 container died 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:03:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:59.756 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2-userdata-shm.mount: Deactivated successfully.
Dec 02 10:03:59 np0005541913.localdomain podman[310030]: 2025-12-02 10:03:59.779859066 +0000 UTC m=+0.109887001 container cleanup 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:03:59 np0005541913.localdomain systemd[1]: libpod-conmon-1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2.scope: Deactivated successfully.
Dec 02 10:03:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:59.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:03:59 np0005541913.localdomain podman[310037]: 2025-12-02 10:03:59.856715322 +0000 UTC m=+0.175040674 container remove 1b22bb38a40c045fc2a47645e1a1a7cec84234f360093d901ddb7a461f8e88d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3673812c-f461-4e86-831f-b7a7821f4bda, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 10:03:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:59.867 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:59Z|00120|binding|INFO|Releasing lport 07dfafb4-0984-469d-a49c-9faf3746b302 from this chassis (sb_readonly=0)
Dec 02 10:03:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:03:59Z|00121|binding|INFO|Setting lport 07dfafb4-0984-469d-a49c-9faf3746b302 down in Southbound
Dec 02 10:03:59 np0005541913.localdomain kernel: device tap07dfafb4-09 left promiscuous mode
Dec 02 10:03:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:59.884 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:03:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:03:59.971 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:00.112 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-3673812c-f461-4e86-831f-b7a7821f4bda', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3673812c-f461-4e86-831f-b7a7821f4bda', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccbafb2e3c343b2aab51714734bddce', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23ebc33b-05e4-4907-9bc1-7e563b7692f1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=07dfafb4-0984-469d-a49c-9faf3746b302) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:00.114 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 07dfafb4-0984-469d-a49c-9faf3746b302 in datapath 3673812c-f461-4e86-831f-b7a7821f4bda unbound from our chassis
Dec 02 10:04:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:00.118 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3673812c-f461-4e86-831f-b7a7821f4bda, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:04:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:00.119 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[7895fb35-6480-4d70-b80d-13d1ccb84199]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e103 e103: 6 total, 6 up, 6 in
Dec 02 10:04:00 np0005541913.localdomain ceph-mon[298296]: pgmap v130: 177 pgs: 177 active+clean; 226 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 659 KiB/s rd, 37 KiB/s wr, 89 op/s
Dec 02 10:04:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7b24a22bfd2247520c320aa8b36a4cd59aff7c93df00851a3bdf42877c37d8eb-merged.mount: Deactivated successfully.
Dec 02 10:04:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:00.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:04:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:01 np0005541913.localdomain ceph-mon[298296]: osdmap e103: 6 total, 6 up, 6 in
Dec 02 10:04:01 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d3673812c\x2df461\x2d4e86\x2d831f\x2db7a7821f4bda.mount: Deactivated successfully.
Dec 02 10:04:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:01.443 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:04:01 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:01.899 2 INFO neutron.agent.securitygroups_rpc [None req-d7c6b922-a31a-45e0-b3f4-c5bd99f50015 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Security group member updated ['576d6513-029b-4880-bb0b-58094b586b90']
Dec 02 10:04:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:01.939 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:04:01Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a05520>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089e7e20>], id=54433c73-7e5c-481c-b64c-19e9cfd6e56f, ip_allocation=immediate, mac_address=fa:16:3e:bb:b6:1c, name=tempest-parent-146896978, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:03:06Z, description=, dns_domain=, id=13bbad22-ab61-4b1f-849e-c651aa8f3297, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-1859087569-network, port_security_enabled=True, project_id=d048f19ff5fc47dc88162ef5f9cebe8b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25848, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=342, status=ACTIVE, subnets=['a62c0502-5155-4c20-aaad-4cc8bce976da'], tags=[], tenant_id=d048f19ff5fc47dc88162ef5f9cebe8b, updated_at=2025-12-02T10:03:07Z, vlan_transparent=None, network_id=13bbad22-ab61-4b1f-849e-c651aa8f3297, port_security_enabled=True, project_id=d048f19ff5fc47dc88162ef5f9cebe8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['576d6513-029b-4880-bb0b-58094b586b90'], standard_attr_id=537, status=DOWN, tags=[], tenant_id=d048f19ff5fc47dc88162ef5f9cebe8b, updated_at=2025-12-02T10:04:01Z on network 13bbad22-ab61-4b1f-849e-c651aa8f3297
Dec 02 10:04:02 np0005541913.localdomain ceph-mon[298296]: pgmap v132: 177 pgs: 177 active+clean; 226 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 659 KiB/s rd, 37 KiB/s wr, 89 op/s
Dec 02 10:04:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e104 e104: 6 total, 6 up, 6 in
Dec 02 10:04:02 np0005541913.localdomain systemd[1]: tmp-crun.TcdiUZ.mount: Deactivated successfully.
Dec 02 10:04:02 np0005541913.localdomain podman[310077]: 2025-12-02 10:04:02.297678664 +0000 UTC m=+0.083111764 container kill 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:04:02 np0005541913.localdomain dnsmasq[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/addn_hosts - 2 addresses
Dec 02 10:04:02 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/host
Dec 02 10:04:02 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/opts
Dec 02 10:04:02 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:02.518 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:04:02 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:02.592 263406 INFO neutron.agent.dhcp.agent [None req-27ffb734-3d7a-4ec4-acb0-0feda9702f62 - - - - - -] DHCP configuration for ports {'54433c73-7e5c-481c-b64c-19e9cfd6e56f'} is completed
Dec 02 10:04:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:03.048 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:03.049 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:03.050 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:03 np0005541913.localdomain ceph-mon[298296]: osdmap e104: 6 total, 6 up, 6 in
Dec 02 10:04:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2479108286' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2142671951' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1023460823' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e105 e105: 6 total, 6 up, 6 in
Dec 02 10:04:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:03Z|00122|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:03.737 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:04:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:04:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:04:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:04:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:04:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:04:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:04:04 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:04.111 2 INFO neutron.agent.securitygroups_rpc [None req-477510e9-c030-4124-bb5e-ce2ad555248a 60f523e6d03743daa3ff6f5bc7122d00 cccbafb2e3c343b2aab51714734bddce - - default default] Security group member updated ['5c93e274-85ac-42d3-b949-bdb62e6b8c39']
Dec 02 10:04:04 np0005541913.localdomain ceph-mon[298296]: pgmap v134: 177 pgs: 177 active+clean; 307 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 6.6 MiB/s rd, 5.9 MiB/s wr, 181 op/s
Dec 02 10:04:04 np0005541913.localdomain ceph-mon[298296]: osdmap e105: 6 total, 6 up, 6 in
Dec 02 10:04:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/682142553' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:04:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1748178721' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:04:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:04:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1748178721' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:04:04 np0005541913.localdomain dnsmasq[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/addn_hosts - 1 addresses
Dec 02 10:04:04 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/host
Dec 02 10:04:04 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/opts
Dec 02 10:04:04 np0005541913.localdomain podman[310116]: 2025-12-02 10:04:04.42741533 +0000 UTC m=+0.074420562 container kill 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 10:04:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:04.974 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1748178721' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:04:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1748178721' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:04:05 np0005541913.localdomain dnsmasq[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/addn_hosts - 0 addresses
Dec 02 10:04:05 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/host
Dec 02 10:04:05 np0005541913.localdomain dnsmasq-dhcp[307978]: read /var/lib/neutron/dhcp/62df5f27-c8d9-4d79-9ad6-2f32e63bf47f/opts
Dec 02 10:04:05 np0005541913.localdomain podman[310157]: 2025-12-02 10:04:05.791866802 +0000 UTC m=+0.068143634 container kill 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:04:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e106 e106: 6 total, 6 up, 6 in
Dec 02 10:04:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:04:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:04:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:04:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159754 "" "Go-http-client/1.1"
Dec 02 10:04:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:04:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20202 "" "Go-http-client/1.1"
Dec 02 10:04:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:06.203 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:06 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:06Z|00123|binding|INFO|Releasing lport fbe9f539-2caa-4225-b0aa-ee0756eec0f0 from this chassis (sb_readonly=0)
Dec 02 10:04:06 np0005541913.localdomain kernel: device tapfbe9f539-2c left promiscuous mode
Dec 02 10:04:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:06.227 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:06 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:06Z|00124|binding|INFO|Setting lport fbe9f539-2caa-4225-b0aa-ee0756eec0f0 down in Southbound
Dec 02 10:04:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:06.237 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccbafb2e3c343b2aab51714734bddce', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c5273a4-e474-4c2c-a95a-a522e1a174bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=fbe9f539-2caa-4225-b0aa-ee0756eec0f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:06.238 160221 INFO neutron.agent.ovn.metadata.agent [-] Port fbe9f539-2caa-4225-b0aa-ee0756eec0f0 in datapath 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f unbound from our chassis
Dec 02 10:04:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:06.243 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:04:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:06.244 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[058b8528-025a-4b06-9bce-e25d99fa0a24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:06.253 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:06.255 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:06 np0005541913.localdomain ceph-mon[298296]: pgmap v136: 177 pgs: 177 active+clean; 307 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.8 MiB/s wr, 146 op/s
Dec 02 10:04:06 np0005541913.localdomain ceph-mon[298296]: osdmap e106: 6 total, 6 up, 6 in
Dec 02 10:04:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/4051220607' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:06.877 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.595 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Acquiring lock "268e09a3-7abe-4037-a14a-068e7b8a78fb" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.595 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lock "268e09a3-7abe-4037-a14a-068e7b8a78fb" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.596 281858 INFO nova.compute.manager [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Unshelving
Dec 02 10:04:07 np0005541913.localdomain ceph-mon[298296]: pgmap v138: 177 pgs: 177 active+clean; 307 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.8 MiB/s wr, 146 op/s
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.694 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.695 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.698 281858 DEBUG nova.objects.instance [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lazy-loading 'pci_requests' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.715 281858 DEBUG nova.objects.instance [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lazy-loading 'numa_topology' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.728 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.728 281858 INFO nova.compute.claims [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Claim successful on node np0005541913.localdomain
Dec 02 10:04:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:07.858 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:04:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:04:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/476037169' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:08.282 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:04:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:08.291 281858 DEBUG nova.compute.provider_tree [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:04:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:08.323 281858 DEBUG nova.scheduler.client.report [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:04:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:08.352 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:08.458 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Acquiring lock "refresh_cache-268e09a3-7abe-4037-a14a-068e7b8a78fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:04:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:08.458 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Acquired lock "refresh_cache-268e09a3-7abe-4037-a14a-068e7b8a78fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:04:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:08.459 281858 DEBUG nova.network.neutron [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 02 10:04:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:08.545 281858 DEBUG nova.network.neutron [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 02 10:04:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/476037169' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.107 281858 DEBUG nova.network.neutron [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.132 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Releasing lock "refresh_cache-268e09a3-7abe-4037-a14a-068e7b8a78fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.135 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.136 281858 INFO nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Creating image(s)
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.177 281858 DEBUG nova.storage.rbd_utils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] rbd image 268e09a3-7abe-4037-a14a-068e7b8a78fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.183 281858 DEBUG nova.objects.instance [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.240 281858 DEBUG nova.storage.rbd_utils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] rbd image 268e09a3-7abe-4037-a14a-068e7b8a78fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.285 281858 DEBUG nova.storage.rbd_utils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] rbd image 268e09a3-7abe-4037-a14a-068e7b8a78fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.291 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Acquiring lock "1ce5597317ee1701cfc96dd9b078f17a61568b4b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.292 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lock "1ce5597317ee1701cfc96dd9b078f17a61568b4b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.348 281858 DEBUG nova.virt.libvirt.imagebackend [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Image locations are: [{'url': 'rbd://c7c8e171-a193-56fb-95fa-8879fcfa7074/images/0e87d55f-56a4-4da8-9198-c633785685ee/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://c7c8e171-a193-56fb-95fa-8879fcfa7074/images/0e87d55f-56a4-4da8-9198-c633785685ee/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.419 281858 DEBUG nova.virt.libvirt.imagebackend [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Selected location: {'url': 'rbd://c7c8e171-a193-56fb-95fa-8879fcfa7074/images/0e87d55f-56a4-4da8-9198-c633785685ee/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.420 281858 DEBUG nova.storage.rbd_utils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] cloning images/0e87d55f-56a4-4da8-9198-c633785685ee@snap to None/268e09a3-7abe-4037-a14a-068e7b8a78fb_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.609 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lock "1ce5597317ee1701cfc96dd9b078f17a61568b4b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:09 np0005541913.localdomain ceph-mon[298296]: pgmap v139: 177 pgs: 177 active+clean; 226 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 6.9 MiB/s wr, 204 op/s
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.867 281858 DEBUG nova.objects.instance [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lazy-loading 'migration_context' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:09.975 281858 DEBUG nova.storage.rbd_utils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] flattening vms/268e09a3-7abe-4037-a14a-068e7b8a78fb_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.066 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:10 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:10.811 2 INFO neutron.agent.securitygroups_rpc [None req-4cc1fa1d-9a41-40fb-9e7e-ba331f6b18b7 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Security group member updated ['576d6513-029b-4880-bb0b-58094b586b90']
Dec 02 10:04:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.901 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Image rbd:vms/268e09a3-7abe-4037-a14a-068e7b8a78fb_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.902 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.903 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Ensure instance console log exists: /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.903 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.904 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.904 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.907 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-12-02T10:03:46Z,direct_url=<?>,disk_format='raw',id=0e87d55f-56a4-4da8-9198-c633785685ee,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-2084001492-shelved',owner='09cae3217c5e430b8dbe17828669a978',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-12-02T10:04:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'size': 0, 'encryption_options': None, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': 'd85e840d-fa56-497b-b5bd-b49584d3e97a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.913 281858 WARNING nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.915 281858 DEBUG nova.virt.libvirt.host [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Searching host: 'np0005541913.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.916 281858 DEBUG nova.virt.libvirt.host [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.918 281858 DEBUG nova.virt.libvirt.host [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Searching host: 'np0005541913.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.919 281858 DEBUG nova.virt.libvirt.host [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.920 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.920 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T10:01:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82beb986-6d20-42dc-b738-1cef87dee30f',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-12-02T10:03:46Z,direct_url=<?>,disk_format='raw',id=0e87d55f-56a4-4da8-9198-c633785685ee,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-2084001492-shelved',owner='09cae3217c5e430b8dbe17828669a978',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-12-02T10:04:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.921 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.922 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.922 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.923 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.923 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.923 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.924 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.924 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.925 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.925 281858 DEBUG nova.virt.hardware [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.926 281858 DEBUG nova.objects.instance [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:10.951 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:04:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:04:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:04:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/534215597' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:04:11 np0005541913.localdomain systemd[1]: tmp-crun.tIxpU4.mount: Deactivated successfully.
Dec 02 10:04:11 np0005541913.localdomain podman[310436]: 2025-12-02 10:04:11.462870088 +0000 UTC m=+0.099850653 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:04:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:11.477 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:04:11 np0005541913.localdomain podman[310436]: 2025-12-02 10:04:11.506100634 +0000 UTC m=+0.143081199 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Dec 02 10:04:11 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:04:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:11.523 281858 DEBUG nova.storage.rbd_utils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] rbd image 268e09a3-7abe-4037-a14a-068e7b8a78fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:04:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:11.531 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:04:11 np0005541913.localdomain ceph-mon[298296]: pgmap v140: 177 pgs: 177 active+clean; 226 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 2.4 KiB/s wr, 61 op/s
Dec 02 10:04:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/534215597' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:04:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:11.931 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:04:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1346093737' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:04:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:11.988 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:04:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:11.991 281858 DEBUG nova.objects.instance [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lazy-loading 'pci_devices' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.019 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] End _get_guest_xml xml=<domain type="kvm">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <uuid>268e09a3-7abe-4037-a14a-068e7b8a78fb</uuid>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <name>instance-00000006</name>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <memory>131072</memory>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <vcpu>1</vcpu>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <metadata>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-2084001492</nova:name>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <nova:creationTime>2025-12-02 10:04:10</nova:creationTime>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <nova:flavor name="m1.nano">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <nova:memory>128</nova:memory>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <nova:disk>1</nova:disk>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <nova:swap>0</nova:swap>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <nova:vcpus>1</nova:vcpus>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       </nova:flavor>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <nova:owner>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <nova:user uuid="96d084f3c3184bf4ac7b9635139dd4aa">tempest-UnshelveToHostMultiNodesTest-557689334-project-member</nova:user>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <nova:project uuid="09cae3217c5e430b8dbe17828669a978">tempest-UnshelveToHostMultiNodesTest-557689334</nova:project>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       </nova:owner>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <nova:root type="image" uuid="0e87d55f-56a4-4da8-9198-c633785685ee"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <nova:ports/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     </nova:instance>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   </metadata>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <sysinfo type="smbios">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <system>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <entry name="manufacturer">RDO</entry>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <entry name="product">OpenStack Compute</entry>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <entry name="serial">268e09a3-7abe-4037-a14a-068e7b8a78fb</entry>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <entry name="uuid">268e09a3-7abe-4037-a14a-068e7b8a78fb</entry>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <entry name="family">Virtual Machine</entry>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     </system>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   </sysinfo>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <os>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <boot dev="hd"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <smbios mode="sysinfo"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   </os>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <features>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <acpi/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <apic/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <vmcoreinfo/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   </features>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <clock offset="utc">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <timer name="hpet" present="no"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   </clock>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <cpu mode="host-model" match="exact">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   </cpu>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   <devices>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <disk type="network" device="disk">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <driver type="raw" cache="none"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <source protocol="rbd" name="vms/268e09a3-7abe-4037-a14a-068e7b8a78fb_disk">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.103" port="6789"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.104" port="6789"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.105" port="6789"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       </source>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <auth username="openstack">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <secret type="ceph" uuid="c7c8e171-a193-56fb-95fa-8879fcfa7074"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       </auth>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <target dev="vda" bus="virtio"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <disk type="network" device="cdrom">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <driver type="raw" cache="none"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <source protocol="rbd" name="vms/268e09a3-7abe-4037-a14a-068e7b8a78fb_disk.config">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.103" port="6789"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.104" port="6789"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <host name="172.18.0.105" port="6789"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       </source>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <auth username="openstack">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:         <secret type="ceph" uuid="c7c8e171-a193-56fb-95fa-8879fcfa7074"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       </auth>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <target dev="sda" bus="sata"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     </disk>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <serial type="pty">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <log file="/var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb/console.log" append="off"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     </serial>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <video>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <model type="virtio"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     </video>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <input type="tablet" bus="usb"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <input type="keyboard" bus="usb"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <rng model="virtio">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <backend model="random">/dev/urandom</backend>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     </rng>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <controller type="usb" index="0"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     <memballoon model="virtio">
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:       <stats period="10"/>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:     </memballoon>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:   </devices>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: </domain>
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.046 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.068 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.069 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.070 281858 INFO nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Using config drive
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.107 281858 DEBUG nova.storage.rbd_utils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] rbd image 268e09a3-7abe-4037-a14a-068e7b8a78fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.134 281858 DEBUG nova.objects.instance [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lazy-loading 'ec2_ids' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.165 281858 DEBUG nova.objects.instance [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lazy-loading 'keypairs' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.272 281858 INFO nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Creating config drive at /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.config
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.279 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp67jthe8n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.408 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp67jthe8n" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.449 281858 DEBUG nova.storage.rbd_utils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] rbd image 268e09a3-7abe-4037-a14a-068e7b8a78fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.455 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.config 268e09a3-7abe-4037-a14a-068e7b8a78fb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.670 281858 DEBUG oslo_concurrency.processutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.config 268e09a3-7abe-4037-a14a-068e7b8a78fb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.215s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:04:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:12.672 281858 INFO nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Deleting local config drive /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.config because it was imported into RBD.
Dec 02 10:04:12 np0005541913.localdomain systemd-machined[84262]: New machine qemu-4-instance-00000006.
Dec 02 10:04:12 np0005541913.localdomain systemd[1]: Started Virtual Machine qemu-4-instance-00000006.
Dec 02 10:04:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1346093737' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.084 281858 DEBUG nova.compute.manager [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.085 281858 DEBUG nova.virt.libvirt.driver [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.086 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764669853.0826938, 268e09a3-7abe-4037-a14a-068e7b8a78fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.087 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] VM Resumed (Lifecycle Event)
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.095 281858 INFO nova.virt.libvirt.driver [-] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Instance spawned successfully.
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.111 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.114 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.133 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.133 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764669853.084783, 268e09a3-7abe-4037-a14a-068e7b8a78fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.134 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] VM Started (Lifecycle Event)
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.154 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.158 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 02 10:04:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:13.175 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 02 10:04:13 np0005541913.localdomain ceph-mon[298296]: pgmap v141: 177 pgs: 177 active+clean; 307 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 4.9 MiB/s rd, 4.8 MiB/s wr, 152 op/s
Dec 02 10:04:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e107 e107: 6 total, 6 up, 6 in
Dec 02 10:04:14 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:14Z|00125|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:14.162 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:14 np0005541913.localdomain ceph-mon[298296]: osdmap e107: 6 total, 6 up, 6 in
Dec 02 10:04:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:15.003 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:15.071 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:15.225 281858 DEBUG nova.compute.manager [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:04:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:04:15 np0005541913.localdomain systemd[1]: tmp-crun.ipugDF.mount: Deactivated successfully.
Dec 02 10:04:15 np0005541913.localdomain podman[310612]: 2025-12-02 10:04:15.513238117 +0000 UTC m=+0.139371160 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 10:04:15 np0005541913.localdomain podman[310612]: 2025-12-02 10:04:15.546037833 +0000 UTC m=+0.172170866 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 02 10:04:15 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:04:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:15 np0005541913.localdomain ceph-mon[298296]: pgmap v143: 177 pgs: 177 active+clean; 307 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 5.1 MiB/s wr, 160 op/s
Dec 02 10:04:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:16.105 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:04:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:16.111 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a966654efc63eb79f395da865ed495916856f318e31034e86d5a2b1abae24291" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 02 10:04:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:16.278 281858 DEBUG oslo_concurrency.lockutils [None req-ed1bd6d1-7bf7-4190-b304-63c91a3b4709 1cb5f3cd655948d69eadad12de0d4055 2d58bf4832b74708b28917a57e00803f - - default default] Lock "268e09a3-7abe-4037-a14a-068e7b8a78fb" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 8.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:16 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:16.387 2 INFO neutron.agent.securitygroups_rpc [req-bec2fcab-0b29-48c5-8c73-7c95715690aa req-3ce61e55-77a0-41a7-a01c-658bb353c505 5d2a1dd73fee440789897d09ac4f0afc b1db4f455ea047e3b37458f6d2c5e699 - - default default] Security group rule updated ['df5547d9-a152-449e-8fa5-5094da38cd68']
Dec 02 10:04:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:16.728 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 954 Content-Type: application/json Date: Tue, 02 Dec 2025 10:04:16 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-bfeed26e-68d4-4062-80d2-ed5b766fcfaa x-openstack-request-id: req-bfeed26e-68d4-4062-80d2-ed5b766fcfaa _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 02 10:04:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:16.728 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1839415d-f60e-4a1c-bcf9-a79f9f7cb24d", "name": "m1.micro", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/1839415d-f60e-4a1c-bcf9-a79f9f7cb24d"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/1839415d-f60e-4a1c-bcf9-a79f9f7cb24d"}]}, {"id": "45a99238-6f19-4f9e-be82-6ef3af1dcb31", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/45a99238-6f19-4f9e-be82-6ef3af1dcb31"}]}, {"id": "82beb986-6d20-42dc-b738-1cef87dee30f", "name": "m1.nano", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/82beb986-6d20-42dc-b738-1cef87dee30f"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/82beb986-6d20-42dc-b738-1cef87dee30f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 02 10:04:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:16.728 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-bfeed26e-68d4-4062-80d2-ed5b766fcfaa request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 02 10:04:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:16.731 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/82beb986-6d20-42dc-b738-1cef87dee30f -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a966654efc63eb79f395da865ed495916856f318e31034e86d5a2b1abae24291" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 02 10:04:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:16.737 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:17 np0005541913.localdomain dnsmasq[307978]: exiting on receipt of SIGTERM
Dec 02 10:04:17 np0005541913.localdomain podman[310646]: 2025-12-02 10:04:17.312633135 +0000 UTC m=+0.057852509 container kill 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:04:17 np0005541913.localdomain systemd[1]: libpod-2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2.scope: Deactivated successfully.
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.372 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 493 Content-Type: application/json Date: Tue, 02 Dec 2025 10:04:16 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-5c64cb57-59d0-4f58-9c1b-155a6e8d9224 x-openstack-request-id: req-5c64cb57-59d0-4f58-9c1b-155a6e8d9224 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.372 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "82beb986-6d20-42dc-b738-1cef87dee30f", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/82beb986-6d20-42dc-b738-1cef87dee30f"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/82beb986-6d20-42dc-b738-1cef87dee30f"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.372 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/82beb986-6d20-42dc-b738-1cef87dee30f used request id req-5c64cb57-59d0-4f58-9c1b-155a6e8d9224 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.376 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '09cae3217c5e430b8dbe17828669a978', 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'hostId': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.376 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.382 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdf030b5-cbfc-4627-9f9d-051ad00efc8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.376425', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4374ce9c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': 'f922141570756648eda573ffbc9cdfc117f584746b7b905f7657e75b04624933'}]}, 'timestamp': '2025-12-02 10:04:17.386462', '_unique_id': 'd9dc6505dd594aae85bf3cf5febc9efd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.388 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.390 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain podman[310661]: 2025-12-02 10:04:17.390890009 +0000 UTC m=+0.063332416 container died 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.402 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.407 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain systemd[1]: tmp-crun.W69xsv.mount: Deactivated successfully.
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.424 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.425 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fc7d917-d0d1-4a4a-80b3-abfa84efa979', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.390401', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43787b78-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.609516538, 'message_signature': 'ddbd25cfd11850970f14e552530b89e18593e5f53126a569822f2401e8827ecb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.390401', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43788a46-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.609516538, 'message_signature': 'bed9aefa369929a21255888b20b05824606ce83114800c77eac5c035a9f6a1d8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.390401', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '437b4786-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.626474332, 'message_signature': 'bb2177b154ecd4ff0ce551cefa34daca5bca687bae49516344e43e1a984264e3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.390401', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '437b60cc-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.626474332, 'message_signature': 'b3d4d907e3709399d0a7ff774816fa24a54938f3cf9064308e192509f862c014'}]}, 'timestamp': '2025-12-02 10:04:17.426001', '_unique_id': 'e2f49ff913e743098aa088868bd03846'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.427 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.430 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.430 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-2084001492>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-2084001492>]
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.431 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.431 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca38731f-58ee-4ea4-ace5-67c71512a8c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.431239', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '437c41a4-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': '872a5c7e7b5138ddc1039a826063ecead8fc9782902cefffb8e29f92b76a7f81'}]}, 'timestamp': '2025-12-02 10:04:17.431804', '_unique_id': '7ee2b993cc5e46c29d58b470ef14bfef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.432 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.433 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.435 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '644b84f2-dda9-4537-bf33-4b49fa4459e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.435734', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '437cf2de-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': 'b25ae39e56c899618aef2226b0341b5c4be7d242079d4fa697c614c491e0ccc4'}]}, 'timestamp': '2025-12-02 10:04:17.436305', '_unique_id': 'a4d0b4d0d58f4956bafae91c8133192a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.436 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.437 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain podman[310661]: 2025-12-02 10:04:17.449108037 +0000 UTC m=+0.121550384 container cleanup 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:04:17 np0005541913.localdomain systemd[1]: libpod-conmon-2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2.scope: Deactivated successfully.
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.466 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.467 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.493 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.494 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17ee1acc-f87f-4b65-936b-945053c8099a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.440560', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4381afb8-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '9bf565e33c6e8ddef478efb0749b3a3ec0251647591238aadd80cec4d1eb6f94'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.440560', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4381c854-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '2570bed2e7944db4ed6117891cf0a1a8442d3b3770d06353dad01cc566d2e54c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.440560', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': 
'0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4385bcca-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': 'c04e967d652b8b7835747cc07968e5358b116e99bd4c01229d3662b83d66ed67'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.440560', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4385d23c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': '51268590f285a99e3654e8c08385b4b9ccfcc56a3d34e232bfeeeca057e08c6d'}]}, 'timestamp': '2025-12-02 10:04:17.494445', '_unique_id': 'ab10bb66032f48dd829494651c17576b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.495 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.496 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.508 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.508 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.509 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.509 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a71c7d09-d153-4af4-b328-b9bdcb5b3fdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.508105', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '438801ec-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '7bccd844f4f0763d124f75f01e2bbf5b69dee5573ff67ea2dbd6e67c31650a6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.508105', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '438815ba-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '26655661ddd4bc74a97066808eb6310a3da0b029b2c23a8becd5b33bda0c4d4d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.508105', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': 
'0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '43882618-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': '658b62569d084e87b731b2380f5dc6d4ae9ecaf504a277a34c19ea4e3175018d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.508105', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '43883824-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': 'dfbed061465d6ad068129652dbd0a0ff4faa1d25410dd5c6d14ce651ef00ecc2'}]}, 'timestamp': '2025-12-02 10:04:17.510153', '_unique_id': '046341e0dcc448c7a40ea44397d03c2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.511 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.512 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.513 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '436e3be7-8fd3-49e5-9d89-01232434cb4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.513146', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4388f44e-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': '36ef6424547bdb1ec3a4150a49aaa92e2b788b28262529c7b7eea5c099a7aaee'}]}, 'timestamp': '2025-12-02 10:04:17.517088', '_unique_id': 'fcacae156ff5426ab5e65c3e79d13e8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.519 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.522 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain podman[310662]: 2025-12-02 10:04:17.539294189 +0000 UTC m=+0.205559760 container remove 2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-62df5f27-c8d9-4d79-9ad6-2f32e63bf47f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.541 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 16460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.553 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/cpu volume: 4250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d8156a4-4a71-4e8d-bc10-bf8f574520af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16460000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:04:17.523116', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '438cfe54-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.759971123, 'message_signature': '518c9a77385f15b92922c4709a0b7c46bcb91de197cb17981360a220ed2a62a0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4250000000, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'timestamp': '2025-12-02T10:04:17.523116', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '438eec46-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.772770825, 'message_signature': '80f2788aa81147bb84c5680b3472eae54b7a9c3092f3759934e401f95ecf86f5'}]}, 'timestamp': '2025-12-02 10:04:17.554060', '_unique_id': '62005cafbda24737adee96db661937cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.554 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.555 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.555 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.555 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5442586-1661-44d9-a848-41f327a471f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.555658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '438f341c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.609516538, 'message_signature': '6f9b7c1d46d252db4c7306f9a9ac7e6a6d0b72c66a03fb8c327811a1fd27ea85'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.555658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '438f3bba-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.609516538, 'message_signature': 'd6811bd3a785f48f9ef7479452e66d4a7a74b576f1248724b3addec74bffb1ef'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.555658', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': 
'0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '438f4290-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.626474332, 'message_signature': '01b460ef4cf0db55b904511aa5d14b09388013a61f26dc3b46ea0d5b270c3e37'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.555658', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '438f4916-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.626474332, 'message_signature': 'aac5bfbc4f8a1d533619a737b14e619c2f6a007368b6b564a5dc664397beda00'}]}, 'timestamp': '2025-12-02 10:04:17.556391', '_unique_id': '4e40fc3963d04c5bb850dcb0a086df9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.556 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.557 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.557 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9477040e-195c-44a5-88e9-cfb0c4c580b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.557429', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '438f7936-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': 'a5c422de8b7edd44e629be07e77d818ba008b9d9312408dbfa5a4b2132f784cd'}]}, 'timestamp': '2025-12-02 10:04:17.557656', '_unique_id': '98b39ccc26b249b3a3209e485bc7a532'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.558 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '895e4572-a2f3-4863-a3f8-6e9700baf252', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.558564', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '438fa604-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': '0ee86c968c2290ef62de14a85c21d00f10f4c1685225064e72067218d9537ec1'}]}, 'timestamp': '2025-12-02 10:04:17.558784', '_unique_id': '83dddf486f0e4ecb9e233d6fa72e0c75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.559 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.read.latency volume: 1047513403 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.read.latency volume: 2316922 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '727edf32-3211-40da-9c4e-b7804f200882', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.559695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '438fd160-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': 'e5e750e12e1057c8d611a6c63e418dc90a26ebe772e868b29f20942b47b960e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.559695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '438fd82c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '3ce2348de76efc113bbaf6c8637887ffecb68790a0716043d7b4bdf0ae5b5ca6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1047513403, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.559695', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '438fdeda-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': '49a76adb64c0a6c6c12bc350904a38144ecec088fec15642ecf7dcdb226d9799'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2316922, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.559695', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '438fe556-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': '6cff2d142fb5f914662add5a6875a21e72713d255f5eea335f4cffc9ee3719ef'}]}, 'timestamp': '2025-12-02 10:04:17.560390', '_unique_id': '1e4da737de564bc49ebffb507796c442'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.560 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.561 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.561 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.561 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.561 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 268e09a3-7abe-4037-a14a-068e7b8a78fb: ceilometer.compute.pollsters.NoVolumeException
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69b04d7f-bfad-4b6e-85b5-0f6029d53f66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:04:17.561379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4390133c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.759971123, 'message_signature': '31d0d0bdd078db893609f617adf5140d21ec2f9355c4f19dc553d85886cd6009'}]}, 'timestamp': '2025-12-02 10:04:17.561739', '_unique_id': '571759ea63d64f3383fe288777ffe157'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-2084001492>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-2084001492>]
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.562 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-2084001492>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-2084001492>]
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '799db7e3-9444-4e76-821a-5959c034cfb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.563119', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4390570c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': '7bbb00bdb8f46e8d54b31b7fe540251dbc448e8fe0638f6f7f6b15ab2970be91'}]}, 'timestamp': '2025-12-02 10:04:17.563317', '_unique_id': '839b3e820134431a959192a0ace89851'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.563 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.564 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.564 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d791f65-86d7-45c5-a90f-2c48ab2e627b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.564350', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4390895c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': 'a1fa16c6bb97f67d04a0f83bce96ff7a5e8985749486acda944644f68be0fee3'}]}, 'timestamp': '2025-12-02 10:04:17.564607', '_unique_id': '631e562b066c4efbbb3c03a4d69b6603'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.565 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7760aa84-f6c0-46cb-b4f3-1e90e87579d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.565539', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4390b648-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': 'c134bc7aa250d75374d0adeb6965feb1d8f29b0417e404a1bb5a1125d7bebd2c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.565539', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4390bd1e-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': 'cd2aa20860a299f96149fbf8f6ea3b97552709ecbb1911808c1df9343269bf79'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.565539', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4390c390-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': '0ac98cd35db0a12a528ad4e15c1f4bcd410c81d346509d95b6be014ad5fa4173'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.565539', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4390c9e4-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': 'a79cff9d0860c5c0dc56d19dc0a7ea5c67a1888cafe919e4a769dd1d5687d014'}]}, 'timestamp': '2025-12-02 10:04:17.566241', '_unique_id': '10fc7d8b645c4a2681c4e3fc7701cba4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.566 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.567 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.572 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.572 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-2084001492>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-2084001492>]
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.572 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a337f2c-8941-4b4e-85b7-205e5372deff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.573033', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4391da96-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': '4f844231400cbffe8c2cb1b326a0e1f86db49fcf8714ebe4033af23a3a014ab2'}]}, 'timestamp': '2025-12-02 10:04:17.573240', '_unique_id': 'faf92fed76f848e5bb4ff27522c81291'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.573 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.574 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.574 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.574 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.574 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.574 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf05612b-d064-4fca-b771-721662ba8f65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.574186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4392076e-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '4b6cc9146883c898c2c9f460607b9a8be28ac4cd07411a177fd0f869f9957017'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.574186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43920e26-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '64e38319c9cf99f8de379c7a6e1f0fff5e974ef33e49b589c4295cd41c2d0ef9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.574186', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': 
'0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '43921574-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': 'c4701453e9fdd0c274fefc272d27cd1d408ba06b01c3524859c26c0e76807d1c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.574186', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '43921bf0-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': '4343a6efcc13ea409353688af6cde2197212096a8bf32a8600a52ca586b4a1e5'}]}, 'timestamp': '2025-12-02 10:04:17.574895', '_unique_id': '4f9727773620470f833f89dc92329063'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.575 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af28eb7e-8358-4dff-8f23-e473e9b081ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:04:17.575863', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '4392490e-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.595470982, 'message_signature': 'b4895d045899b99a320fb25b9968f538e63aadba14a629eebc4f84ba484e02eb'}]}, 'timestamp': '2025-12-02 10:04:17.576065', '_unique_id': '9a38f13eaf7748a99a9e6ce2a1ae16a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.576 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.577 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.577 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.577 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.577 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.allocation volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5204faa-c903-4388-8a95-8927c3ee853a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.576981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43927492-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.609516538, 'message_signature': 'b2f82edd68332d330d75d232b5d4078b94145072abeb944f447f455c1032b19d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.576981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43927b72-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.609516538, 'message_signature': '5ab92c864cd16385071a71b957e652320238d24978e5c416ac553f2b4a3801bb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.576981', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': 
'0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '439281d0-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.626474332, 'message_signature': 'a563dd48d2f1921d733cfdd2b213c2af50d1acfda3c37693af5b5d6d98ad6cbd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.576981', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '43928810-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.626474332, 'message_signature': '48abf2d0479de67d3a8d4461600f0f807d816776c239bf0293edcc0f9eb4a37f'}]}, 'timestamp': '2025-12-02 10:04:17.577680', '_unique_id': 'adeb7366c55742e6bfa519a7b0becaee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.578 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 DEBUG ceilometer.compute.pollsters [-] 268e09a3-7abe-4037-a14a-068e7b8a78fb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5db9fce-e119-4cea-99b4-d778bdc32adf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:04:17.578630', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4392b524-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '356eb9bd340d69c63f2e988d4f8ae9870c004067c4cebffcaa96f7fc88030007'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:04:17.578630', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4392bbc8-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.65967854, 'message_signature': '59ad6278572357e58650a51cd7cb3a00f5f447707f20c5c8be308a174e677958'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-vda', 'timestamp': '2025-12-02T10:04:17.578630', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': 
'0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4392c212-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': '68209aa0146d18d7ca3bbd7efe40fa76b1b277259aa585cec196ed07aab4817b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '96d084f3c3184bf4ac7b9635139dd4aa', 'user_name': None, 'project_id': '09cae3217c5e430b8dbe17828669a978', 'project_name': None, 'resource_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb-sda', 'timestamp': '2025-12-02T10:04:17.578630', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-2084001492', 'name': 'instance-00000006', 'instance_id': '268e09a3-7abe-4037-a14a-068e7b8a78fb', 'instance_type': 'm1.nano', 'host': '7cad67b582e6426efa9c66e424f7a681c87315f17829394b22a9c3a2', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '82beb986-6d20-42dc-b738-1cef87dee30f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '0e87d55f-56a4-4da8-9198-c633785685ee'}, 'image_ref': '0e87d55f-56a4-4da8-9198-c633785685ee', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4392c848-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12019.687085392, 'message_signature': 'a7ca20a58fe03f96a1fe7e81cb139ac95986dbcecc0a9d3ae7c9c5b6d879365e'}]}, 'timestamp': '2025-12-02 10:04:17.579304', '_unique_id': 'bb894090447141c580fc63b9b1df23f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:04:17 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:04:17.579 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:04:17 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:17.760 263406 INFO neutron.agent.dhcp.agent [None req-9d5fb568-c306-4dac-9877-2d927af3520d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.793 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Acquiring lock "268e09a3-7abe-4037-a14a-068e7b8a78fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.794 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Lock "268e09a3-7abe-4037-a14a-068e7b8a78fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.795 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Acquiring lock "268e09a3-7abe-4037-a14a-068e7b8a78fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.795 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Lock "268e09a3-7abe-4037-a14a-068e7b8a78fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.796 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Lock "268e09a3-7abe-4037-a14a-068e7b8a78fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.797 281858 INFO nova.compute.manager [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Terminating instance
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.799 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Acquiring lock "refresh_cache-268e09a3-7abe-4037-a14a-068e7b8a78fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.799 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Acquired lock "refresh_cache-268e09a3-7abe-4037-a14a-068e7b8a78fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.800 281858 DEBUG nova.network.neutron [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 02 10:04:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:17.909 281858 DEBUG nova.network.neutron [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 02 10:04:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:18.013 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:04:18 np0005541913.localdomain ceph-mon[298296]: pgmap v144: 177 pgs: 177 active+clean; 307 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 4.7 MiB/s rd, 4.7 MiB/s wr, 147 op/s
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:04:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:18.160 281858 DEBUG nova.network.neutron [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:04:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:18.177 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Releasing lock "refresh_cache-268e09a3-7abe-4037-a14a-068e7b8a78fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:04:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:18.178 281858 DEBUG nova.compute.manager [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 02 10:04:18 np0005541913.localdomain podman[310687]: 2025-12-02 10:04:18.213451375 +0000 UTC m=+0.093127173 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Consumed 5.676s CPU time.
Dec 02 10:04:18 np0005541913.localdomain systemd-machined[84262]: Machine qemu-4-instance-00000006 terminated.
Dec 02 10:04:18 np0005541913.localdomain sudo[310711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:04:18 np0005541913.localdomain podman[310687]: 2025-12-02 10:04:18.262264681 +0000 UTC m=+0.141940469 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 10:04:18 np0005541913.localdomain sudo[310711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:04:18 np0005541913.localdomain podman[310688]: 2025-12-02 10:04:18.279258645 +0000 UTC m=+0.157791622 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:04:18 np0005541913.localdomain sudo[310711]: pam_unix(sudo:session): session closed for user root
Dec 02 10:04:18 np0005541913.localdomain podman[310688]: 2025-12-02 10:04:18.291831502 +0000 UTC m=+0.170364469 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:04:18 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:18.296 2 INFO neutron.agent.securitygroups_rpc [req-3542c6d6-3e9a-4403-b3b7-62c55b0a2440 req-a1b9621e-b7b6-4f72-a92d-ded5fdb895c8 5d2a1dd73fee440789897d09ac4f0afc b1db4f455ea047e3b37458f6d2c5e699 - - default default] Security group rule updated ['df5547d9-a152-449e-8fa5-5094da38cd68']
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: tmp-crun.nrXs53.mount: Deactivated successfully.
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-93ad5d2b9af04d633613c8f460d48e56923a84b4e7f2b732ec5f908e2b44d433-merged.mount: Deactivated successfully.
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2d42155b5a72db54c622c9ed42c83a8217719c79542b37e5b2087004cd3850e2-userdata-shm.mount: Deactivated successfully.
Dec 02 10:04:18 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d62df5f27\x2dc8d9\x2d4d79\x2d9ad6\x2d2f32e63bf47f.mount: Deactivated successfully.
Dec 02 10:04:18 np0005541913.localdomain sudo[310748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:04:18 np0005541913.localdomain sudo[310748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:04:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:18.403 281858 INFO nova.virt.libvirt.driver [-] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Instance destroyed successfully.
Dec 02 10:04:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:18.403 281858 DEBUG nova.objects.instance [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Lazy-loading 'resources' on Instance uuid 268e09a3-7abe-4037-a14a-068e7b8a78fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:18.890 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:04:18 np0005541913.localdomain sudo[310748]: pam_unix(sudo:session): session closed for user root
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.076 281858 INFO nova.virt.libvirt.driver [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Deleting instance files /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb_del
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.078 281858 INFO nova.virt.libvirt.driver [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Deletion of /var/lib/nova/instances/268e09a3-7abe-4037-a14a-068e7b8a78fb_del complete
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.149 281858 INFO nova.compute.manager [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Took 0.97 seconds to destroy the instance on the hypervisor.
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.152 281858 DEBUG oslo.service.loopingcall [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.153 281858 DEBUG nova.compute.manager [-] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.154 281858 DEBUG nova.network.neutron [-] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.215 281858 DEBUG nova.network.neutron [-] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.228 281858 DEBUG nova.network.neutron [-] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.243 281858 INFO nova.compute.manager [-] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Took 0.09 seconds to deallocate network for instance.
Dec 02 10:04:19 np0005541913.localdomain sudo[310819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:04:19 np0005541913.localdomain sudo[310819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.306 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.307 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:19 np0005541913.localdomain sudo[310819]: pam_unix(sudo:session): session closed for user root
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.390 281858 DEBUG oslo_concurrency.processutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:04:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:04:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2552744948' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.850 281858 DEBUG oslo_concurrency.processutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.858 281858 DEBUG nova.compute.provider_tree [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.898 281858 DEBUG nova.scheduler.client.report [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.915 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:19.973 281858 INFO nova.scheduler.client.report [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Deleted allocations for instance 268e09a3-7abe-4037-a14a-068e7b8a78fb
Dec 02 10:04:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:20.030 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: pgmap v145: 177 pgs: 177 active+clean; 226 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 208 op/s
Dec 02 10:04:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:20.049 281858 DEBUG oslo_concurrency.lockutils [None req-03d25e47-17ae-4548-9c68-61f7dc41ea30 96d084f3c3184bf4ac7b9635139dd4aa 09cae3217c5e430b8dbe17828669a978 - - default default] Lock "268e09a3-7abe-4037-a14a-068e7b8a78fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/675123350' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2552744948' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:20.073 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 e108: 6 total, 6 up, 6 in
Dec 02 10:04:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:20.923 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:21.651 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005541914.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:04:01Z, description=, device_id=82e23ec3-1d57-4166-9ba0-839ded943a78, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a027c0>], dns_domain=, dns_name=tempest-livemigrationtest-server-39688497, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a02f40>], id=54433c73-7e5c-481c-b64c-19e9cfd6e56f, ip_allocation=immediate, mac_address=fa:16:3e:bb:b6:1c, name=tempest-parent-146896978, network_id=13bbad22-ab61-4b1f-849e-c651aa8f3297, port_security_enabled=True, project_id=d048f19ff5fc47dc88162ef5f9cebe8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['576d6513-029b-4880-bb0b-58094b586b90'], standard_attr_id=537, status=DOWN, tags=[], tenant_id=d048f19ff5fc47dc88162ef5f9cebe8b, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a02220>], trunk_id=3bda7a6b-42c4-4395-9870-485919ec4ac2, updated_at=2025-12-02T10:04:20Z on network 13bbad22-ab61-4b1f-849e-c651aa8f3297
Dec 02 10:04:21 np0005541913.localdomain ceph-mon[298296]: osdmap e108: 6 total, 6 up, 6 in
Dec 02 10:04:21 np0005541913.localdomain ceph-mon[298296]: pgmap v147: 177 pgs: 177 active+clean; 226 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 1.7 KiB/s wr, 137 op/s
Dec 02 10:04:21 np0005541913.localdomain dnsmasq[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/addn_hosts - 2 addresses
Dec 02 10:04:21 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/host
Dec 02 10:04:21 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/opts
Dec 02 10:04:21 np0005541913.localdomain podman[310876]: 2025-12-02 10:04:21.878334131 +0000 UTC m=+0.062307268 container kill 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:04:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:22.055 263406 INFO neutron.agent.dhcp.agent [None req-76d7b50f-ef35-4f8e-978e-85c25ca3db70 - - - - - -] DHCP configuration for ports {'54433c73-7e5c-481c-b64c-19e9cfd6e56f'} is completed
Dec 02 10:04:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:22.182 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:23 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:04:23 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:23.966 2 INFO neutron.agent.securitygroups_rpc [req-65998e7d-c26a-45a5-8676-fd86a74e40b3 req-1863187d-62f6-4dd8-8a63-a2eeaa9837d3 1583e961fefc48749f39fdf4f81945c8 a0475908295e475d873fdbfd8cc82cea - - default default] Security group rule updated ['dfa589a5-e6b3-419a-9bd7-e5b7ecfd8cd6']
Dec 02 10:04:24 np0005541913.localdomain ceph-mon[298296]: pgmap v148: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.3 MiB/s wr, 193 op/s
Dec 02 10:04:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:24Z|00126|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:24.328 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:25.063 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:25.076 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:25 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2801301008' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:04:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:04:25 np0005541913.localdomain podman[310898]: 2025-12-02 10:04:25.436564225 +0000 UTC m=+0.072227684 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 10:04:25 np0005541913.localdomain podman[310898]: 2025-12-02 10:04:25.444671862 +0000 UTC m=+0.080335351 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Dec 02 10:04:25 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:04:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:26 np0005541913.localdomain ceph-mon[298296]: pgmap v149: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 2.1 MiB/s wr, 175 op/s
Dec 02 10:04:26 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2351640434' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:04:27 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:27.147 2 INFO neutron.agent.securitygroups_rpc [req-6d0b23d6-658e-4a79-96cf-b8ca52a56a83 req-dc334455-9197-4ae2-b241-5b724098ced8 1583e961fefc48749f39fdf4f81945c8 a0475908295e475d873fdbfd8cc82cea - - default default] Security group rule updated ['aadc9cbe-01f3-422d-afff-735004537d1d']
Dec 02 10:04:28 np0005541913.localdomain ceph-mon[298296]: pgmap v150: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 2.1 MiB/s wr, 175 op/s
Dec 02 10:04:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:04:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:04:29 np0005541913.localdomain podman[310917]: 2025-12-02 10:04:29.442682009 +0000 UTC m=+0.082591020 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:04:29 np0005541913.localdomain podman[310917]: 2025-12-02 10:04:29.45801607 +0000 UTC m=+0.097925131 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:04:29 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:04:29 np0005541913.localdomain podman[310918]: 2025-12-02 10:04:29.512879927 +0000 UTC m=+0.147091125 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:04:29 np0005541913.localdomain podman[310918]: 2025-12-02 10:04:29.580009843 +0000 UTC m=+0.214221001 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:04:29 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:29.592 2 INFO neutron.agent.securitygroups_rpc [req-1c594721-186d-4097-a94c-c620e0979c63 req-4b6914f0-ee8c-4772-ac7a-a3075974ee64 1583e961fefc48749f39fdf4f81945c8 a0475908295e475d873fdbfd8cc82cea - - default default] Security group rule updated ['41f7c9c8-7668-4604-9cee-64c2ce6fa2c0']
Dec 02 10:04:29 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:04:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:30.077 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:04:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:30.078 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:04:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:30.079 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:04:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:30.079 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:04:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:30.093 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:30.094 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:04:30 np0005541913.localdomain ceph-mon[298296]: pgmap v151: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 2.1 MiB/s wr, 77 op/s
Dec 02 10:04:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:31 np0005541913.localdomain ceph-mon[298296]: pgmap v152: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 51 KiB/s rd, 2.1 MiB/s wr, 76 op/s
Dec 02 10:04:32 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:32.190 2 INFO neutron.agent.securitygroups_rpc [req-10b28dbb-d460-47e0-a99a-7ab94b16b5dd req-5be6a150-24be-4b75-af16-d1e63344c43d 1583e961fefc48749f39fdf4f81945c8 a0475908295e475d873fdbfd8cc82cea - - default default] Security group rule updated ['20cbc49d-f7c3-4e2e-87e6-586884a8dc4b']
Dec 02 10:04:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:32Z|00127|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:32.827 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:33.329 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:33.330 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:33.331 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:04:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:33.401 281858 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764669858.400273, 268e09a3-7abe-4037-a14a-068e7b8a78fb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:04:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:33.401 281858 INFO nova.compute.manager [-] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] VM Stopped (Lifecycle Event)
Dec 02 10:04:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:33.434 281858 DEBUG nova.compute.manager [None req-55824f8c-c006-4d30-974b-4804d6b3b430 - - - - - -] [instance: 268e09a3-7abe-4037-a14a-068e7b8a78fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:04:34 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:34Z|00128|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:04:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:04:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:04:34 np0005541913.localdomain ceph-mon[298296]: pgmap v153: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 1.8 MiB/s wr, 64 op/s
Dec 02 10:04:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:04:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:04:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:04:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:04:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:04:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:34.072 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:35.119 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:35.905 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:04:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:04:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:04:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157930 "" "Go-http-client/1.1"
Dec 02 10:04:36 np0005541913.localdomain ceph-mon[298296]: pgmap v154: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 7.2 KiB/s rd, 12 KiB/s wr, 9 op/s
Dec 02 10:04:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:04:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19728 "" "Go-http-client/1.1"
Dec 02 10:04:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:36.406 2 INFO neutron.agent.securitygroups_rpc [req-7ec4157a-3973-4fe9-90a5-6b7e95187ed9 req-9adda286-3e5c-4f67-99d9-e6d6658a3dd8 1583e961fefc48749f39fdf4f81945c8 a0475908295e475d873fdbfd8cc82cea - - default default] Security group rule updated ['ec37aab1-8e3e-42dd-a42d-6454010a3bb1']
Dec 02 10:04:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:36.789 2 INFO neutron.agent.securitygroups_rpc [req-41dd90c2-f92d-4e4d-a9a2-5512726d06ed req-eda4abe0-dc4d-48d0-a211-5598e3a12357 1583e961fefc48749f39fdf4f81945c8 a0475908295e475d873fdbfd8cc82cea - - default default] Security group rule updated ['ec37aab1-8e3e-42dd-a42d-6454010a3bb1']
Dec 02 10:04:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:37.362 263406 INFO neutron.agent.linux.ip_lib [None req-e8709c5f-bad0-4cd8-a86a-bcb0776bda8c - - - - - -] Device tapc1f0bd46-6b cannot be used as it has no MAC address
Dec 02 10:04:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:37.377 281858 DEBUG nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Creating tmpfile /var/lib/nova/instances/tmpvcgqfy3k to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Dec 02 10:04:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:37.387 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:37 np0005541913.localdomain kernel: device tapc1f0bd46-6b entered promiscuous mode
Dec 02 10:04:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:37.395 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:37 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669877.3965] manager: (tapc1f0bd46-6b): new Generic device (/org/freedesktop/NetworkManager/Devices/25)
Dec 02 10:04:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:37Z|00129|binding|INFO|Claiming lport c1f0bd46-6bae-4902-9292-e19c6e88557a for this chassis.
Dec 02 10:04:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:37Z|00130|binding|INFO|c1f0bd46-6bae-4902-9292-e19c6e88557a: Claiming unknown
Dec 02 10:04:37 np0005541913.localdomain systemd-udevd[310976]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:04:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:37.404 281858 DEBUG nova.compute.manager [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvcgqfy3k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Dec 02 10:04:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:37.408 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e9e3da8770844ad5b5552298a24dcbd2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46d349b8-3ce0-4286-826a-479b1dd2a429, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c1f0bd46-6bae-4902-9292-e19c6e88557a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:37.410 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c1f0bd46-6bae-4902-9292-e19c6e88557a in datapath 26a036bb-7fc2-42d0-b324-4cf6bb77a9d6 bound to our chassis
Dec 02 10:04:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:37.412 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 26a036bb-7fc2-42d0-b324-4cf6bb77a9d6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:04:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:37.413 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6b913cd4-f142-4c5b-96e0-1360f53dd31f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:37 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc1f0bd46-6b: No such device
Dec 02 10:04:37 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc1f0bd46-6b: No such device
Dec 02 10:04:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:37Z|00131|binding|INFO|Setting lport c1f0bd46-6bae-4902-9292-e19c6e88557a ovn-installed in OVS
Dec 02 10:04:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:37Z|00132|binding|INFO|Setting lport c1f0bd46-6bae-4902-9292-e19c6e88557a up in Southbound
Dec 02 10:04:37 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc1f0bd46-6b: No such device
Dec 02 10:04:37 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc1f0bd46-6b: No such device
Dec 02 10:04:37 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc1f0bd46-6b: No such device
Dec 02 10:04:37 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc1f0bd46-6b: No such device
Dec 02 10:04:37 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc1f0bd46-6b: No such device
Dec 02 10:04:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:37.473 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:37 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc1f0bd46-6b: No such device
Dec 02 10:04:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:37.496 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:37.525 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:38 np0005541913.localdomain ceph-mon[298296]: pgmap v155: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 7.2 KiB/s rd, 12 KiB/s wr, 9 op/s
Dec 02 10:04:38 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:38.318 2 INFO neutron.agent.securitygroups_rpc [req-8e540b7a-da71-4acd-ab56-fd3bce480c0a req-799fda44-ad0c-42d1-806d-41b3bc34424c 1583e961fefc48749f39fdf4f81945c8 a0475908295e475d873fdbfd8cc82cea - - default default] Security group rule updated ['ec37aab1-8e3e-42dd-a42d-6454010a3bb1']
Dec 02 10:04:38 np0005541913.localdomain podman[311047]: 
Dec 02 10:04:38 np0005541913.localdomain podman[311047]: 2025-12-02 10:04:38.384830956 +0000 UTC m=+0.081765588 container create 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:04:38 np0005541913.localdomain systemd[1]: Started libpod-conmon-2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88.scope.
Dec 02 10:04:38 np0005541913.localdomain podman[311047]: 2025-12-02 10:04:38.338710273 +0000 UTC m=+0.035644905 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:04:38 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:04:38 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/909ccbf636b56e5fcb70f402308fe6a02f149a317eaed6dc848cd26938534901/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:04:38 np0005541913.localdomain podman[311047]: 2025-12-02 10:04:38.467374995 +0000 UTC m=+0.164309597 container init 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:04:38 np0005541913.localdomain podman[311047]: 2025-12-02 10:04:38.481688978 +0000 UTC m=+0.178623570 container start 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 10:04:38 np0005541913.localdomain dnsmasq[311070]: started, version 2.85 cachesize 150
Dec 02 10:04:38 np0005541913.localdomain dnsmasq[311070]: DNS service limited to local subnets
Dec 02 10:04:38 np0005541913.localdomain dnsmasq[311070]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:04:38 np0005541913.localdomain dnsmasq[311070]: warning: no upstream servers configured
Dec 02 10:04:38 np0005541913.localdomain dnsmasq-dhcp[311070]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:04:38 np0005541913.localdomain dnsmasq[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/addn_hosts - 0 addresses
Dec 02 10:04:38 np0005541913.localdomain dnsmasq-dhcp[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/host
Dec 02 10:04:38 np0005541913.localdomain dnsmasq-dhcp[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/opts
Dec 02 10:04:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:38.518 263406 INFO neutron.agent.linux.ip_lib [None req-032822b7-5695-4a23-85cb-89838df6da4a - - - - - -] Device tapbd990115-99 cannot be used as it has no MAC address
Dec 02 10:04:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:38.577 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:38 np0005541913.localdomain kernel: device tapbd990115-99 entered promiscuous mode
Dec 02 10:04:38 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669878.5818] manager: (tapbd990115-99): new Generic device (/org/freedesktop/NetworkManager/Devices/26)
Dec 02 10:04:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:38Z|00133|binding|INFO|Claiming lport bd990115-9909-4e4e-a861-f26c2f53a28c for this chassis.
Dec 02 10:04:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:38Z|00134|binding|INFO|bd990115-9909-4e4e-a861-f26c2f53a28c: Claiming unknown
Dec 02 10:04:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:38.584 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:38Z|00135|binding|INFO|Setting lport bd990115-9909-4e4e-a861-f26c2f53a28c ovn-installed in OVS
Dec 02 10:04:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:38.621 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:38.654 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:38.677 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:38Z|00136|binding|INFO|Setting lport bd990115-9909-4e4e-a861-f26c2f53a28c up in Southbound
Dec 02 10:04:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:38.882 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-45d02cf1-f511-4416-b7c1-b37c417f16f9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45d02cf1-f511-4416-b7c1-b37c417f16f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50df25ee29424615807a458690cdf8d7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b257864-5151-448f-941d-2c9a748f5881, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=bd990115-9909-4e4e-a861-f26c2f53a28c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:38.885 160221 INFO neutron.agent.ovn.metadata.agent [-] Port bd990115-9909-4e4e-a861-f26c2f53a28c in datapath 45d02cf1-f511-4416-b7c1-b37c417f16f9 bound to our chassis
Dec 02 10:04:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:38.888 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port c851ffbc-ac95-4b63-ad5b-c219b2577bdd IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:04:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:38.889 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45d02cf1-f511-4416-b7c1-b37c417f16f9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:04:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:38.890 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[f39ceb36-aaf2-4a53-b2b1-1adb87cfa4af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:39.074 281858 DEBUG nova.compute.manager [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvcgqfy3k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='82e23ec3-1d57-4166-9ba0-839ded943a78',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Dec 02 10:04:39 np0005541913.localdomain systemd[1]: tmp-crun.2uEwZK.mount: Deactivated successfully.
Dec 02 10:04:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:39.689 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Acquiring lock "refresh_cache-82e23ec3-1d57-4166-9ba0-839ded943a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:04:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:39.689 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Acquired lock "refresh_cache-82e23ec3-1d57-4166-9ba0-839ded943a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:04:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:39.690 281858 DEBUG nova.network.neutron [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 02 10:04:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:39.707 263406 INFO neutron.agent.dhcp.agent [None req-62354099-c5ec-4d6a-906d-f9f9ff98d970 - - - - - -] DHCP configuration for ports {'3f99beb7-5057-4f25-a68f-132a387d4a7b'} is completed
Dec 02 10:04:40 np0005541913.localdomain ceph-mon[298296]: pgmap v156: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.168 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:40 np0005541913.localdomain podman[311130]: 
Dec 02 10:04:40 np0005541913.localdomain podman[311130]: 2025-12-02 10:04:40.191557211 +0000 UTC m=+0.127674827 container create 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:04:40 np0005541913.localdomain podman[311130]: 2025-12-02 10:04:40.110340178 +0000 UTC m=+0.046457864 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:04:40 np0005541913.localdomain systemd[1]: Started libpod-conmon-5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d.scope.
Dec 02 10:04:40 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:04:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94147dbae9838956e714723c867733a25e47b3b6162526a89da5f485c251bb56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:04:40 np0005541913.localdomain podman[311130]: 2025-12-02 10:04:40.271426618 +0000 UTC m=+0.207544234 container init 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:04:40 np0005541913.localdomain podman[311130]: 2025-12-02 10:04:40.281396454 +0000 UTC m=+0.217514090 container start 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:04:40 np0005541913.localdomain dnsmasq[311147]: started, version 2.85 cachesize 150
Dec 02 10:04:40 np0005541913.localdomain dnsmasq[311147]: DNS service limited to local subnets
Dec 02 10:04:40 np0005541913.localdomain dnsmasq[311147]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:04:40 np0005541913.localdomain dnsmasq[311147]: warning: no upstream servers configured
Dec 02 10:04:40 np0005541913.localdomain dnsmasq-dhcp[311147]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:04:40 np0005541913.localdomain dnsmasq[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/addn_hosts - 0 addresses
Dec 02 10:04:40 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/host
Dec 02 10:04:40 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/opts
Dec 02 10:04:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:40.332 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:40 np0005541913.localdomain systemd[1]: tmp-crun.ScqZBq.mount: Deactivated successfully.
Dec 02 10:04:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:40.479 263406 INFO neutron.agent.dhcp.agent [None req-d175de16-5bdd-4b06-bb3b-a8e8d0ce6b90 - - - - - -] DHCP configuration for ports {'0999b431-c362-4180-a7a9-8664fe007369'} is completed
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.746 281858 DEBUG nova.network.neutron [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Updating instance_info_cache with network_info: [{"id": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "address": "fa:16:3e:bb:b6:1c", "network": {"id": "13bbad22-ab61-4b1f-849e-c651aa8f3297", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1859087569-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "d048f19ff5fc47dc88162ef5f9cebe8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54433c73-7e", "ovs_interfaceid": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.766 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Releasing lock "refresh_cache-82e23ec3-1d57-4166-9ba0-839ded943a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.768 281858 DEBUG nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvcgqfy3k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='82e23ec3-1d57-4166-9ba0-839ded943a78',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.769 281858 DEBUG nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Creating instance directory: /var/lib/nova/instances/82e23ec3-1d57-4166-9ba0-839ded943a78 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.770 281858 DEBUG nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Ensure instance console log exists: /var/lib/nova/instances/82e23ec3-1d57-4166-9ba0-839ded943a78/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.771 281858 DEBUG nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.772 281858 DEBUG nova.virt.libvirt.vif [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-02T10:04:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-39688497',display_name='tempest-LiveMigrationTest-server-39688497',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005541914.localdomain',hostname='tempest-livemigrationtest-server-39688497',id=8,image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T10:04:33Z,launched_on='np0005541914.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005541914.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d048f19ff5fc47dc88162ef5f9cebe8b',ramdisk_id='',reservation_id='r-lnn0by93',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1345186206',owner_user_name='tempest-LiveMigrationTest-1345186206-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T10:04:33Z,user_data=None,user_id='ec20a6cceee246d6b46878df263d30a4',uuid=82e23ec3-1d57-4166-9ba0-839ded943a78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "address": "fa:16:3e:bb:b6:1c", "network": {"id": "13bbad22-ab61-4b1f-849e-c651aa8f3297", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1859087569-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "d048f19ff5fc47dc88162ef5f9cebe8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap54433c73-7e", "ovs_interfaceid": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.773 281858 DEBUG nova.network.os_vif_util [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Converting VIF {"id": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "address": "fa:16:3e:bb:b6:1c", "network": {"id": "13bbad22-ab61-4b1f-849e-c651aa8f3297", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1859087569-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "d048f19ff5fc47dc88162ef5f9cebe8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap54433c73-7e", "ovs_interfaceid": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.774 281858 DEBUG nova.network.os_vif_util [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:b6:1c,bridge_name='br-int',has_traffic_filtering=True,id=54433c73-7e5c-481c-b64c-19e9cfd6e56f,network=Network(13bbad22-ab61-4b1f-849e-c651aa8f3297),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap54433c73-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.775 281858 DEBUG os_vif [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:b6:1c,bridge_name='br-int',has_traffic_filtering=True,id=54433c73-7e5c-481c-b64c-19e9cfd6e56f,network=Network(13bbad22-ab61-4b1f-849e-c651aa8f3297),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap54433c73-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.776 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.776 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.777 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.781 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.782 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54433c73-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.783 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap54433c73-7e, col_values=(('external_ids', {'iface-id': '54433c73-7e5c-481c-b64c-19e9cfd6e56f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:b6:1c', 'vm-uuid': '82e23ec3-1d57-4166-9ba0-839ded943a78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.785 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.788 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.790 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.792 281858 INFO os_vif [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:b6:1c,bridge_name='br-int',has_traffic_filtering=True,id=54433c73-7e5c-481c-b64c-19e9cfd6e56f,network=Network(13bbad22-ab61-4b1f-849e-c651aa8f3297),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap54433c73-7e')
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.793 281858 DEBUG nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Dec 02 10:04:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:40.793 281858 DEBUG nova.compute.manager [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvcgqfy3k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='82e23ec3-1d57-4166-9ba0-839ded943a78',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Dec 02 10:04:40 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:41 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3361271791' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:41.766 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:42 np0005541913.localdomain ceph-mon[298296]: pgmap v157: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 63 op/s
Dec 02 10:04:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:04:42 np0005541913.localdomain podman[311150]: 2025-12-02 10:04:42.453056372 +0000 UTC m=+0.087300556 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:04:42 np0005541913.localdomain podman[311150]: 2025-12-02 10:04:42.468101005 +0000 UTC m=+0.102345209 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 02 10:04:42 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:04:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:43.432 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:44 np0005541913.localdomain ceph-mon[298296]: pgmap v158: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 63 op/s
Dec 02 10:04:44 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:44.397 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:04:43Z, description=, device_id=279e244d-14ba-4911-a425-d38d92768269, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a7afd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908976be0>], id=55fb1997-25fe-4011-9820-773c0aa66e3d, ip_allocation=immediate, mac_address=fa:16:3e:b4:9c:02, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:04:35Z, description=, dns_domain=, id=45d02cf1-f511-4416-b7c1-b37c417f16f9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1627103925-network, port_security_enabled=True, project_id=50df25ee29424615807a458690cdf8d7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33331, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=681, status=ACTIVE, subnets=['34aa8025-e49d-4c09-aefd-41c4d8900224'], tags=[], tenant_id=50df25ee29424615807a458690cdf8d7, updated_at=2025-12-02T10:04:36Z, vlan_transparent=None, network_id=45d02cf1-f511-4416-b7c1-b37c417f16f9, port_security_enabled=False, project_id=50df25ee29424615807a458690cdf8d7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=710, status=DOWN, tags=[], tenant_id=50df25ee29424615807a458690cdf8d7, updated_at=2025-12-02T10:04:43Z on network 45d02cf1-f511-4416-b7c1-b37c417f16f9
Dec 02 10:04:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:44.783 281858 DEBUG nova.network.neutron [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Port 54433c73-7e5c-481c-b64c-19e9cfd6e56f updated with migration profile {'migrating_to': 'np0005541913.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Dec 02 10:04:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:44.785 281858 DEBUG nova.compute.manager [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvcgqfy3k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='82e23ec3-1d57-4166-9ba0-839ded943a78',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Dec 02 10:04:44 np0005541913.localdomain dnsmasq[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/addn_hosts - 1 addresses
Dec 02 10:04:44 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/host
Dec 02 10:04:44 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/opts
Dec 02 10:04:44 np0005541913.localdomain podman[311187]: 2025-12-02 10:04:44.89982017 +0000 UTC m=+0.063583183 container kill 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:04:45 np0005541913.localdomain sshd[311208]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:04:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:45.202 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:45 np0005541913.localdomain sshd[311208]: Accepted publickey for nova from 172.17.0.108 port 58174 ssh2: ECDSA SHA256:F9h/iD/7DLBkMy7oU5JeQ80dnSC7auKWKHT/OdSB0Bo
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: Created slice User Slice of UID 42436.
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec 02 10:04:45 np0005541913.localdomain systemd-logind[757]: New session 73 of user nova.
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: Starting User Manager for UID 42436...
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by (uid=0)
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Queued start job for default target Main User Target.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Created slice User Application Slice.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Reached target Paths.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Reached target Timers.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Starting D-Bus User Message Bus Socket...
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Starting Create User's Volatile Files and Directories...
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Listening on D-Bus User Message Bus Socket.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Reached target Sockets.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Finished Create User's Volatile Files and Directories.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Reached target Basic System.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Reached target Main User Target.
Dec 02 10:04:45 np0005541913.localdomain systemd[311212]: Startup finished in 170ms.
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: Started User Manager for UID 42436.
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: Started Session 73 of User nova.
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:04:45 np0005541913.localdomain sshd[311208]: pam_unix(sshd:session): session opened for user nova(uid=42436) by (uid=0)
Dec 02 10:04:45 np0005541913.localdomain podman[311228]: 2025-12-02 10:04:45.645648442 +0000 UTC m=+0.071275078 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:04:45 np0005541913.localdomain podman[311228]: 2025-12-02 10:04:45.682029685 +0000 UTC m=+0.107656311 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:04:45 np0005541913.localdomain kernel: device tap54433c73-7e entered promiscuous mode
Dec 02 10:04:45 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669885.7233] manager: (tap54433c73-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Dec 02 10:04:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:45.723 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:45 np0005541913.localdomain systemd-udevd[311259]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:04:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:45Z|00137|binding|INFO|Claiming lport 54433c73-7e5c-481c-b64c-19e9cfd6e56f for this additional chassis.
Dec 02 10:04:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:45Z|00138|binding|INFO|54433c73-7e5c-481c-b64c-19e9cfd6e56f: Claiming fa:16:3e:bb:b6:1c 10.100.0.13
Dec 02 10:04:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:45Z|00139|binding|INFO|Claiming lport ffcaba02-6808-4409-8458-941ca0af2e66 for this additional chassis.
Dec 02 10:04:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:45Z|00140|binding|INFO|ffcaba02-6808-4409-8458-941ca0af2e66: Claiming fa:16:3e:a7:75:fd 19.80.0.43
Dec 02 10:04:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:45.729 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:45.731 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:45 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669885.7428] device (tap54433c73-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 02 10:04:45 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669885.7438] device (tap54433c73-7e): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 02 10:04:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:45Z|00141|binding|INFO|Setting lport 54433c73-7e5c-481c-b64c-19e9cfd6e56f ovn-installed in OVS
Dec 02 10:04:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:45.749 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:45 np0005541913.localdomain systemd-machined[84262]: New machine qemu-5-instance-00000008.
Dec 02 10:04:45 np0005541913.localdomain systemd[1]: Started Virtual Machine qemu-5-instance-00000008.
Dec 02 10:04:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:45.784 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:45.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:04:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:46.045 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764669886.0446723, 82e23ec3-1d57-4166-9ba0-839ded943a78 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:04:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:46.046 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] VM Started (Lifecycle Event)
Dec 02 10:04:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:46.064 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:04:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:46.206 263406 INFO neutron.agent.dhcp.agent [None req-1c008398-2d3a-4e17-910d-4f3fbd976cf7 - - - - - -] DHCP configuration for ports {'55fb1997-25fe-4011-9820-773c0aa66e3d'} is completed
Dec 02 10:04:46 np0005541913.localdomain ceph-mon[298296]: pgmap v159: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 63 op/s
Dec 02 10:04:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:46.858 281858 DEBUG nova.virt.driver [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] Emitting event <LifecycleEvent: 1764669886.8579433, 82e23ec3-1d57-4166-9ba0-839ded943a78 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:04:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:46.859 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] VM Resumed (Lifecycle Event)
Dec 02 10:04:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:46.880 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:04:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:46.886 281858 DEBUG nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 02 10:04:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:46.907 281858 INFO nova.compute.manager [None req-a2e5fb31-f32c-40ed-828f-a970ba3add1b - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] During the sync_power process the instance has moved from host np0005541914.localdomain to host np0005541913.localdomain
Dec 02 10:04:47 np0005541913.localdomain sshd[311229]: Received disconnect from 172.17.0.108 port 58174:11: disconnected by user
Dec 02 10:04:47 np0005541913.localdomain sshd[311229]: Disconnected from user nova 172.17.0.108 port 58174
Dec 02 10:04:47 np0005541913.localdomain sshd[311208]: pam_unix(sshd:session): session closed for user nova
Dec 02 10:04:47 np0005541913.localdomain systemd[1]: session-73.scope: Deactivated successfully.
Dec 02 10:04:47 np0005541913.localdomain systemd-logind[757]: Session 73 logged out. Waiting for processes to exit.
Dec 02 10:04:47 np0005541913.localdomain systemd-logind[757]: Removed session 73.
Dec 02 10:04:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:47.851 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:04:46Z, description=, device_id=11e16c5e-46e1-4a00-8cde-eb7c634beb6e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089d2970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089d2850>], id=f642efd7-a23a-4ea5-ac71-0a9b43d62652, ip_allocation=immediate, mac_address=fa:16:3e:01:87:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:04:34Z, description=, dns_domain=, id=26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-1774083162-network, port_security_enabled=True, project_id=e9e3da8770844ad5b5552298a24dcbd2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50867, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=673, status=ACTIVE, subnets=['1fd9a2bb-1a18-4b88-9f27-6b97d2310288'], tags=[], tenant_id=e9e3da8770844ad5b5552298a24dcbd2, updated_at=2025-12-02T10:04:35Z, vlan_transparent=None, network_id=26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, port_security_enabled=False, project_id=e9e3da8770844ad5b5552298a24dcbd2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=721, status=DOWN, tags=[], tenant_id=e9e3da8770844ad5b5552298a24dcbd2, updated_at=2025-12-02T10:04:47Z on network 26a036bb-7fc2-42d0-b324-4cf6bb77a9d6
Dec 02 10:04:48 np0005541913.localdomain podman[311332]: 2025-12-02 10:04:48.103431925 +0000 UTC m=+0.081995115 container kill 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:04:48 np0005541913.localdomain dnsmasq[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/addn_hosts - 1 addresses
Dec 02 10:04:48 np0005541913.localdomain dnsmasq-dhcp[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/host
Dec 02 10:04:48 np0005541913.localdomain dnsmasq-dhcp[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/opts
Dec 02 10:04:48 np0005541913.localdomain systemd[1]: tmp-crun.DHAJxW.mount: Deactivated successfully.
Dec 02 10:04:48 np0005541913.localdomain ceph-mon[298296]: pgmap v160: 177 pgs: 177 active+clean; 192 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 63 op/s
Dec 02 10:04:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:04:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:04:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:48.352 263406 INFO neutron.agent.dhcp.agent [None req-64949150-9071-4bf4-90a9-dfc2d0ee4fb9 - - - - - -] DHCP configuration for ports {'f642efd7-a23a-4ea5-ac71-0a9b43d62652'} is completed
Dec 02 10:04:48 np0005541913.localdomain podman[311354]: 2025-12-02 10:04:48.453714846 +0000 UTC m=+0.092671260 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:04:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:48Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bb:b6:1c 10.100.0.13
Dec 02 10:04:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:48Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bb:b6:1c 10.100.0.13
Dec 02 10:04:48 np0005541913.localdomain podman[311353]: 2025-12-02 10:04:48.499293226 +0000 UTC m=+0.137858380 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Dec 02 10:04:48 np0005541913.localdomain podman[311353]: 2025-12-02 10:04:48.508539103 +0000 UTC m=+0.147104267 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, 
io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Dec 02 10:04:48 np0005541913.localdomain podman[311354]: 2025-12-02 10:04:48.518750056 +0000 UTC m=+0.157706450 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:04:48 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:04:48 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:04:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 10:04:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:49Z|00142|binding|INFO|Claiming lport 54433c73-7e5c-481c-b64c-19e9cfd6e56f for this chassis.
Dec 02 10:04:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:49Z|00143|binding|INFO|54433c73-7e5c-481c-b64c-19e9cfd6e56f: Claiming fa:16:3e:bb:b6:1c 10.100.0.13
Dec 02 10:04:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:49Z|00144|binding|INFO|Claiming lport ffcaba02-6808-4409-8458-941ca0af2e66 for this chassis.
Dec 02 10:04:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:49Z|00145|binding|INFO|ffcaba02-6808-4409-8458-941ca0af2e66: Claiming fa:16:3e:a7:75:fd 19.80.0.43
Dec 02 10:04:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:49Z|00146|binding|INFO|Setting lport 54433c73-7e5c-481c-b64c-19e9cfd6e56f up in Southbound
Dec 02 10:04:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:49Z|00147|binding|INFO|Setting lport ffcaba02-6808-4409-8458-941ca0af2e66 up in Southbound
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.739 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:75:fd 19.80.0.43'], port_security=['fa:16:3e:a7:75:fd 19.80.0.43'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['54433c73-7e5c-481c-b64c-19e9cfd6e56f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1664568330', 'neutron:cidrs': '19.80.0.43/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c40d86e4-7101-443b-abce-328f7d1ea40e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1664568330', 'neutron:project_id': 'd048f19ff5fc47dc88162ef5f9cebe8b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '576d6513-029b-4880-bb0b-58094b586b90', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=e1e893da-07af-44e3-945f-c862571583e8, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ffcaba02-6808-4409-8458-941ca0af2e66) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.742 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:b6:1c 10.100.0.13'], port_security=['fa:16:3e:bb:b6:1c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-146896978', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '82e23ec3-1d57-4166-9ba0-839ded943a78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-146896978', 'neutron:project_id': 'd048f19ff5fc47dc88162ef5f9cebe8b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '576d6513-029b-4880-bb0b-58094b586b90', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541914.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51e42abf-8647-4013-9c62-778191c64ad0, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=54433c73-7e5c-481c-b64c-19e9cfd6e56f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.743 160221 INFO neutron.agent.ovn.metadata.agent [-] Port ffcaba02-6808-4409-8458-941ca0af2e66 in datapath c40d86e4-7101-443b-abce-328f7d1ea40e bound to our chassis
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.747 160221 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c40d86e4-7101-443b-abce-328f7d1ea40e
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.756 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[cf9e4f61-5953-496e-b2bc-69d651fffddb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.758 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc40d86e4-71 in ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.760 160340 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc40d86e4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.761 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[47f4e6d7-64ee-446f-a921-e9c9f65c3cd0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.762 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe7276a-0194-4453-871e-e584bf8eb253]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.773 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[18fe22a5-dc6a-4a16-b7cc-e80af19343b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:49.779 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:04:43Z, description=, device_id=279e244d-14ba-4911-a425-d38d92768269, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908999d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908999c10>], id=55fb1997-25fe-4011-9820-773c0aa66e3d, ip_allocation=immediate, mac_address=fa:16:3e:b4:9c:02, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:04:35Z, description=, dns_domain=, id=45d02cf1-f511-4416-b7c1-b37c417f16f9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1627103925-network, port_security_enabled=True, project_id=50df25ee29424615807a458690cdf8d7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33331, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=681, status=ACTIVE, subnets=['34aa8025-e49d-4c09-aefd-41c4d8900224'], tags=[], tenant_id=50df25ee29424615807a458690cdf8d7, updated_at=2025-12-02T10:04:36Z, vlan_transparent=None, network_id=45d02cf1-f511-4416-b7c1-b37c417f16f9, port_security_enabled=False, project_id=50df25ee29424615807a458690cdf8d7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=710, status=DOWN, tags=[], tenant_id=50df25ee29424615807a458690cdf8d7, updated_at=2025-12-02T10:04:43Z on network 45d02cf1-f511-4416-b7c1-b37c417f16f9
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.785 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[aacd252b-b49d-430f-bc18-909617f5e161]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.810 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3a1fc7-3960-4e7f-ae2a-358d69c45824]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669889.8182] manager: (tapc40d86e4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/28)
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.818 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[1379c99e-b2d5-4fa9-aaaf-09d94553df35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain systemd-udevd[311405]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.845 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[643cb8a5-57a3-4961-8adc-17810e9a2df1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.850 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[9adabb22-1535-4bd0-8756-329e8106fe19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapc40d86e4-71: link becomes ready
Dec 02 10:04:49 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapc40d86e4-70: link becomes ready
Dec 02 10:04:49 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669889.8707] device (tapc40d86e4-70): carrier: link connected
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.874 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8c22a0-3fa5-4691-bcfd-53daef2b0eb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.891 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[e888d90a-9785-42a2-a7e0-6f686e04c689]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc40d86e4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:0f:45:7f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1205201, 'reachable_time': 40862, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311439, 'error': None, 'target': 'ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.904 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[b1386d08-322d-4282-83c8-7dee191f20b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:457f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1205201, 'tstamp': 1205201}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 311441, 'error': None, 'target': 'ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.918 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[53b4a524-30f7-4562-a7cc-6c19094f1fd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc40d86e4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:0f:45:7f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1205201, 'reachable_time': 40862, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 311442, 'error': None, 'target': 'ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.938 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6daf87-6396-46b0-9b00-59d3270a791f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain dnsmasq[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/addn_hosts - 1 addresses
Dec 02 10:04:49 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/host
Dec 02 10:04:49 np0005541913.localdomain podman[311444]: 2025-12-02 10:04:49.971781309 +0000 UTC m=+0.049037613 container kill 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:04:49 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/opts
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.986 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[c8879fbe-0d6d-4de5-b797-36c69f762859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.988 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc40d86e4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.988 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.989 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc40d86e4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:49.990 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:49 np0005541913.localdomain kernel: device tapc40d86e4-70 entered promiscuous mode
Dec 02 10:04:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:49.994 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:49.998 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc40d86e4-70, col_values=(('external_ids', {'iface-id': '60398627-924e-4353-b9ee-b86c24b6fc87'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:49.999 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:50Z|00148|binding|INFO|Releasing lport 60398627-924e-4353-b9ee-b86c24b6fc87 from this chassis (sb_readonly=0)
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.000 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.003 160221 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c40d86e4-7101-443b-abce-328f7d1ea40e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c40d86e4-7101-443b-abce-328f7d1ea40e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.004 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5b1bb26c-8dff-4254-8208-b199dbf59c14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.005 160221 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: global
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     log         /dev/log local0 debug
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     log-tag     haproxy-metadata-proxy-c40d86e4-7101-443b-abce-328f7d1ea40e
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     user        root
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     group       root
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     maxconn     1024
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     pidfile     /var/lib/neutron/external/pids/c40d86e4-7101-443b-abce-328f7d1ea40e.pid.haproxy
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     daemon
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: defaults
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     log global
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     mode http
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     option httplog
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     option dontlognull
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     option http-server-close
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     option forwardfor
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     retries                 3
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-request    30s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout connect         30s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout client          32s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout server          32s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-keep-alive 30s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: listen listener
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     bind 169.254.169.254:80
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     http-request add-header X-OVN-Network-ID c40d86e4-7101-443b-abce-328f7d1ea40e
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.005 160221 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e', 'env', 'PROCESS_TAG=haproxy-c40d86e4-7101-443b-abce-328f7d1ea40e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c40d86e4-7101-443b-abce-328f7d1ea40e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.007 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.241 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:50.248 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-cf58f353-04b9-463a-832f-2ee6517a222b req-e9021766-f952-4fcb-9d58-29ffe2b82e7c 4ea94a3d730c499a8a661131692645ce 497073c2347a4b2dbbf501873318fbd3 - - default default] This port is not SRIOV, skip binding for port 54433c73-7e5c-481c-b64c-19e9cfd6e56f.
Dec 02 10:04:50 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:50.256 263406 INFO neutron.agent.dhcp.agent [None req-501d446f-4e06-4469-8098-de6632e7f437 - - - - - -] DHCP configuration for ports {'55fb1997-25fe-4011-9820-773c0aa66e3d'} is completed
Dec 02 10:04:50 np0005541913.localdomain ceph-mon[298296]: pgmap v161: 177 pgs: 177 active+clean; 218 MiB data, 874 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 123 op/s
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.407 281858 INFO nova.compute.manager [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Post operation of migration started
Dec 02 10:04:50 np0005541913.localdomain podman[311498]: 
Dec 02 10:04:50 np0005541913.localdomain podman[311498]: 2025-12-02 10:04:50.425845626 +0000 UTC m=+0.067901967 container create e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:04:50 np0005541913.localdomain systemd[1]: Started libpod-conmon-e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980.scope.
Dec 02 10:04:50 np0005541913.localdomain systemd[1]: tmp-crun.1dhSF6.mount: Deactivated successfully.
Dec 02 10:04:50 np0005541913.localdomain podman[311498]: 2025-12-02 10:04:50.391320003 +0000 UTC m=+0.033376364 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 10:04:50 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:04:50 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f06f0939af8ced9e01822fd15f35fbfde05ec9e41ca9e0ac345284976c2f364/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:04:50 np0005541913.localdomain podman[311498]: 2025-12-02 10:04:50.512996307 +0000 UTC m=+0.155052698 container init e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:04:50 np0005541913.localdomain podman[311498]: 2025-12-02 10:04:50.52092434 +0000 UTC m=+0.162980701 container start e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:04:50 np0005541913.localdomain neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e[311512]: [NOTICE]   (311516) : New worker (311518) forked
Dec 02 10:04:50 np0005541913.localdomain neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e[311512]: [NOTICE]   (311516) : Loading success.
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.579 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 54433c73-7e5c-481c-b64c-19e9cfd6e56f in datapath 13bbad22-ab61-4b1f-849e-c651aa8f3297 unbound from our chassis
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.581 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 53bdfc6a-79b0-43cf-92a6-99b85b988b28 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.582 160221 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 13bbad22-ab61-4b1f-849e-c651aa8f3297
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.590 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[bdefa7a7-ba4b-461b-89f1-ef48d5c15d77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.591 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap13bbad22-a1 in ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.593 160340 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap13bbad22-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.593 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[206ad9a1-00d6-416e-b8ab-6bef385097a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.595 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[50e685fa-cc8b-472a-ab7e-7da0f135fcea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.603 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[fb245f14-7b26-40e1-bc84-2186b79e805e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.615 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a787f23b-23d8-4e60-bf12-b07c5b325020]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.641 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[b12331eb-8db0-443a-90b2-a3ce42979dfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.646 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a5bee95c-ba45-4d4f-b4b7-88248dba446c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669890.6475] manager: (tap13bbad22-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Dec 02 10:04:50 np0005541913.localdomain systemd-udevd[311411]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.675 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[b99f6f6a-dbc1-45bc-b22f-cefe027be62e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.679 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[99ea0efb-c159-4795-a874-0275d8b5c257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap13bbad22-a0: link becomes ready
Dec 02 10:04:50 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669890.6980] device (tap13bbad22-a0): carrier: link connected
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.701 160351 DEBUG oslo.privsep.daemon [-] privsep: reply[379ab8be-4174-4829-928a-a110024008f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.719 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[be6b7b7a-3c0f-4378-8c9f-99fcc34d8496]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13bbad22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:0f:43:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1205284, 'reachable_time': 36060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311537, 'error': None, 'target': 'ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.724 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.737 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[378586dd-80ac-48be-8f0a-47abbaa51adc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:4317'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1205284, 'tstamp': 1205284}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 311538, 'error': None, 'target': 'ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.746 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Acquiring lock "refresh_cache-82e23ec3-1d57-4166-9ba0-839ded943a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.746 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Acquired lock "refresh_cache-82e23ec3-1d57-4166-9ba0-839ded943a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.747 281858 DEBUG nova.network.neutron [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.762 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[403893e8-df9a-42fc-a410-11990bb683bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13bbad22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:0f:43:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1205284, 'reachable_time': 36060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 311539, 'error': None, 'target': 'ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.786 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.800 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[f99faaf0-b668-46bb-80b3-371c92cdd1cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.858 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5a94fc09-c4bb-4c77-922d-1b1fccdff6e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.860 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13bbad22-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.860 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.861 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13bbad22-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.864 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain kernel: device tap13bbad22-a0 entered promiscuous mode
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.868 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap13bbad22-a0, col_values=(('external_ids', {'iface-id': '202be55f-4a2f-4e8a-884e-d4a72a4d525d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:50 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:50Z|00149|binding|INFO|Releasing lport 202be55f-4a2f-4e8a-884e-d4a72a4d525d from this chassis (sb_readonly=0)
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.870 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:50.881 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.882 160221 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/13bbad22-ab61-4b1f-849e-c651aa8f3297.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/13bbad22-ab61-4b1f-849e-c651aa8f3297.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.883 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[263550af-2e8c-4221-aa56-10e1bd135aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.884 160221 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: global
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     log         /dev/log local0 debug
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     log-tag     haproxy-metadata-proxy-13bbad22-ab61-4b1f-849e-c651aa8f3297
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     user        root
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     group       root
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     maxconn     1024
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     pidfile     /var/lib/neutron/external/pids/13bbad22-ab61-4b1f-849e-c651aa8f3297.pid.haproxy
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     daemon
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: defaults
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     log global
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     mode http
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     option httplog
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     option dontlognull
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     option http-server-close
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     option forwardfor
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     retries                 3
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-request    30s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout connect         30s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout client          32s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout server          32s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     timeout http-keep-alive 30s
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: listen listener
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     bind 169.254.169.254:80
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:     http-request add-header X-OVN-Network-ID 13bbad22-ab61-4b1f-849e-c651aa8f3297
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 02 10:04:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:50.884 160221 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'env', 'PROCESS_TAG=haproxy-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/13bbad22-ab61-4b1f-849e-c651aa8f3297.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 02 10:04:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:51.291 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:04:46Z, description=, device_id=11e16c5e-46e1-4a00-8cde-eb7c634beb6e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908999550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089d0160>], id=f642efd7-a23a-4ea5-ac71-0a9b43d62652, ip_allocation=immediate, mac_address=fa:16:3e:01:87:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:04:34Z, description=, dns_domain=, id=26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-1774083162-network, port_security_enabled=True, project_id=e9e3da8770844ad5b5552298a24dcbd2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50867, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=673, status=ACTIVE, subnets=['1fd9a2bb-1a18-4b88-9f27-6b97d2310288'], tags=[], tenant_id=e9e3da8770844ad5b5552298a24dcbd2, updated_at=2025-12-02T10:04:35Z, vlan_transparent=None, network_id=26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, port_security_enabled=False, project_id=e9e3da8770844ad5b5552298a24dcbd2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=721, status=DOWN, tags=[], tenant_id=e9e3da8770844ad5b5552298a24dcbd2, updated_at=2025-12-02T10:04:47Z on network 26a036bb-7fc2-42d0-b324-4cf6bb77a9d6
Dec 02 10:04:51 np0005541913.localdomain podman[311571]: 
Dec 02 10:04:51 np0005541913.localdomain podman[311571]: 2025-12-02 10:04:51.395174648 +0000 UTC m=+0.077381960 container create aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:04:51 np0005541913.localdomain systemd[1]: Started libpod-conmon-aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5.scope.
Dec 02 10:04:51 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:04:51 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/717f1ef704ec9c5b6d7c4f85d43274eb21a1cf80205ee2cbc3615610a19b18d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:04:51 np0005541913.localdomain podman[311571]: 2025-12-02 10:04:51.459271863 +0000 UTC m=+0.141479175 container init aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 10:04:51 np0005541913.localdomain podman[311571]: 2025-12-02 10:04:51.361931479 +0000 UTC m=+0.044138811 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 02 10:04:51 np0005541913.localdomain systemd[1]: tmp-crun.hz32uu.mount: Deactivated successfully.
Dec 02 10:04:51 np0005541913.localdomain podman[311571]: 2025-12-02 10:04:51.476291878 +0000 UTC m=+0.158499190 container start aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:04:51 np0005541913.localdomain neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297[311600]: [NOTICE]   (311611) : New worker (311616) forked
Dec 02 10:04:51 np0005541913.localdomain neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297[311600]: [NOTICE]   (311611) : Loading success.
Dec 02 10:04:51 np0005541913.localdomain dnsmasq[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/addn_hosts - 1 addresses
Dec 02 10:04:51 np0005541913.localdomain dnsmasq-dhcp[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/host
Dec 02 10:04:51 np0005541913.localdomain podman[311604]: 2025-12-02 10:04:51.533315075 +0000 UTC m=+0.062577226 container kill 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:04:51 np0005541913.localdomain dnsmasq-dhcp[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/opts
Dec 02 10:04:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:51.704 263406 INFO neutron.agent.dhcp.agent [None req-26581c33-f744-4381-ace6-87b946c7b089 - - - - - -] DHCP configuration for ports {'f642efd7-a23a-4ea5-ac71-0a9b43d62652'} is completed
Dec 02 10:04:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:51.778 281858 DEBUG nova.network.neutron [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Updating instance_info_cache with network_info: [{"id": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "address": "fa:16:3e:bb:b6:1c", "network": {"id": "13bbad22-ab61-4b1f-849e-c651aa8f3297", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1859087569-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "d048f19ff5fc47dc88162ef5f9cebe8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54433c73-7e", "ovs_interfaceid": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:04:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:51.803 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Releasing lock "refresh_cache-82e23ec3-1d57-4166-9ba0-839ded943a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:04:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:51.820 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:51.821 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:51.821 281858 DEBUG oslo_concurrency.lockutils [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:51.828 281858 INFO nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 02 10:04:51 np0005541913.localdomain virtqemud[203664]: Domain id=5 name='instance-00000008' uuid=82e23ec3-1d57-4166-9ba0-839ded943a78 is tainted: custom-monitor
Dec 02 10:04:51 np0005541913.localdomain ceph-mon[298296]: pgmap v162: 177 pgs: 177 active+clean; 218 MiB data, 874 MiB used, 41 GiB / 42 GiB avail; 298 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Dec 02 10:04:51 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:51Z|00150|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:51 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:51Z|00151|binding|INFO|Releasing lport 202be55f-4a2f-4e8a-884e-d4a72a4d525d from this chassis (sb_readonly=0)
Dec 02 10:04:51 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:51Z|00152|binding|INFO|Releasing lport 60398627-924e-4353-b9ee-b86c24b6fc87 from this chassis (sb_readonly=0)
Dec 02 10:04:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:51.973 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:52.838 281858 INFO nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 02 10:04:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:53Z|00153|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:53Z|00154|binding|INFO|Releasing lport 202be55f-4a2f-4e8a-884e-d4a72a4d525d from this chassis (sb_readonly=0)
Dec 02 10:04:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:53Z|00155|binding|INFO|Releasing lport 60398627-924e-4353-b9ee-b86c24b6fc87 from this chassis (sb_readonly=0)
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.581 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.846 281858 INFO nova.virt.libvirt.driver [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.858 281858 DEBUG nova.compute.manager [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.880 281858 DEBUG nova.objects.instance [None req-cf58f353-04b9-463a-832f-2ee6517a222b 128dc0e572734d9083e5bf6378255d58 dc1edab5ae5d43f08b967b5bf594f8b5 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.925 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.925 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.926 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:04:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:53.926 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:54 np0005541913.localdomain ceph-mon[298296]: pgmap v163: 177 pgs: 177 active+clean; 225 MiB data, 877 MiB used, 41 GiB / 42 GiB avail; 330 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.495 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.531 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.531 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.532 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.532 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.533 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.554 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.555 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.555 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.555 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:04:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:54.556 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:04:54 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:54.636 2 INFO neutron.agent.securitygroups_rpc [None req-5252ab83-90b7-4c17-ab41-150a0f430946 955214da09cd44dba70e1a06eabc9023 50df25ee29424615807a458690cdf8d7 - - default default] Security group rule updated ['2e537c1e-d2f3-49fb-8c4c-0f6b2c3e354b']
Dec 02 10:04:54 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:04:54.923 2 INFO neutron.agent.securitygroups_rpc [None req-a8a8282d-6793-4a84-80fc-24e3966f9a17 955214da09cd44dba70e1a06eabc9023 50df25ee29424615807a458690cdf8d7 - - default default] Security group rule updated ['2e537c1e-d2f3-49fb-8c4c-0f6b2c3e354b']
Dec 02 10:04:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:04:54 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4293966053' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.005 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:04:55 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3831071445' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:55 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/4293966053' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.077 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.078 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.083 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.083 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.245 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.348 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.350 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11070MB free_disk=41.70097732543945GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.350 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.351 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.398 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Applying migration context for instance 82e23ec3-1d57-4166-9ba0-839ded943a78 as it has an incoming, in-progress migration f83e1b81-4647-4642-b7c4-b4f369bef051. Migration status is running _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.398 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.415 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.442 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.443 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance 82e23ec3-1d57-4166-9ba0-839ded943a78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.443 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.444 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1152MB phys_disk=41GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.504 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.600 281858 DEBUG nova.compute.manager [req-262747ee-d656-4f22-bc15-e5465af16acf req-c58fd4d7-a041-4bfd-82f2-9e88a31911ab dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Received event network-vif-plugged-54433c73-7e5c-481c-b64c-19e9cfd6e56f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.601 281858 DEBUG oslo_concurrency.lockutils [req-262747ee-d656-4f22-bc15-e5465af16acf req-c58fd4d7-a041-4bfd-82f2-9e88a31911ab dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Acquiring lock "82e23ec3-1d57-4166-9ba0-839ded943a78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.601 281858 DEBUG oslo_concurrency.lockutils [req-262747ee-d656-4f22-bc15-e5465af16acf req-c58fd4d7-a041-4bfd-82f2-9e88a31911ab dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "82e23ec3-1d57-4166-9ba0-839ded943a78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.601 281858 DEBUG oslo_concurrency.lockutils [req-262747ee-d656-4f22-bc15-e5465af16acf req-c58fd4d7-a041-4bfd-82f2-9e88a31911ab dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] Lock "82e23ec3-1d57-4166-9ba0-839ded943a78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.602 281858 DEBUG nova.compute.manager [req-262747ee-d656-4f22-bc15-e5465af16acf req-c58fd4d7-a041-4bfd-82f2-9e88a31911ab dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] No waiting events found dispatching network-vif-plugged-54433c73-7e5c-481c-b64c-19e9cfd6e56f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.602 281858 WARNING nova.compute.manager [req-262747ee-d656-4f22-bc15-e5465af16acf req-c58fd4d7-a041-4bfd-82f2-9e88a31911ab dafd7fe1ebe54740b64cc9f8b3667fc9 497073c2347a4b2dbbf501873318fbd3 - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Received unexpected event network-vif-plugged-54433c73-7e5c-481c-b64c-19e9cfd6e56f for instance with vm_state active and task_state deleting.
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.626 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Acquiring lock "82e23ec3-1d57-4166-9ba0-839ded943a78" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.626 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Lock "82e23ec3-1d57-4166-9ba0-839ded943a78" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.627 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Acquiring lock "82e23ec3-1d57-4166-9ba0-839ded943a78-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.627 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Lock "82e23ec3-1d57-4166-9ba0-839ded943a78-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.628 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Lock "82e23ec3-1d57-4166-9ba0-839ded943a78-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.629 281858 INFO nova.compute.manager [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Terminating instance
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.630 281858 DEBUG nova.compute.manager [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 02 10:04:55 np0005541913.localdomain podman[311679]: 2025-12-02 10:04:55.635881649 +0000 UTC m=+0.056728958 container kill 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:04:55 np0005541913.localdomain dnsmasq[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/addn_hosts - 0 addresses
Dec 02 10:04:55 np0005541913.localdomain dnsmasq-dhcp[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/host
Dec 02 10:04:55 np0005541913.localdomain dnsmasq-dhcp[311070]: read /var/lib/neutron/dhcp/26a036bb-7fc2-42d0-b324-4cf6bb77a9d6/opts
Dec 02 10:04:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:04:55 np0005541913.localdomain kernel: device tap54433c73-7e left promiscuous mode
Dec 02 10:04:55 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669895.6961] device (tap54433c73-7e): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00156|binding|INFO|Releasing lport 54433c73-7e5c-481c-b64c-19e9cfd6e56f from this chassis (sb_readonly=0)
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.706 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00157|binding|INFO|Setting lport 54433c73-7e5c-481c-b64c-19e9cfd6e56f down in Southbound
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00158|binding|INFO|Releasing lport ffcaba02-6808-4409-8458-941ca0af2e66 from this chassis (sb_readonly=0)
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00159|binding|INFO|Setting lport ffcaba02-6808-4409-8458-941ca0af2e66 down in Southbound
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00160|binding|INFO|Removing iface tap54433c73-7e ovn-installed in OVS
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.709 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00161|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00162|binding|INFO|Releasing lport 202be55f-4a2f-4e8a-884e-d4a72a4d525d from this chassis (sb_readonly=0)
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00163|binding|INFO|Releasing lport 60398627-924e-4353-b9ee-b86c24b6fc87 from this chassis (sb_readonly=0)
Dec 02 10:04:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:55.717 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:75:fd 19.80.0.43'], port_security=['fa:16:3e:a7:75:fd 19.80.0.43'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['54433c73-7e5c-481c-b64c-19e9cfd6e56f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1664568330', 'neutron:cidrs': '19.80.0.43/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c40d86e4-7101-443b-abce-328f7d1ea40e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1664568330', 'neutron:project_id': 'd048f19ff5fc47dc88162ef5f9cebe8b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '576d6513-029b-4880-bb0b-58094b586b90', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=e1e893da-07af-44e3-945f-c862571583e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ffcaba02-6808-4409-8458-941ca0af2e66) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:55.719 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:b6:1c 10.100.0.13'], port_security=['fa:16:3e:bb:b6:1c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-146896978', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '82e23ec3-1d57-4166-9ba0-839ded943a78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-146896978', 'neutron:project_id': 'd048f19ff5fc47dc88162ef5f9cebe8b', 'neutron:revision_number': '11', 'neutron:security_group_ids': '576d6513-029b-4880-bb0b-58094b586b90', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51e42abf-8647-4013-9c62-778191c64ad0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=54433c73-7e5c-481c-b64c-19e9cfd6e56f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:55.720 160221 INFO neutron.agent.ovn.metadata.agent [-] Port ffcaba02-6808-4409-8458-941ca0af2e66 in datapath c40d86e4-7101-443b-abce-328f7d1ea40e unbound from our chassis
Dec 02 10:04:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:55.728 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c40d86e4-7101-443b-abce-328f7d1ea40e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:04:55 np0005541913.localdomain systemd[1]: tmp-crun.NToL0U.mount: Deactivated successfully.
Dec 02 10:04:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:55.732 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4e1e00-745e-4ac8-8ab6-33f99187eb29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:55.735 160221 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e namespace which is not needed anymore
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.734 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 02 10:04:55 np0005541913.localdomain systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Consumed 1.810s CPU time.
Dec 02 10:04:55 np0005541913.localdomain systemd-machined[84262]: Machine qemu-5-instance-00000008 terminated.
Dec 02 10:04:55 np0005541913.localdomain podman[311712]: 2025-12-02 10:04:55.745238355 +0000 UTC m=+0.083296390 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.747 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain podman[311712]: 2025-12-02 10:04:55.755024127 +0000 UTC m=+0.093082212 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:04:55 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.787 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.802 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00164|binding|INFO|Releasing lport c1f0bd46-6bae-4902-9292-e19c6e88557a from this chassis (sb_readonly=0)
Dec 02 10:04:55 np0005541913.localdomain kernel: device tapc1f0bd46-6b left promiscuous mode
Dec 02 10:04:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:55Z|00165|binding|INFO|Setting lport c1f0bd46-6bae-4902-9292-e19c6e88557a down in Southbound
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.817 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:55.821 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e9e3da8770844ad5b5552298a24dcbd2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46d349b8-3ce0-4286-826a-479b1dd2a429, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c1f0bd46-6bae-4902-9292-e19c6e88557a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.874 281858 INFO nova.virt.libvirt.driver [-] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Instance destroyed successfully.
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.874 281858 DEBUG nova.objects.instance [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Lazy-loading 'resources' on Instance uuid 82e23ec3-1d57-4166-9ba0-839ded943a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:04:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:04:55 np0005541913.localdomain neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e[311512]: [NOTICE]   (311516) : haproxy version is 2.8.14-c23fe91
Dec 02 10:04:55 np0005541913.localdomain neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e[311512]: [NOTICE]   (311516) : path to executable is /usr/sbin/haproxy
Dec 02 10:04:55 np0005541913.localdomain neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e[311512]: [WARNING]  (311516) : Exiting Master process...
Dec 02 10:04:55 np0005541913.localdomain neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e[311512]: [ALERT]    (311516) : Current worker (311518) exited with code 143 (Terminated)
Dec 02 10:04:55 np0005541913.localdomain neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e[311512]: [WARNING]  (311516) : All workers exited. Exiting... (0)
Dec 02 10:04:55 np0005541913.localdomain systemd[1]: libpod-e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980.scope: Deactivated successfully.
Dec 02 10:04:55 np0005541913.localdomain podman[311765]: 2025-12-02 10:04:55.906417127 +0000 UTC m=+0.070364263 container died e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:04:55 np0005541913.localdomain podman[311765]: 2025-12-02 10:04:55.926375461 +0000 UTC m=+0.090322597 container cleanup e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.931 281858 DEBUG nova.virt.libvirt.vif [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-02T10:04:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-39688497',display_name='tempest-LiveMigrationTest-server-39688497',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005541913.localdomain',hostname='tempest-livemigrationtest-server-39688497',id=8,image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T10:04:33Z,launched_on='np0005541914.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005541913.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d048f19ff5fc47dc88162ef5f9cebe8b',ramdisk_id='',reservation_id='r-lnn0by93',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='d85e840d-fa56-497b-b5bd-b49584d3e97a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1345186206',owner_user_name='tempest-LiveMigrationTest-1345186206-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T10:04:53Z,user_data=None,user_id='ec20a6cceee246d6b46878df263d30a4',uuid=82e23ec3-1d57-4166-9ba0-839ded943a78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "address": "fa:16:3e:bb:b6:1c", "network": {"id": "13bbad22-ab61-4b1f-849e-c651aa8f3297", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1859087569-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "d048f19ff5fc47dc88162ef5f9cebe8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54433c73-7e", "ovs_interfaceid": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.931 281858 DEBUG nova.network.os_vif_util [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Converting VIF {"id": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "address": "fa:16:3e:bb:b6:1c", "network": {"id": "13bbad22-ab61-4b1f-849e-c651aa8f3297", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1859087569-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "d048f19ff5fc47dc88162ef5f9cebe8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54433c73-7e", "ovs_interfaceid": "54433c73-7e5c-481c-b64c-19e9cfd6e56f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.932 281858 DEBUG nova.network.os_vif_util [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:b6:1c,bridge_name='br-int',has_traffic_filtering=True,id=54433c73-7e5c-481c-b64c-19e9cfd6e56f,network=Network(13bbad22-ab61-4b1f-849e-c651aa8f3297),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap54433c73-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.932 281858 DEBUG os_vif [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:b6:1c,bridge_name='br-int',has_traffic_filtering=True,id=54433c73-7e5c-481c-b64c-19e9cfd6e56f,network=Network(13bbad22-ab61-4b1f-849e-c651aa8f3297),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap54433c73-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.933 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.934 281858 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54433c73-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.936 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.938 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.939 281858 INFO os_vif [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:b6:1c,bridge_name='br-int',has_traffic_filtering=True,id=54433c73-7e5c-481c-b64c-19e9cfd6e56f,network=Network(13bbad22-ab61-4b1f-849e-c651aa8f3297),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap54433c73-7e')
Dec 02 10:04:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:04:55 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1384798435' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:55 np0005541913.localdomain podman[311788]: 2025-12-02 10:04:55.96595286 +0000 UTC m=+0.055225669 container cleanup e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:04:55 np0005541913.localdomain systemd[1]: libpod-conmon-e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980.scope: Deactivated successfully.
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.973 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.977 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:04:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:55.991 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:04:56 np0005541913.localdomain podman[311802]: 2025-12-02 10:04:56.000019281 +0000 UTC m=+0.062624406 container remove e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.003 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6601959a-9d80-49a9-a55f-c80b61b193e8]: (4, ('Tue Dec  2 10:04:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e (e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980)\ne8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980\nTue Dec  2 10:04:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e (e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980)\ne8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.004 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[49c553bb-6f88-4710-a4d8-a86a772afce8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.005 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc40d86e4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:56 np0005541913.localdomain kernel: device tapc40d86e4-70 left promiscuous mode
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.009 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.013 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.013 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.021 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.023 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[b832e38e-071d-44c2-afbc-8c63fdf4d517]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.048 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[7939a2ba-4c44-4634-905c-2ded153dcd20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.050 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[c9270f5b-e9b4-448e-a2d7-09432c8963e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.061 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[09a83e81-79a1-479c-b01f-19401fb16a83]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1205195, 'reachable_time': 25638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311837, 'error': None, 'target': 'ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.062 160371 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c40d86e4-7101-443b-abce-328f7d1ea40e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.063 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[a6df44d3-e753-479f-9683-698c24e339e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.064 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 54433c73-7e5c-481c-b64c-19e9cfd6e56f in datapath 13bbad22-ab61-4b1f-849e-c651aa8f3297 unbound from our chassis
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.068 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 53bdfc6a-79b0-43cf-92a6-99b85b988b28 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.068 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 13bbad22-ab61-4b1f-849e-c651aa8f3297, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.069 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b01095-d35d-4246-981d-c5eafeaca956]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.069 160221 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297 namespace which is not needed anymore
Dec 02 10:04:56 np0005541913.localdomain ceph-mon[298296]: pgmap v164: 177 pgs: 177 active+clean; 225 MiB data, 877 MiB used, 41 GiB / 42 GiB avail; 330 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Dec 02 10:04:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/4098882359' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1384798435' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:56 np0005541913.localdomain neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297[311600]: [NOTICE]   (311611) : haproxy version is 2.8.14-c23fe91
Dec 02 10:04:56 np0005541913.localdomain neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297[311600]: [NOTICE]   (311611) : path to executable is /usr/sbin/haproxy
Dec 02 10:04:56 np0005541913.localdomain neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297[311600]: [WARNING]  (311611) : Exiting Master process...
Dec 02 10:04:56 np0005541913.localdomain neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297[311600]: [ALERT]    (311611) : Current worker (311616) exited with code 143 (Terminated)
Dec 02 10:04:56 np0005541913.localdomain neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297[311600]: [WARNING]  (311611) : All workers exited. Exiting... (0)
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: libpod-aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5.scope: Deactivated successfully.
Dec 02 10:04:56 np0005541913.localdomain podman[311855]: 2025-12-02 10:04:56.285013886 +0000 UTC m=+0.067418235 container died aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:04:56 np0005541913.localdomain podman[311855]: 2025-12-02 10:04:56.326933407 +0000 UTC m=+0.109337776 container cleanup aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:04:56 np0005541913.localdomain podman[311869]: 2025-12-02 10:04:56.344341233 +0000 UTC m=+0.051880250 container cleanup aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: libpod-conmon-aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5.scope: Deactivated successfully.
Dec 02 10:04:56 np0005541913.localdomain podman[311885]: 2025-12-02 10:04:56.41153877 +0000 UTC m=+0.067791894 container remove aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.417 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[9eef4df9-becd-4cf9-8738-59c9c7809aef]: (4, ('Tue Dec  2 10:04:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297 (aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5)\naa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5\nTue Dec  2 10:04:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297 (aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5)\naa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.419 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[74ec3334-3101-431b-b897-9cbd13ab4807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.420 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13bbad22-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.424 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:56 np0005541913.localdomain kernel: device tap13bbad22-a0 left promiscuous mode
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.434 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.439 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[f18d4d21-672e-4efa-afda-f371dae1a342]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.454 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[933960ca-ae1f-40d7-991e-fd951415aef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.455 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a153086b-5bc8-4679-a289-407df64bab54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.472 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[9132d26b-d90d-4d80-83a6-3401a01e348f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1205278, 'reachable_time': 33002, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311903, 'error': None, 'target': 'ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.475 160371 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-13bbad22-ab61-4b1f-849e-c651aa8f3297 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.475 160371 DEBUG oslo.privsep.daemon [-] privsep: reply[18b0b5e1-651f-4f78-a463-e6f31a86ffd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.476 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c1f0bd46-6bae-4902-9292-e19c6e88557a in datapath 26a036bb-7fc2-42d0-b324-4cf6bb77a9d6 unbound from our chassis
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.480 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:04:56 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:04:56.481 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[df3f41b4-72cb-4f4e-a9a2-fbfc6eeb32b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.564 281858 INFO nova.virt.libvirt.driver [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Deleting instance files /var/lib/nova/instances/82e23ec3-1d57-4166-9ba0-839ded943a78_del
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.565 281858 INFO nova.virt.libvirt.driver [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Deletion of /var/lib/nova/instances/82e23ec3-1d57-4166-9ba0-839ded943a78_del complete
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.610 281858 INFO nova.compute.manager [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Took 0.98 seconds to destroy the instance on the hypervisor.
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.611 281858 DEBUG oslo.service.loopingcall [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.612 281858 DEBUG nova.compute.manager [-] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 02 10:04:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:56.612 281858 DEBUG nova.network.neutron [-] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: tmp-crun.0EJDya.mount: Deactivated successfully.
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-717f1ef704ec9c5b6d7c4f85d43274eb21a1cf80205ee2cbc3615610a19b18d1-merged.mount: Deactivated successfully.
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa94b1110fea6af2db23700b994e23c475ca73bb6dcc0213f1d1418b60de19a5-userdata-shm.mount: Deactivated successfully.
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: run-netns-ovnmeta\x2d13bbad22\x2dab61\x2d4b1f\x2d849e\x2dc651aa8f3297.mount: Deactivated successfully.
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5f06f0939af8ced9e01822fd15f35fbfde05ec9e41ca9e0ac345284976c2f364-merged.mount: Deactivated successfully.
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8e9f51d75b53fa73cc5ca712ffa936d423cca0867a5796abeddc016534ff980-userdata-shm.mount: Deactivated successfully.
Dec 02 10:04:56 np0005541913.localdomain systemd[1]: run-netns-ovnmeta\x2dc40d86e4\x2d7101\x2d443b\x2dabce\x2d328f7d1ea40e.mount: Deactivated successfully.
Dec 02 10:04:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:57.308 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:04:57 np0005541913.localdomain systemd[1]: Stopping User Manager for UID 42436...
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Activating special unit Exit the Session...
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Stopped target Main User Target.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Stopped target Basic System.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Stopped target Paths.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Stopped target Sockets.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Stopped target Timers.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Closed D-Bus User Message Bus Socket.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Stopped Create User's Volatile Files and Directories.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Removed slice User Application Slice.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Reached target Shutdown.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Finished Exit the Session.
Dec 02 10:04:57 np0005541913.localdomain systemd[311212]: Reached target Exit the Session.
Dec 02 10:04:57 np0005541913.localdomain systemd[1]: user@42436.service: Deactivated successfully.
Dec 02 10:04:57 np0005541913.localdomain systemd[1]: Stopped User Manager for UID 42436.
Dec 02 10:04:57 np0005541913.localdomain systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec 02 10:04:57 np0005541913.localdomain systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec 02 10:04:57 np0005541913.localdomain systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec 02 10:04:57 np0005541913.localdomain systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec 02 10:04:57 np0005541913.localdomain systemd[1]: Removed slice User Slice of UID 42436.
Dec 02 10:04:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:57.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:04:58 np0005541913.localdomain ceph-mon[298296]: pgmap v165: 177 pgs: 177 active+clean; 225 MiB data, 877 MiB used, 41 GiB / 42 GiB avail; 330 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Dec 02 10:04:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:59.190 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:04:01Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9909244250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a7a2e0>], id=54433c73-7e5c-481c-b64c-19e9cfd6e56f, ip_allocation=immediate, mac_address=fa:16:3e:bb:b6:1c, name=tempest-parent-146896978, network_id=13bbad22-ab61-4b1f-849e-c651aa8f3297, port_security_enabled=True, project_id=d048f19ff5fc47dc88162ef5f9cebe8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=13, security_groups=['576d6513-029b-4880-bb0b-58094b586b90'], standard_attr_id=537, status=DOWN, tags=[], tenant_id=d048f19ff5fc47dc88162ef5f9cebe8b, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990899be80>], trunk_id=3bda7a6b-42c4-4395-9870-485919ec4ac2, updated_at=2025-12-02T10:04:58Z on network 13bbad22-ab61-4b1f-849e-c651aa8f3297
Dec 02 10:04:59 np0005541913.localdomain podman[311922]: 2025-12-02 10:04:59.399150517 +0000 UTC m=+0.058632148 container kill 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:04:59 np0005541913.localdomain dnsmasq[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/addn_hosts - 2 addresses
Dec 02 10:04:59 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/host
Dec 02 10:04:59 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/opts
Dec 02 10:04:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:04:59Z|00166|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:04:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:59.480 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:04:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:04:59.607 263406 INFO neutron.agent.dhcp.agent [None req-33be06b8-0323-4e71-9863-e6bafdac769b - - - - - -] DHCP configuration for ports {'54433c73-7e5c-481c-b64c-19e9cfd6e56f'} is completed
Dec 02 10:04:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:59.798 281858 DEBUG nova.network.neutron [-] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:04:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:59.825 281858 INFO nova.compute.manager [-] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Took 3.21 seconds to deallocate network for instance.
Dec 02 10:04:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:59.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:04:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:59.898 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:04:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:59.898 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:04:59 np0005541913.localdomain ceph-mon[298296]: pgmap v166: 177 pgs: 177 active+clean; 145 MiB data, 738 MiB used, 41 GiB / 42 GiB avail; 349 KiB/s rd, 2.1 MiB/s wr, 96 op/s
Dec 02 10:04:59 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1410163043' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:04:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:04:59.967 281858 DEBUG oslo_concurrency.processutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.030 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.288 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:00 np0005541913.localdomain dnsmasq[311070]: exiting on receipt of SIGTERM
Dec 02 10:05:00 np0005541913.localdomain podman[311979]: 2025-12-02 10:05:00.30416091 +0000 UTC m=+0.067305982 container kill 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:05:00 np0005541913.localdomain systemd[1]: libpod-2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88.scope: Deactivated successfully.
Dec 02 10:05:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:05:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:05:00 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:05:00.404 2 INFO neutron.agent.securitygroups_rpc [req-80b5ad2b-fb4c-4362-be26-82a96d5f7828 req-4740c003-3af7-4933-8b00-851aa84e7e55 955214da09cd44dba70e1a06eabc9023 50df25ee29424615807a458690cdf8d7 - - default default] Security group member updated ['2e537c1e-d2f3-49fb-8c4c-0f6b2c3e354b']
Dec 02 10:05:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:05:00 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/557978648' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:00 np0005541913.localdomain podman[311999]: 2025-12-02 10:05:00.421422066 +0000 UTC m=+0.080368321 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:05:00 np0005541913.localdomain podman[311999]: 2025-12-02 10:05:00.426840831 +0000 UTC m=+0.085787066 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.435 281858 DEBUG oslo_concurrency.processutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:05:00 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:05:00 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:00.444 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:05:00Z, description=, device_id=abf8d33c-4e24-4d26-af41-b01c828c67e0, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a61fa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a617f0>], id=a0a73e76-685f-4ba0-87b5-5dd27b54fab4, ip_allocation=immediate, mac_address=fa:16:3e:16:9d:c1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:04:35Z, description=, dns_domain=, id=45d02cf1-f511-4416-b7c1-b37c417f16f9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1627103925-network, port_security_enabled=True, project_id=50df25ee29424615807a458690cdf8d7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33331, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=681, status=ACTIVE, subnets=['34aa8025-e49d-4c09-aefd-41c4d8900224'], tags=[], tenant_id=50df25ee29424615807a458690cdf8d7, updated_at=2025-12-02T10:04:36Z, vlan_transparent=None, network_id=45d02cf1-f511-4416-b7c1-b37c417f16f9, port_security_enabled=True, project_id=50df25ee29424615807a458690cdf8d7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['2e537c1e-d2f3-49fb-8c4c-0f6b2c3e354b'], standard_attr_id=757, status=DOWN, tags=[], tenant_id=50df25ee29424615807a458690cdf8d7, updated_at=2025-12-02T10:05:00Z on network 45d02cf1-f511-4416-b7c1-b37c417f16f9
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.444 281858 DEBUG nova.compute.provider_tree [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:05:00 np0005541913.localdomain podman[311998]: 2025-12-02 10:05:00.451785028 +0000 UTC m=+0.118352077 container died 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:05:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88-userdata-shm.mount: Deactivated successfully.
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.478 281858 DEBUG nova.scheduler.client.report [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:05:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-909ccbf636b56e5fcb70f402308fe6a02f149a317eaed6dc848cd26938534901-merged.mount: Deactivated successfully.
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.502 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:05:00 np0005541913.localdomain podman[312000]: 2025-12-02 10:05:00.516708595 +0000 UTC m=+0.176043450 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.544 281858 INFO nova.scheduler.client.report [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Deleted allocations for instance 82e23ec3-1d57-4166-9ba0-839ded943a78
Dec 02 10:05:00 np0005541913.localdomain podman[312000]: 2025-12-02 10:05:00.561192405 +0000 UTC m=+0.220527190 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 02 10:05:00 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:05:00 np0005541913.localdomain podman[311998]: 2025-12-02 10:05:00.620383479 +0000 UTC m=+0.286950428 container remove 2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-26a036bb-7fc2-42d0-b324-4cf6bb77a9d6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:05:00 np0005541913.localdomain systemd[1]: libpod-conmon-2dcb997821f4d2734f28b29240adaafbdb32b4d61b868150280126a880924e88.scope: Deactivated successfully.
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.631 281858 DEBUG oslo_concurrency.lockutils [None req-bb096adb-4a93-4648-916e-6da0c1af22b8 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Lock "82e23ec3-1d57-4166-9ba0-839ded943a78" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:05:00 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:00.656 263406 INFO neutron.agent.dhcp.agent [None req-c5a175e5-f595-4217-adbe-0df0f2fc8df8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:00 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:00.658 263406 INFO neutron.agent.dhcp.agent [None req-c5a175e5-f595-4217-adbe-0df0f2fc8df8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:00 np0005541913.localdomain podman[312081]: 2025-12-02 10:05:00.768709897 +0000 UTC m=+0.062998676 container kill 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:05:00 np0005541913.localdomain dnsmasq[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/addn_hosts - 2 addresses
Dec 02 10:05:00 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/host
Dec 02 10:05:00 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/opts
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:05:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:00.936 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/557978648' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:01.056 263406 INFO neutron.agent.dhcp.agent [None req-4fe268aa-7bb5-4bb8-af41-38d54c225599 - - - - - -] DHCP configuration for ports {'a0a73e76-685f-4ba0-87b5-5dd27b54fab4'} is completed
Dec 02 10:05:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:01.170 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005541914.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:05:00Z, description=, device_id=abf8d33c-4e24-4d26-af41-b01c828c67e0, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a615b0>], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99092b8af0>], id=a0a73e76-685f-4ba0-87b5-5dd27b54fab4, ip_allocation=immediate, mac_address=fa:16:3e:16:9d:c1, name=, network_id=45d02cf1-f511-4416-b7c1-b37c417f16f9, port_security_enabled=True, project_id=50df25ee29424615807a458690cdf8d7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['2e537c1e-d2f3-49fb-8c4c-0f6b2c3e354b'], standard_attr_id=757, status=DOWN, tags=[], tenant_id=50df25ee29424615807a458690cdf8d7, updated_at=2025-12-02T10:05:00Z on network 45d02cf1-f511-4416-b7c1-b37c417f16f9
Dec 02 10:05:01 np0005541913.localdomain dnsmasq[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/addn_hosts - 2 addresses
Dec 02 10:05:01 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/host
Dec 02 10:05:01 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/opts
Dec 02 10:05:01 np0005541913.localdomain podman[312119]: 2025-12-02 10:05:01.409857169 +0000 UTC m=+0.077431962 container kill 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:05:01 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d26a036bb\x2d7fc2\x2d42d0\x2db324\x2d4cf6bb77a9d6.mount: Deactivated successfully.
Dec 02 10:05:01 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:05:01.531 2 INFO neutron.agent.securitygroups_rpc [None req-9eec1e00-2947-423e-88e8-b2e4c78afea0 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Security group member updated ['576d6513-029b-4880-bb0b-58094b586b90']
Dec 02 10:05:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:01.588 263406 INFO neutron.agent.dhcp.agent [None req-c4a472ea-6649-45d6-bbcf-cb7850575d3a - - - - - -] DHCP configuration for ports {'a0a73e76-685f-4ba0-87b5-5dd27b54fab4'} is completed
Dec 02 10:05:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:01.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:05:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e109 e109: 6 total, 6 up, 6 in
Dec 02 10:05:01 np0005541913.localdomain ceph-mon[298296]: pgmap v167: 177 pgs: 177 active+clean; 145 MiB data, 738 MiB used, 41 GiB / 42 GiB avail; 51 KiB/s rd, 24 KiB/s wr, 36 op/s
Dec 02 10:05:03 np0005541913.localdomain ceph-mon[298296]: osdmap e109: 6 total, 6 up, 6 in
Dec 02 10:05:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/4275910269' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:03.049 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:05:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:03.049 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:05:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:03.050 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:05:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:05:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:05:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:05:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:05:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:05:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:05:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:05:04 np0005541913.localdomain ceph-mon[298296]: pgmap v169: 177 pgs: 177 active+clean; 192 MiB data, 809 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 2.1 MiB/s wr, 81 op/s
Dec 02 10:05:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2077633090' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/920620619' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/402438692' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:05:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:04Z|00167|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:05:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:04.984 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3823900154' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:05:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/4000286707' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2916680470' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:05:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2916680470' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:05:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:05.328 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:05 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:05.532 263406 INFO neutron.agent.linux.ip_lib [None req-29d225a6-dc6b-4286-ae5e-3b8927ee1c5b - - - - - -] Device tapbf5295be-03 cannot be used as it has no MAC address
Dec 02 10:05:05 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:05:05.551 2 INFO neutron.agent.securitygroups_rpc [None req-7954669c-1491-4ccc-a463-0efe07ba8bc3 ec20a6cceee246d6b46878df263d30a4 d048f19ff5fc47dc88162ef5f9cebe8b - - default default] Security group member updated ['576d6513-029b-4880-bb0b-58094b586b90']
Dec 02 10:05:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:05.554 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:05 np0005541913.localdomain kernel: device tapbf5295be-03 entered promiscuous mode
Dec 02 10:05:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:05Z|00168|binding|INFO|Claiming lport bf5295be-0321-4f82-8125-4c1394da80db for this chassis.
Dec 02 10:05:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:05Z|00169|binding|INFO|bf5295be-0321-4f82-8125-4c1394da80db: Claiming unknown
Dec 02 10:05:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:05.561 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:05 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669905.5625] manager: (tapbf5295be-03): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Dec 02 10:05:05 np0005541913.localdomain systemd-udevd[312151]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:05:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:05.571 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-a0d374a1-1751-4f10-b3b2-966d56e45d4e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0d374a1-1751-4f10-b3b2-966d56e45d4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29134a5a6b554e34bc1729ff0e939209', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f78d00a-923c-4dce-8ef1-798cd9b95762, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=bf5295be-0321-4f82-8125-4c1394da80db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:05.572 160221 INFO neutron.agent.ovn.metadata.agent [-] Port bf5295be-0321-4f82-8125-4c1394da80db in datapath a0d374a1-1751-4f10-b3b2-966d56e45d4e bound to our chassis
Dec 02 10:05:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:05.574 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a0d374a1-1751-4f10-b3b2-966d56e45d4e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:05:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:05.575 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5af1dbe3-70e8-4bd6-b5c4-d6990bc1c41e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:05Z|00170|binding|INFO|Setting lport bf5295be-0321-4f82-8125-4c1394da80db ovn-installed in OVS
Dec 02 10:05:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:05Z|00171|binding|INFO|Setting lport bf5295be-0321-4f82-8125-4c1394da80db up in Southbound
Dec 02 10:05:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:05.584 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:05.601 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:05.631 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:05.654 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:05 np0005541913.localdomain dnsmasq[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/addn_hosts - 1 addresses
Dec 02 10:05:05 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/host
Dec 02 10:05:05 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/opts
Dec 02 10:05:05 np0005541913.localdomain podman[312180]: 2025-12-02 10:05:05.750141064 +0000 UTC m=+0.055272900 container kill 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 02 10:05:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:05.937 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:05:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:05:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:05:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159754 "" "Go-http-client/1.1"
Dec 02 10:05:06 np0005541913.localdomain ceph-mon[298296]: pgmap v170: 177 pgs: 177 active+clean; 192 MiB data, 809 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 2.1 MiB/s wr, 81 op/s
Dec 02 10:05:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:05:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20200 "" "Go-http-client/1.1"
Dec 02 10:05:06 np0005541913.localdomain podman[312243]: 
Dec 02 10:05:06 np0005541913.localdomain podman[312243]: 2025-12-02 10:05:06.520460272 +0000 UTC m=+0.119550520 container create e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:05:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336.scope.
Dec 02 10:05:06 np0005541913.localdomain podman[312243]: 2025-12-02 10:05:06.457969439 +0000 UTC m=+0.057059767 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:05:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:05:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7a871c6a7c5e20892b6d31b8714daf06c35d9aa0dc6eaa8b0680c07851460a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:05:06 np0005541913.localdomain podman[312243]: 2025-12-02 10:05:06.578966336 +0000 UTC m=+0.178056594 container init e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:05:06 np0005541913.localdomain podman[312243]: 2025-12-02 10:05:06.586306743 +0000 UTC m=+0.185396991 container start e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:05:06 np0005541913.localdomain dnsmasq[312260]: started, version 2.85 cachesize 150
Dec 02 10:05:06 np0005541913.localdomain dnsmasq[312260]: DNS service limited to local subnets
Dec 02 10:05:06 np0005541913.localdomain dnsmasq[312260]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:05:06 np0005541913.localdomain dnsmasq[312260]: warning: no upstream servers configured
Dec 02 10:05:06 np0005541913.localdomain dnsmasq-dhcp[312260]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:05:06 np0005541913.localdomain dnsmasq[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/addn_hosts - 0 addresses
Dec 02 10:05:06 np0005541913.localdomain dnsmasq-dhcp[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/host
Dec 02 10:05:06 np0005541913.localdomain dnsmasq-dhcp[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/opts
Dec 02 10:05:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e110 e110: 6 total, 6 up, 6 in
Dec 02 10:05:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:07.672 263406 INFO neutron.agent.dhcp.agent [None req-dfef2434-02c5-45c2-b720-09a76b78cc03 - - - - - -] DHCP configuration for ports {'5043eaee-7a67-478c-bc16-9e2356ef58c3'} is completed
Dec 02 10:05:07 np0005541913.localdomain dnsmasq[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/addn_hosts - 0 addresses
Dec 02 10:05:07 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/host
Dec 02 10:05:07 np0005541913.localdomain dnsmasq-dhcp[308473]: read /var/lib/neutron/dhcp/13bbad22-ab61-4b1f-849e-c651aa8f3297/opts
Dec 02 10:05:07 np0005541913.localdomain podman[312276]: 2025-12-02 10:05:07.856712061 +0000 UTC m=+0.072179272 container kill 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:05:08 np0005541913.localdomain ceph-mon[298296]: pgmap v171: 177 pgs: 177 active+clean; 192 MiB data, 809 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 2.1 MiB/s wr, 81 op/s
Dec 02 10:05:08 np0005541913.localdomain ceph-mon[298296]: osdmap e110: 6 total, 6 up, 6 in
Dec 02 10:05:09 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:09Z|00172|binding|INFO|Releasing lport c4946b01-0395-4a62-9a39-4286d5803bca from this chassis (sb_readonly=0)
Dec 02 10:05:09 np0005541913.localdomain kernel: device tapc4946b01-03 left promiscuous mode
Dec 02 10:05:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:09.589 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:09 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:09Z|00173|binding|INFO|Setting lport c4946b01-0395-4a62-9a39-4286d5803bca down in Southbound
Dec 02 10:05:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:09.607 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:09.954 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13bbad22-ab61-4b1f-849e-c651aa8f3297', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd048f19ff5fc47dc88162ef5f9cebe8b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51e42abf-8647-4013-9c62-778191c64ad0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c4946b01-0395-4a62-9a39-4286d5803bca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:09.956 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c4946b01-0395-4a62-9a39-4286d5803bca in datapath 13bbad22-ab61-4b1f-849e-c651aa8f3297 unbound from our chassis
Dec 02 10:05:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:09.958 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 13bbad22-ab61-4b1f-849e-c651aa8f3297, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:05:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:09.959 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ec0b4b-16ff-4dfa-b664-35df202bb32b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:10 np0005541913.localdomain ceph-mon[298296]: pgmap v173: 177 pgs: 177 active+clean; 192 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 2.7 MiB/s wr, 145 op/s
Dec 02 10:05:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:10.332 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:10.866 281858 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764669895.8651826, 82e23ec3-1d57-4166-9ba0-839ded943a78 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 02 10:05:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:10.867 281858 INFO nova.compute.manager [-] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] VM Stopped (Lifecycle Event)
Dec 02 10:05:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 e111: 6 total, 6 up, 6 in
Dec 02 10:05:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:10.940 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:10.956 281858 DEBUG nova.compute.manager [None req-91b579d9-7b1c-461e-afee-58fa95f1d2e6 - - - - - -] [instance: 82e23ec3-1d57-4166-9ba0-839ded943a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 02 10:05:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:11 np0005541913.localdomain ceph-mon[298296]: osdmap e111: 6 total, 6 up, 6 in
Dec 02 10:05:11 np0005541913.localdomain ceph-mon[298296]: pgmap v175: 177 pgs: 177 active+clean; 192 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 23 KiB/s wr, 86 op/s
Dec 02 10:05:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:05:13 np0005541913.localdomain podman[312298]: 2025-12-02 10:05:13.434317118 +0000 UTC m=+0.077102714 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:05:13 np0005541913.localdomain podman[312298]: 2025-12-02 10:05:13.477117313 +0000 UTC m=+0.119902879 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:05:13 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:05:14 np0005541913.localdomain ceph-mon[298296]: pgmap v176: 177 pgs: 177 active+clean; 192 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 23 KiB/s wr, 141 op/s
Dec 02 10:05:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:15.381 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:15.942 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:15 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:15Z|00174|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:05:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:15.997 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:16.016 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:16 np0005541913.localdomain ceph-mon[298296]: pgmap v177: 177 pgs: 177 active+clean; 192 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 23 KiB/s wr, 141 op/s
Dec 02 10:05:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:05:16 np0005541913.localdomain podman[312317]: 2025-12-02 10:05:16.455945315 +0000 UTC m=+0.088719464 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:05:16 np0005541913.localdomain podman[312317]: 2025-12-02 10:05:16.464985197 +0000 UTC m=+0.097759286 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 10:05:16 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:05:16 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:16.626 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:05:16Z, description=, device_id=1ad64abe-8977-48b7-83a3-2b942dce5ba9, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089e70d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089e7eb0>], id=3e752f51-4af7-48aa-9e8c-d2eedb1ead78, ip_allocation=immediate, mac_address=fa:16:3e:0d:96:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:05:01Z, description=, dns_domain=, id=a0d374a1-1751-4f10-b3b2-966d56e45d4e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-911438573-network, port_security_enabled=True, project_id=29134a5a6b554e34bc1729ff0e939209, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5631, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=763, status=ACTIVE, subnets=['59ecd7aa-b77a-48c2-b3b9-401ba5c9b11b'], tags=[], tenant_id=29134a5a6b554e34bc1729ff0e939209, updated_at=2025-12-02T10:05:03Z, vlan_transparent=None, network_id=a0d374a1-1751-4f10-b3b2-966d56e45d4e, port_security_enabled=False, project_id=29134a5a6b554e34bc1729ff0e939209, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=808, status=DOWN, tags=[], tenant_id=29134a5a6b554e34bc1729ff0e939209, updated_at=2025-12-02T10:05:16Z on network a0d374a1-1751-4f10-b3b2-966d56e45d4e
Dec 02 10:05:16 np0005541913.localdomain dnsmasq[308473]: exiting on receipt of SIGTERM
Dec 02 10:05:16 np0005541913.localdomain podman[312352]: 2025-12-02 10:05:16.630918077 +0000 UTC m=+0.045672254 container kill 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:05:16 np0005541913.localdomain systemd[1]: libpod-77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7.scope: Deactivated successfully.
Dec 02 10:05:16 np0005541913.localdomain podman[312367]: 2025-12-02 10:05:16.681561931 +0000 UTC m=+0.035426729 container died 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:05:16 np0005541913.localdomain podman[312367]: 2025-12-02 10:05:16.723509204 +0000 UTC m=+0.077374002 container remove 77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-13bbad22-ab61-4b1f-849e-c651aa8f3297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:05:16 np0005541913.localdomain systemd[1]: libpod-conmon-77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7.scope: Deactivated successfully.
Dec 02 10:05:16 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:16.758 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:16 np0005541913.localdomain dnsmasq[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/addn_hosts - 1 addresses
Dec 02 10:05:16 np0005541913.localdomain dnsmasq-dhcp[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/host
Dec 02 10:05:16 np0005541913.localdomain dnsmasq-dhcp[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/opts
Dec 02 10:05:16 np0005541913.localdomain podman[312409]: 2025-12-02 10:05:16.848825156 +0000 UTC m=+0.055047723 container kill e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:05:17 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:17.068 263406 INFO neutron.agent.dhcp.agent [None req-5728adaf-47c6-4148-b907-d1c645ccb334 - - - - - -] DHCP configuration for ports {'3e752f51-4af7-48aa-9e8c-d2eedb1ead78'} is completed
Dec 02 10:05:17 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:17.110 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-896dba9b1a38f0638159f863e9536c69068bcbb89b8facb5a357e5a5dc8cf960-merged.mount: Deactivated successfully.
Dec 02 10:05:17 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77037373fbf82d7f180f8f44af5375c4189d52dcc7de8304c6ea7370610e44f7-userdata-shm.mount: Deactivated successfully.
Dec 02 10:05:17 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d13bbad22\x2dab61\x2d4b1f\x2d849e\x2dc651aa8f3297.mount: Deactivated successfully.
Dec 02 10:05:17 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:17.717 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:05:16Z, description=, device_id=1ad64abe-8977-48b7-83a3-2b942dce5ba9, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908aef460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a02a00>], id=3e752f51-4af7-48aa-9e8c-d2eedb1ead78, ip_allocation=immediate, mac_address=fa:16:3e:0d:96:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:05:01Z, description=, dns_domain=, id=a0d374a1-1751-4f10-b3b2-966d56e45d4e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-911438573-network, port_security_enabled=True, project_id=29134a5a6b554e34bc1729ff0e939209, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5631, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=763, status=ACTIVE, subnets=['59ecd7aa-b77a-48c2-b3b9-401ba5c9b11b'], tags=[], tenant_id=29134a5a6b554e34bc1729ff0e939209, updated_at=2025-12-02T10:05:03Z, vlan_transparent=None, network_id=a0d374a1-1751-4f10-b3b2-966d56e45d4e, port_security_enabled=False, project_id=29134a5a6b554e34bc1729ff0e939209, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=808, status=DOWN, tags=[], tenant_id=29134a5a6b554e34bc1729ff0e939209, updated_at=2025-12-02T10:05:16Z on network a0d374a1-1751-4f10-b3b2-966d56e45d4e
Dec 02 10:05:17 np0005541913.localdomain dnsmasq[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/addn_hosts - 1 addresses
Dec 02 10:05:17 np0005541913.localdomain dnsmasq-dhcp[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/host
Dec 02 10:05:17 np0005541913.localdomain podman[312446]: 2025-12-02 10:05:17.944713364 +0000 UTC m=+0.059475902 container kill e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:05:17 np0005541913.localdomain dnsmasq-dhcp[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/opts
Dec 02 10:05:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:18.262 263406 INFO neutron.agent.dhcp.agent [None req-f6ea4edc-e8e9-4ae3-b79c-4484998561d8 - - - - - -] DHCP configuration for ports {'3e752f51-4af7-48aa-9e8c-d2eedb1ead78'} is completed
Dec 02 10:05:18 np0005541913.localdomain ceph-mon[298296]: pgmap v178: 177 pgs: 177 active+clean; 192 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 19 KiB/s wr, 116 op/s
Dec 02 10:05:18 np0005541913.localdomain podman[312484]: 2025-12-02 10:05:18.657705839 +0000 UTC m=+0.044628515 container kill 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:05:18 np0005541913.localdomain dnsmasq[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/addn_hosts - 0 addresses
Dec 02 10:05:18 np0005541913.localdomain dnsmasq-dhcp[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/host
Dec 02 10:05:18 np0005541913.localdomain dnsmasq-dhcp[309469]: read /var/lib/neutron/dhcp/97ae066a-ecdb-4d1f-a021-787e342a02a4/opts
Dec 02 10:05:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:05:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:05:18 np0005541913.localdomain podman[312499]: 2025-12-02 10:05:18.748200829 +0000 UTC m=+0.059903853 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:05:18 np0005541913.localdomain podman[312499]: 2025-12-02 10:05:18.756887233 +0000 UTC m=+0.068590287 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:05:18 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:05:18 np0005541913.localdomain podman[312498]: 2025-12-02 10:05:18.807316552 +0000 UTC m=+0.123751573 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:05:18 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:18Z|00175|binding|INFO|Releasing lport ae9b1151-5912-406f-ae7b-9db37b471685 from this chassis (sb_readonly=0)
Dec 02 10:05:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:18.815 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:18 np0005541913.localdomain kernel: device tapae9b1151-59 left promiscuous mode
Dec 02 10:05:18 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:18Z|00176|binding|INFO|Setting lport ae9b1151-5912-406f-ae7b-9db37b471685 down in Southbound
Dec 02 10:05:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:18.825 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-97ae066a-ecdb-4d1f-a021-787e342a02a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-97ae066a-ecdb-4d1f-a021-787e342a02a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc1edab5ae5d43f08b967b5bf594f8b5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5764aa57-a87d-4e3f-89b1-49a48ee4f883, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=ae9b1151-5912-406f-ae7b-9db37b471685) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:18.826 160221 INFO neutron.agent.ovn.metadata.agent [-] Port ae9b1151-5912-406f-ae7b-9db37b471685 in datapath 97ae066a-ecdb-4d1f-a021-787e342a02a4 unbound from our chassis
Dec 02 10:05:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:18.827 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 97ae066a-ecdb-4d1f-a021-787e342a02a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:05:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:18.828 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[fc11fd53-a573-48a4-a2c8-ff9af4920ac8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:18.837 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:18 np0005541913.localdomain podman[312498]: 2025-12-02 10:05:18.847949548 +0000 UTC m=+0.164384569 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Dec 02 10:05:18 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:05:19 np0005541913.localdomain sudo[312548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:05:19 np0005541913.localdomain sudo[312548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:05:19 np0005541913.localdomain sudo[312548]: pam_unix(sudo:session): session closed for user root
Dec 02 10:05:19 np0005541913.localdomain sudo[312566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:05:19 np0005541913.localdomain sudo[312566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:05:19 np0005541913.localdomain systemd[1]: tmp-crun.V4gevc.mount: Deactivated successfully.
Dec 02 10:05:20 np0005541913.localdomain sudo[312566]: pam_unix(sudo:session): session closed for user root
Dec 02 10:05:20 np0005541913.localdomain ceph-mon[298296]: pgmap v179: 177 pgs: 177 active+clean; 192 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 1.4 MiB/s rd, 45 op/s
Dec 02 10:05:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:20.423 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:20 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:20Z|00177|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:05:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:20.591 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:20 np0005541913.localdomain sudo[312616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:05:20 np0005541913.localdomain sudo[312616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:05:20 np0005541913.localdomain sudo[312616]: pam_unix(sudo:session): session closed for user root
Dec 02 10:05:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:20.946 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:21 np0005541913.localdomain dnsmasq[309469]: exiting on receipt of SIGTERM
Dec 02 10:05:21 np0005541913.localdomain podman[312649]: 2025-12-02 10:05:21.343260676 +0000 UTC m=+0.043876956 container kill 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 02 10:05:21 np0005541913.localdomain systemd[1]: libpod-2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d.scope: Deactivated successfully.
Dec 02 10:05:21 np0005541913.localdomain podman[312662]: 2025-12-02 10:05:21.38567734 +0000 UTC m=+0.032429579 container died 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 02 10:05:21 np0005541913.localdomain systemd[1]: tmp-crun.bFFykn.mount: Deactivated successfully.
Dec 02 10:05:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:05:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:05:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:05:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:05:21 np0005541913.localdomain podman[312662]: 2025-12-02 10:05:21.482757558 +0000 UTC m=+0.129509817 container cleanup 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:05:21 np0005541913.localdomain systemd[1]: libpod-conmon-2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d.scope: Deactivated successfully.
Dec 02 10:05:21 np0005541913.localdomain podman[312668]: 2025-12-02 10:05:21.507129789 +0000 UTC m=+0.140615183 container remove 2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-97ae066a-ecdb-4d1f-a021-787e342a02a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:05:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:21.703 263406 INFO neutron.agent.dhcp.agent [None req-8256c916-4369-449f-98f7-d3a76b9976e7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:21.704 263406 INFO neutron.agent.dhcp.agent [None req-8256c916-4369-449f-98f7-d3a76b9976e7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:21.812 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f69386878a877368468586813b3dbb1937ee49b0390efbec5dd7e4f609902381-merged.mount: Deactivated successfully.
Dec 02 10:05:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c8dc5a4dbc8911ab6f2c075727c856467cd80206751f8c06727935126920b4d-userdata-shm.mount: Deactivated successfully.
Dec 02 10:05:22 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d97ae066a\x2decdb\x2d4d1f\x2da021\x2d787e342a02a4.mount: Deactivated successfully.
Dec 02 10:05:22 np0005541913.localdomain ceph-mon[298296]: pgmap v180: 177 pgs: 177 active+clean; 192 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 44 op/s
Dec 02 10:05:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:05:24 np0005541913.localdomain ceph-mon[298296]: pgmap v181: 177 pgs: 177 active+clean; 225 MiB data, 889 MiB used, 41 GiB / 42 GiB avail; 1.4 MiB/s rd, 2.1 MiB/s wr, 101 op/s
Dec 02 10:05:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:25.450 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:25.948 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:05:26 np0005541913.localdomain systemd[1]: tmp-crun.lY4SDO.mount: Deactivated successfully.
Dec 02 10:05:26 np0005541913.localdomain podman[312691]: 2025-12-02 10:05:26.454593477 +0000 UTC m=+0.091993661 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:05:26 np0005541913.localdomain podman[312691]: 2025-12-02 10:05:26.489990954 +0000 UTC m=+0.127391178 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 10:05:26 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:05:26 np0005541913.localdomain ceph-mon[298296]: pgmap v182: 177 pgs: 177 active+clean; 225 MiB data, 889 MiB used, 41 GiB / 42 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 02 10:05:26 np0005541913.localdomain dnsmasq[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/addn_hosts - 0 addresses
Dec 02 10:05:26 np0005541913.localdomain dnsmasq-dhcp[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/host
Dec 02 10:05:26 np0005541913.localdomain podman[312727]: 2025-12-02 10:05:26.64570782 +0000 UTC m=+0.081228764 container kill e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:05:26 np0005541913.localdomain dnsmasq-dhcp[312260]: read /var/lib/neutron/dhcp/a0d374a1-1751-4f10-b3b2-966d56e45d4e/opts
Dec 02 10:05:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:26Z|00178|binding|INFO|Releasing lport bf5295be-0321-4f82-8125-4c1394da80db from this chassis (sb_readonly=0)
Dec 02 10:05:26 np0005541913.localdomain kernel: device tapbf5295be-03 left promiscuous mode
Dec 02 10:05:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:26.817 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:26Z|00179|binding|INFO|Setting lport bf5295be-0321-4f82-8125-4c1394da80db down in Southbound
Dec 02 10:05:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:26.826 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-a0d374a1-1751-4f10-b3b2-966d56e45d4e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0d374a1-1751-4f10-b3b2-966d56e45d4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29134a5a6b554e34bc1729ff0e939209', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f78d00a-923c-4dce-8ef1-798cd9b95762, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=bf5295be-0321-4f82-8125-4c1394da80db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:26.828 160221 INFO neutron.agent.ovn.metadata.agent [-] Port bf5295be-0321-4f82-8125-4c1394da80db in datapath a0d374a1-1751-4f10-b3b2-966d56e45d4e unbound from our chassis
Dec 02 10:05:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:26.831 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a0d374a1-1751-4f10-b3b2-966d56e45d4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:05:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:26.832 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[53805bbe-de4e-469c-bb25-27c74aff5dae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:26.845 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:28 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:28Z|00180|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:05:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:28.146 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:28 np0005541913.localdomain ceph-mon[298296]: pgmap v183: 177 pgs: 177 active+clean; 225 MiB data, 889 MiB used, 41 GiB / 42 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 02 10:05:28 np0005541913.localdomain systemd[1]: tmp-crun.tbw36N.mount: Deactivated successfully.
Dec 02 10:05:28 np0005541913.localdomain dnsmasq[312260]: exiting on receipt of SIGTERM
Dec 02 10:05:28 np0005541913.localdomain podman[312766]: 2025-12-02 10:05:28.633870469 +0000 UTC m=+0.071597086 container kill e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:05:28 np0005541913.localdomain systemd[1]: libpod-e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336.scope: Deactivated successfully.
Dec 02 10:05:28 np0005541913.localdomain podman[312778]: 2025-12-02 10:05:28.69334577 +0000 UTC m=+0.048926570 container died e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:05:28 np0005541913.localdomain systemd[1]: tmp-crun.qHM4Gf.mount: Deactivated successfully.
Dec 02 10:05:28 np0005541913.localdomain podman[312778]: 2025-12-02 10:05:28.785291529 +0000 UTC m=+0.140872329 container cleanup e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:05:28 np0005541913.localdomain systemd[1]: libpod-conmon-e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336.scope: Deactivated successfully.
Dec 02 10:05:28 np0005541913.localdomain podman[312785]: 2025-12-02 10:05:28.811650025 +0000 UTC m=+0.153414935 container remove e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a0d374a1-1751-4f10-b3b2-966d56e45d4e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:05:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:28.877 263406 INFO neutron.agent.dhcp.agent [None req-6b265581-9cc9-418d-afd1-98a43a0bdba5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:28.877 263406 INFO neutron.agent.dhcp.agent [None req-6b265581-9cc9-418d-afd1-98a43a0bdba5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7e7a871c6a7c5e20892b6d31b8714daf06c35d9aa0dc6eaa8b0680c07851460a-merged.mount: Deactivated successfully.
Dec 02 10:05:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e475fa1f3bd5e1e1c4c802a0246d97debe1749e5a828608645200f034bb18336-userdata-shm.mount: Deactivated successfully.
Dec 02 10:05:29 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2da0d374a1\x2d1751\x2d4f10\x2db3b2\x2d966d56e45d4e.mount: Deactivated successfully.
Dec 02 10:05:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:30.483 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:30 np0005541913.localdomain ceph-mon[298296]: pgmap v184: 177 pgs: 177 active+clean; 225 MiB data, 890 MiB used, 41 GiB / 42 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 02 10:05:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:30.950 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:05:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:05:31 np0005541913.localdomain podman[312808]: 2025-12-02 10:05:31.431976935 +0000 UTC m=+0.064193158 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Dec 02 10:05:31 np0005541913.localdomain systemd[1]: tmp-crun.eYTp05.mount: Deactivated successfully.
Dec 02 10:05:31 np0005541913.localdomain podman[312807]: 2025-12-02 10:05:31.49945043 +0000 UTC m=+0.131347595 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:05:31 np0005541913.localdomain podman[312807]: 2025-12-02 10:05:31.50689358 +0000 UTC m=+0.138790795 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:05:31 np0005541913.localdomain podman[312808]: 2025-12-02 10:05:31.517832743 +0000 UTC m=+0.150048886 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 10:05:31 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:05:31 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:05:31 np0005541913.localdomain ceph-mon[298296]: pgmap v185: 177 pgs: 177 active+clean; 225 MiB data, 890 MiB used, 41 GiB / 42 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 02 10:05:32 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:05:32.559 2 INFO neutron.agent.securitygroups_rpc [req-b389ec80-c5bd-4cf8-bcab-ec830d82cd86 req-ec3b1f9e-8373-4159-93fc-d4de0998f605 955214da09cd44dba70e1a06eabc9023 50df25ee29424615807a458690cdf8d7 - - default default] Security group member updated ['2e537c1e-d2f3-49fb-8c4c-0f6b2c3e354b']
Dec 02 10:05:32 np0005541913.localdomain systemd[1]: tmp-crun.ANUCOB.mount: Deactivated successfully.
Dec 02 10:05:32 np0005541913.localdomain dnsmasq[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/addn_hosts - 1 addresses
Dec 02 10:05:32 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/host
Dec 02 10:05:32 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/opts
Dec 02 10:05:32 np0005541913.localdomain podman[312872]: 2025-12-02 10:05:32.834762434 +0000 UTC m=+0.072334876 container kill 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:05:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:33.675 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:33.675 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:33.678 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:05:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:05:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:05:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:05:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:05:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:05:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:05:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:05:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:05:34 np0005541913.localdomain ceph-mon[298296]: pgmap v186: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail; 345 KiB/s rd, 2.1 MiB/s wr, 91 op/s
Dec 02 10:05:34 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2691401196' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:35 np0005541913.localdomain dnsmasq[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/addn_hosts - 0 addresses
Dec 02 10:05:35 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/host
Dec 02 10:05:35 np0005541913.localdomain dnsmasq-dhcp[311147]: read /var/lib/neutron/dhcp/45d02cf1-f511-4416-b7c1-b37c417f16f9/opts
Dec 02 10:05:35 np0005541913.localdomain podman[312909]: 2025-12-02 10:05:35.444210944 +0000 UTC m=+0.059157874 container kill 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 10:05:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:35.539 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:35Z|00181|binding|INFO|Releasing lport bd990115-9909-4e4e-a861-f26c2f53a28c from this chassis (sb_readonly=0)
Dec 02 10:05:35 np0005541913.localdomain kernel: device tapbd990115-99 left promiscuous mode
Dec 02 10:05:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:35.627 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:35Z|00182|binding|INFO|Setting lport bd990115-9909-4e4e-a861-f26c2f53a28c down in Southbound
Dec 02 10:05:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:35.636 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-45d02cf1-f511-4416-b7c1-b37c417f16f9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45d02cf1-f511-4416-b7c1-b37c417f16f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50df25ee29424615807a458690cdf8d7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b257864-5151-448f-941d-2c9a748f5881, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=bd990115-9909-4e4e-a861-f26c2f53a28c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:35.638 160221 INFO neutron.agent.ovn.metadata.agent [-] Port bd990115-9909-4e4e-a861-f26c2f53a28c in datapath 45d02cf1-f511-4416-b7c1-b37c417f16f9 unbound from our chassis
Dec 02 10:05:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:35.640 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45d02cf1-f511-4416-b7c1-b37c417f16f9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:05:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:35.641 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9e23b0-a6ec-4a2f-8eae-7f723f2602a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:35.649 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:35.952 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:05:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:05:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:05:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156107 "" "Go-http-client/1.1"
Dec 02 10:05:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:05:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19231 "" "Go-http-client/1.1"
Dec 02 10:05:36 np0005541913.localdomain ceph-mon[298296]: pgmap v187: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Dec 02 10:05:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:37Z|00183|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:05:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:37.648 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:37.681 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:05:38 np0005541913.localdomain ceph-mon[298296]: pgmap v188: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Dec 02 10:05:38 np0005541913.localdomain dnsmasq[311147]: exiting on receipt of SIGTERM
Dec 02 10:05:38 np0005541913.localdomain podman[312948]: 2025-12-02 10:05:38.394872875 +0000 UTC m=+0.068783922 container kill 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:05:38 np0005541913.localdomain systemd[1]: libpod-5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d.scope: Deactivated successfully.
Dec 02 10:05:38 np0005541913.localdomain podman[312970]: 2025-12-02 10:05:38.459666248 +0000 UTC m=+0.041517022 container died 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:05:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d-userdata-shm.mount: Deactivated successfully.
Dec 02 10:05:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-94147dbae9838956e714723c867733a25e47b3b6162526a89da5f485c251bb56-merged.mount: Deactivated successfully.
Dec 02 10:05:38 np0005541913.localdomain podman[312970]: 2025-12-02 10:05:38.506128011 +0000 UTC m=+0.087978795 container remove 5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d02cf1-f511-4416-b7c1-b37c417f16f9, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:05:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:38.545 263406 INFO neutron.agent.dhcp.agent [None req-87b7f267-6b80-4d43-a122-dd4b29ac1e77 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:38 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d45d02cf1\x2df511\x2d4416\x2db7c1\x2db37c417f16f9.mount: Deactivated successfully.
Dec 02 10:05:38 np0005541913.localdomain systemd[1]: libpod-conmon-5965ec520471163aaf8447e0aa55f7487dc0d208bec7096828d6c940d9f6539d.scope: Deactivated successfully.
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.141525) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669939141654, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2503, "num_deletes": 256, "total_data_size": 3268334, "memory_usage": 3316448, "flush_reason": "Manual Compaction"}
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669939160532, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2111274, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20131, "largest_seqno": 22629, "table_properties": {"data_size": 2102355, "index_size": 5489, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19897, "raw_average_key_size": 21, "raw_value_size": 2083908, "raw_average_value_size": 2205, "num_data_blocks": 241, "num_entries": 945, "num_filter_entries": 945, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669769, "oldest_key_time": 1764669769, "file_creation_time": 1764669939, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 19113 microseconds, and 7582 cpu microseconds.
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.160593) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2111274 bytes OK
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.160671) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.162836) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.162856) EVENT_LOG_v1 {"time_micros": 1764669939162850, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.162879) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3257109, prev total WAL file size 3257109, number of live WAL files 2.
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.163935) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2061KB)], [33(16MB)]
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669939163981, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 19278738, "oldest_snapshot_seqno": -1}
Dec 02 10:05:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:39.224 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12430 keys, 16638511 bytes, temperature: kUnknown
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669939253848, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 16638511, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16567485, "index_size": 38861, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31109, "raw_key_size": 333149, "raw_average_key_size": 26, "raw_value_size": 16355449, "raw_average_value_size": 1315, "num_data_blocks": 1480, "num_entries": 12430, "num_filter_entries": 12430, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764669939, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.254171) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 16638511 bytes
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.255661) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.2 rd, 184.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 16.4 +0.0 blob) out(15.9 +0.0 blob), read-write-amplify(17.0) write-amplify(7.9) OK, records in: 12959, records dropped: 529 output_compression: NoCompression
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.255682) EVENT_LOG_v1 {"time_micros": 1764669939255673, "job": 18, "event": "compaction_finished", "compaction_time_micros": 89989, "compaction_time_cpu_micros": 45970, "output_level": 6, "num_output_files": 1, "total_output_size": 16638511, "num_input_records": 12959, "num_output_records": 12430, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669939256006, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764669939257748, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.163834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.257855) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.257864) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.257867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.257869) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:05:39 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:05:39.257872) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:05:40 np0005541913.localdomain ceph-mon[298296]: pgmap v189: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Dec 02 10:05:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:40.575 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:40.955 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:41 np0005541913.localdomain ceph-mon[298296]: pgmap v190: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 02 10:05:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:42.820 263406 INFO neutron.agent.linux.ip_lib [None req-6cfb1242-45dd-447f-a6fc-1acdb666be60 - - - - - -] Device tapf7f7d342-f4 cannot be used as it has no MAC address
Dec 02 10:05:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:42.887 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:42 np0005541913.localdomain kernel: device tapf7f7d342-f4 entered promiscuous mode
Dec 02 10:05:42 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669942.8984] manager: (tapf7f7d342-f4): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Dec 02 10:05:42 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:42Z|00184|binding|INFO|Claiming lport f7f7d342-f447-418a-b17f-543c0a6fb6f4 for this chassis.
Dec 02 10:05:42 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:42Z|00185|binding|INFO|f7f7d342-f447-418a-b17f-543c0a6fb6f4: Claiming unknown
Dec 02 10:05:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:42.898 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:42 np0005541913.localdomain systemd-udevd[313000]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:05:42 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7f7d342-f4: No such device
Dec 02 10:05:42 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:42Z|00186|binding|INFO|Setting lport f7f7d342-f447-418a-b17f-543c0a6fb6f4 ovn-installed in OVS
Dec 02 10:05:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:42.937 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:42 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7f7d342-f4: No such device
Dec 02 10:05:42 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7f7d342-f4: No such device
Dec 02 10:05:42 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7f7d342-f4: No such device
Dec 02 10:05:42 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7f7d342-f4: No such device
Dec 02 10:05:42 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7f7d342-f4: No such device
Dec 02 10:05:42 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7f7d342-f4: No such device
Dec 02 10:05:42 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7f7d342-f4: No such device
Dec 02 10:05:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:42.981 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:43.003 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-dc1b6fff-63f9-4fbd-b22d-9d87141c4454', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc1b6fff-63f9-4fbd-b22d-9d87141c4454', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '91b4824d03bd43c4aca137037a18bd3d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1403e5f6-3958-4f2a-b5a7-a41f1931563b, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=f7f7d342-f447-418a-b17f-543c0a6fb6f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:43.006 160221 INFO neutron.agent.ovn.metadata.agent [-] Port f7f7d342-f447-418a-b17f-543c0a6fb6f4 in datapath dc1b6fff-63f9-4fbd-b22d-9d87141c4454 bound to our chassis
Dec 02 10:05:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:43.007 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network dc1b6fff-63f9-4fbd-b22d-9d87141c4454 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:05:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:43.009 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb991b3-e045-47ab-84ee-c4c5a9a62ee3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:43Z|00187|binding|INFO|Setting lport f7f7d342-f447-418a-b17f-543c0a6fb6f4 up in Southbound
Dec 02 10:05:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:43.020 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:43 np0005541913.localdomain podman[313071]: 2025-12-02 10:05:43.874349555 +0000 UTC m=+0.093321368 container create 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:05:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:05:43 np0005541913.localdomain systemd[1]: Started libpod-conmon-3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14.scope.
Dec 02 10:05:43 np0005541913.localdomain podman[313071]: 2025-12-02 10:05:43.831743115 +0000 UTC m=+0.050714938 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:05:43 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:05:43 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b9a5477488579bfa771c21dc4987fd440d0aad6300b05f1bb3cfe33ee68a44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:05:43 np0005541913.localdomain podman[313071]: 2025-12-02 10:05:43.97206751 +0000 UTC m=+0.191039313 container init 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 10:05:43 np0005541913.localdomain podman[313071]: 2025-12-02 10:05:43.981130772 +0000 UTC m=+0.200102575 container start 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:05:43 np0005541913.localdomain dnsmasq[313101]: started, version 2.85 cachesize 150
Dec 02 10:05:43 np0005541913.localdomain dnsmasq[313101]: DNS service limited to local subnets
Dec 02 10:05:43 np0005541913.localdomain dnsmasq[313101]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:05:43 np0005541913.localdomain dnsmasq[313101]: warning: no upstream servers configured
Dec 02 10:05:43 np0005541913.localdomain dnsmasq-dhcp[313101]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:05:43 np0005541913.localdomain dnsmasq[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/addn_hosts - 0 addresses
Dec 02 10:05:43 np0005541913.localdomain dnsmasq-dhcp[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/host
Dec 02 10:05:43 np0005541913.localdomain dnsmasq-dhcp[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/opts
Dec 02 10:05:44 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:44.050 263406 INFO neutron.agent.dhcp.agent [None req-6cfb1242-45dd-447f-a6fc-1acdb666be60 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:05:42Z, description=, device_id=0221007c-3d7c-420b-901d-7f4f12bcb06b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990896b880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990896b820>], id=8544d541-fb07-4eb3-912b-88ac82079cb6, ip_allocation=immediate, mac_address=fa:16:3e:e4:27:da, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:05:38Z, description=, dns_domain=, id=dc1b6fff-63f9-4fbd-b22d-9d87141c4454, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1809554588, port_security_enabled=True, project_id=91b4824d03bd43c4aca137037a18bd3d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=1956, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=930, status=ACTIVE, subnets=['c6378a19-844e-42c3-ad47-9aacfa4b56e5'], tags=[], tenant_id=91b4824d03bd43c4aca137037a18bd3d, updated_at=2025-12-02T10:05:41Z, vlan_transparent=None, network_id=dc1b6fff-63f9-4fbd-b22d-9d87141c4454, port_security_enabled=False, project_id=91b4824d03bd43c4aca137037a18bd3d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=955, status=DOWN, tags=[], tenant_id=91b4824d03bd43c4aca137037a18bd3d, updated_at=2025-12-02T10:05:42Z on network dc1b6fff-63f9-4fbd-b22d-9d87141c4454
Dec 02 10:05:44 np0005541913.localdomain podman[313086]: 2025-12-02 10:05:44.06631613 +0000 UTC m=+0.139691958 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:05:44 np0005541913.localdomain ceph-mon[298296]: pgmap v191: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 02 10:05:44 np0005541913.localdomain podman[313086]: 2025-12-02 10:05:44.082109553 +0000 UTC m=+0.155485361 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:05:44 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:05:44 np0005541913.localdomain dnsmasq[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/addn_hosts - 1 addresses
Dec 02 10:05:44 np0005541913.localdomain dnsmasq-dhcp[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/host
Dec 02 10:05:44 np0005541913.localdomain podman[313128]: 2025-12-02 10:05:44.247805876 +0000 UTC m=+0.058466035 container kill 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:05:44 np0005541913.localdomain dnsmasq-dhcp[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/opts
Dec 02 10:05:44 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:44.371 263406 INFO neutron.agent.dhcp.agent [None req-c9803045-1c9a-4e59-9be1-e43b7f00297c - - - - - -] DHCP configuration for ports {'70e5bec2-545f-4237-982e-f54d4963b727'} is completed
Dec 02 10:05:44 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:44.542 263406 INFO neutron.agent.dhcp.agent [None req-b3773605-e597-4540-97c5-f9889f69d6be - - - - - -] DHCP configuration for ports {'8544d541-fb07-4eb3-912b-88ac82079cb6'} is completed
Dec 02 10:05:44 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:44.555 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:05:42Z, description=, device_id=0221007c-3d7c-420b-901d-7f4f12bcb06b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990899f8b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990899f2b0>], id=8544d541-fb07-4eb3-912b-88ac82079cb6, ip_allocation=immediate, mac_address=fa:16:3e:e4:27:da, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:05:38Z, description=, dns_domain=, id=dc1b6fff-63f9-4fbd-b22d-9d87141c4454, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1809554588, port_security_enabled=True, project_id=91b4824d03bd43c4aca137037a18bd3d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=1956, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=930, status=ACTIVE, subnets=['c6378a19-844e-42c3-ad47-9aacfa4b56e5'], tags=[], tenant_id=91b4824d03bd43c4aca137037a18bd3d, updated_at=2025-12-02T10:05:41Z, vlan_transparent=None, network_id=dc1b6fff-63f9-4fbd-b22d-9d87141c4454, port_security_enabled=False, project_id=91b4824d03bd43c4aca137037a18bd3d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=955, status=DOWN, tags=[], tenant_id=91b4824d03bd43c4aca137037a18bd3d, updated_at=2025-12-02T10:05:42Z on network dc1b6fff-63f9-4fbd-b22d-9d87141c4454
Dec 02 10:05:44 np0005541913.localdomain dnsmasq[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/addn_hosts - 1 addresses
Dec 02 10:05:44 np0005541913.localdomain dnsmasq-dhcp[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/host
Dec 02 10:05:44 np0005541913.localdomain dnsmasq-dhcp[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/opts
Dec 02 10:05:44 np0005541913.localdomain podman[313166]: 2025-12-02 10:05:44.730780727 +0000 UTC m=+0.053045930 container kill 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:05:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:45.064 263406 INFO neutron.agent.dhcp.agent [None req-298c58ea-adb1-4908-af51-f6caa1d2725b - - - - - -] DHCP configuration for ports {'8544d541-fb07-4eb3-912b-88ac82079cb6'} is completed
Dec 02 10:05:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:45.611 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:45.957 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:46 np0005541913.localdomain dnsmasq[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/addn_hosts - 0 addresses
Dec 02 10:05:46 np0005541913.localdomain dnsmasq-dhcp[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/host
Dec 02 10:05:46 np0005541913.localdomain podman[313201]: 2025-12-02 10:05:46.079501329 +0000 UTC m=+0.062023210 container kill 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:05:46 np0005541913.localdomain dnsmasq-dhcp[313101]: read /var/lib/neutron/dhcp/dc1b6fff-63f9-4fbd-b22d-9d87141c4454/opts
Dec 02 10:05:46 np0005541913.localdomain ceph-mon[298296]: pgmap v192: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:05:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:46 np0005541913.localdomain kernel: device tapf7f7d342-f4 left promiscuous mode
Dec 02 10:05:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:46.246 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:46 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:46Z|00188|binding|INFO|Releasing lport f7f7d342-f447-418a-b17f-543c0a6fb6f4 from this chassis (sb_readonly=0)
Dec 02 10:05:46 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:46Z|00189|binding|INFO|Setting lport f7f7d342-f447-418a-b17f-543c0a6fb6f4 down in Southbound
Dec 02 10:05:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:46.256 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-dc1b6fff-63f9-4fbd-b22d-9d87141c4454', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc1b6fff-63f9-4fbd-b22d-9d87141c4454', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '91b4824d03bd43c4aca137037a18bd3d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1403e5f6-3958-4f2a-b5a7-a41f1931563b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=f7f7d342-f447-418a-b17f-543c0a6fb6f4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:46.258 160221 INFO neutron.agent.ovn.metadata.agent [-] Port f7f7d342-f447-418a-b17f-543c0a6fb6f4 in datapath dc1b6fff-63f9-4fbd-b22d-9d87141c4454 unbound from our chassis
Dec 02 10:05:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:46.260 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network dc1b6fff-63f9-4fbd-b22d-9d87141c4454 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:05:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:46.261 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[fa570ebd-b71f-45ed-8770-63d45076fbb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:46.265 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:05:47 np0005541913.localdomain podman[313223]: 2025-12-02 10:05:47.439563553 +0000 UTC m=+0.079024425 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:05:47 np0005541913.localdomain podman[313223]: 2025-12-02 10:05:47.474147319 +0000 UTC m=+0.113608181 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:05:47 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:05:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:47.829 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:05:48 np0005541913.localdomain ceph-mon[298296]: pgmap v193: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:05:48 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e112 e112: 6 total, 6 up, 6 in
Dec 02 10:05:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 10:05:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 10:05:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 10:05:49 np0005541913.localdomain ceph-mon[298296]: osdmap e112: 6 total, 6 up, 6 in
Dec 02 10:05:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:05:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:05:49 np0005541913.localdomain podman[313242]: 2025-12-02 10:05:49.452911696 +0000 UTC m=+0.089398722 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_id=edpm, version=9.6, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 10:05:49 np0005541913.localdomain podman[313243]: 2025-12-02 10:05:49.54982648 +0000 UTC m=+0.182887885 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:05:49 np0005541913.localdomain podman[313242]: 2025-12-02 10:05:49.567199334 +0000 UTC m=+0.203686330 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, vcs-type=git, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 10:05:49 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:49.574 263406 INFO neutron.agent.linux.ip_lib [None req-b74ccd4a-60b2-4178-9daf-e28ff3ee92d9 - - - - - -] Device tap5624f1cd-ac cannot be used as it has no MAC address
Dec 02 10:05:49 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:05:49 np0005541913.localdomain podman[313243]: 2025-12-02 10:05:49.584674121 +0000 UTC m=+0.217735476 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:05:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:49.601 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:49 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:05:49 np0005541913.localdomain kernel: device tap5624f1cd-ac entered promiscuous mode
Dec 02 10:05:49 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669949.6115] manager: (tap5624f1cd-ac): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Dec 02 10:05:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:49.610 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:49Z|00190|binding|INFO|Claiming lport 5624f1cd-ac01-4dc0-b6cb-827f7161ed5c for this chassis.
Dec 02 10:05:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:49Z|00191|binding|INFO|5624f1cd-ac01-4dc0-b6cb-827f7161ed5c: Claiming unknown
Dec 02 10:05:49 np0005541913.localdomain systemd-udevd[313293]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:05:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:49.623 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-38b12dd1-ff52-416e-8f1c-79f301a7bf32', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38b12dd1-ff52-416e-8f1c-79f301a7bf32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37e4f8f0e4cd48f5b7b2d1cb4c67377c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29b3a7f0-a12b-42c9-87fc-78534c7005fc, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=5624f1cd-ac01-4dc0-b6cb-827f7161ed5c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:49.625 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 5624f1cd-ac01-4dc0-b6cb-827f7161ed5c in datapath 38b12dd1-ff52-416e-8f1c-79f301a7bf32 bound to our chassis
Dec 02 10:05:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:49.628 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port a007a046-bcdb-410b-a209-82b7bb4ffc2a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:05:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:49.628 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 38b12dd1-ff52-416e-8f1c-79f301a7bf32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:05:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:49.629 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[16dbd353-eab1-4f7c-b2e3-98496cbcdc3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:49 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5624f1cd-ac: No such device
Dec 02 10:05:49 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5624f1cd-ac: No such device
Dec 02 10:05:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:49Z|00192|binding|INFO|Setting lport 5624f1cd-ac01-4dc0-b6cb-827f7161ed5c ovn-installed in OVS
Dec 02 10:05:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:49Z|00193|binding|INFO|Setting lport 5624f1cd-ac01-4dc0-b6cb-827f7161ed5c up in Southbound
Dec 02 10:05:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:49.643 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:49 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5624f1cd-ac: No such device
Dec 02 10:05:49 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5624f1cd-ac: No such device
Dec 02 10:05:49 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5624f1cd-ac: No such device
Dec 02 10:05:49 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5624f1cd-ac: No such device
Dec 02 10:05:49 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5624f1cd-ac: No such device
Dec 02 10:05:49 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5624f1cd-ac: No such device
Dec 02 10:05:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:49.676 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:49.701 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:50 np0005541913.localdomain ceph-mon[298296]: pgmap v195: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail; 7.1 KiB/s rd, 1.2 KiB/s wr, 10 op/s
Dec 02 10:05:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e113 e113: 6 total, 6 up, 6 in
Dec 02 10:05:50 np0005541913.localdomain podman[313364]: 
Dec 02 10:05:50 np0005541913.localdomain podman[313364]: 2025-12-02 10:05:50.576920107 +0000 UTC m=+0.095684391 container create e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:05:50 np0005541913.localdomain podman[313364]: 2025-12-02 10:05:50.533064294 +0000 UTC m=+0.051828618 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:05:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:50.654 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:50 np0005541913.localdomain systemd[1]: Started libpod-conmon-e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3.scope.
Dec 02 10:05:50 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:05:50 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/596a1204c6242efa2669558e42ecdbcf1d8d6dced90449cb010cd2b06bf89c11/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:05:50 np0005541913.localdomain podman[313364]: 2025-12-02 10:05:50.687832854 +0000 UTC m=+0.206597148 container init e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:05:50 np0005541913.localdomain podman[313364]: 2025-12-02 10:05:50.697716068 +0000 UTC m=+0.216480352 container start e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:05:50 np0005541913.localdomain dnsmasq[313382]: started, version 2.85 cachesize 150
Dec 02 10:05:50 np0005541913.localdomain dnsmasq[313382]: DNS service limited to local subnets
Dec 02 10:05:50 np0005541913.localdomain dnsmasq[313382]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:05:50 np0005541913.localdomain dnsmasq[313382]: warning: no upstream servers configured
Dec 02 10:05:50 np0005541913.localdomain dnsmasq-dhcp[313382]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:05:50 np0005541913.localdomain dnsmasq[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/addn_hosts - 0 addresses
Dec 02 10:05:50 np0005541913.localdomain dnsmasq-dhcp[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/host
Dec 02 10:05:50 np0005541913.localdomain dnsmasq-dhcp[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/opts
Dec 02 10:05:50 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:50.844 263406 INFO neutron.agent.dhcp.agent [None req-74c10537-b64f-4f0b-aecc-28aedf4a22d8 - - - - - -] DHCP configuration for ports {'3f430f3b-91ce-45f4-adeb-05cf984d7735'} is completed
Dec 02 10:05:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:50.959 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:51 np0005541913.localdomain ceph-mon[298296]: osdmap e113: 6 total, 6 up, 6 in
Dec 02 10:05:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:51.493 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:52 np0005541913.localdomain ceph-mon[298296]: pgmap v197: 177 pgs: 177 active+clean; 145 MiB data, 767 MiB used, 41 GiB / 42 GiB avail; 8.9 KiB/s rd, 1.5 KiB/s wr, 12 op/s
Dec 02 10:05:52 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:52.589 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:05:52Z, description=, device_id=798ad2c1-39c2-42cf-b43f-5f28ae054b5b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089b82e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089b8490>], id=a6fc5ad3-55a2-486d-84d2-256a704a8fbe, ip_allocation=immediate, mac_address=fa:16:3e:cb:eb:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:05:45Z, description=, dns_domain=, id=38b12dd1-ff52-416e-8f1c-79f301a7bf32, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-683263966-network, port_security_enabled=True, project_id=37e4f8f0e4cd48f5b7b2d1cb4c67377c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43495, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=976, status=ACTIVE, subnets=['aa72654c-fad9-468d-966a-038289774954'], tags=[], tenant_id=37e4f8f0e4cd48f5b7b2d1cb4c67377c, updated_at=2025-12-02T10:05:47Z, vlan_transparent=None, network_id=38b12dd1-ff52-416e-8f1c-79f301a7bf32, port_security_enabled=False, project_id=37e4f8f0e4cd48f5b7b2d1cb4c67377c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1020, status=DOWN, tags=[], tenant_id=37e4f8f0e4cd48f5b7b2d1cb4c67377c, updated_at=2025-12-02T10:05:52Z on network 38b12dd1-ff52-416e-8f1c-79f301a7bf32
Dec 02 10:05:52 np0005541913.localdomain dnsmasq[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/addn_hosts - 1 addresses
Dec 02 10:05:52 np0005541913.localdomain podman[313400]: 2025-12-02 10:05:52.807283976 +0000 UTC m=+0.062562055 container kill e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:05:52 np0005541913.localdomain dnsmasq-dhcp[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/host
Dec 02 10:05:52 np0005541913.localdomain dnsmasq-dhcp[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/opts
Dec 02 10:05:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:53.076 263406 INFO neutron.agent.dhcp.agent [None req-2df80f2f-6d14-491e-99e3-d652b998a29c - - - - - -] DHCP configuration for ports {'a6fc5ad3-55a2-486d-84d2-256a704a8fbe'} is completed
Dec 02 10:05:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:53.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:05:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:53.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:05:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:54.169 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:05:52Z, description=, device_id=798ad2c1-39c2-42cf-b43f-5f28ae054b5b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9909239880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9909239430>], id=a6fc5ad3-55a2-486d-84d2-256a704a8fbe, ip_allocation=immediate, mac_address=fa:16:3e:cb:eb:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:05:45Z, description=, dns_domain=, id=38b12dd1-ff52-416e-8f1c-79f301a7bf32, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-683263966-network, port_security_enabled=True, project_id=37e4f8f0e4cd48f5b7b2d1cb4c67377c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43495, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=976, status=ACTIVE, subnets=['aa72654c-fad9-468d-966a-038289774954'], tags=[], tenant_id=37e4f8f0e4cd48f5b7b2d1cb4c67377c, updated_at=2025-12-02T10:05:47Z, vlan_transparent=None, network_id=38b12dd1-ff52-416e-8f1c-79f301a7bf32, port_security_enabled=False, project_id=37e4f8f0e4cd48f5b7b2d1cb4c67377c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1020, status=DOWN, tags=[], tenant_id=37e4f8f0e4cd48f5b7b2d1cb4c67377c, updated_at=2025-12-02T10:05:52Z on network 38b12dd1-ff52-416e-8f1c-79f301a7bf32
Dec 02 10:05:54 np0005541913.localdomain ceph-mon[298296]: pgmap v198: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Dec 02 10:05:54 np0005541913.localdomain podman[313438]: 2025-12-02 10:05:54.381972053 +0000 UTC m=+0.061066474 container kill e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:05:54 np0005541913.localdomain dnsmasq[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/addn_hosts - 1 addresses
Dec 02 10:05:54 np0005541913.localdomain dnsmasq-dhcp[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/host
Dec 02 10:05:54 np0005541913.localdomain dnsmasq-dhcp[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/opts
Dec 02 10:05:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:54.628 263406 INFO neutron.agent.dhcp.agent [None req-51e3912d-4c17-4599-9679-205d78046919 - - - - - -] DHCP configuration for ports {'a6fc5ad3-55a2-486d-84d2-256a704a8fbe'} is completed
Dec 02 10:05:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:54.830 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:05:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:54.830 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:05:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:54.830 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:05:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:54.911 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:05:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:54.911 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:05:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:54.911 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:05:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:54.912 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:05:55 np0005541913.localdomain dnsmasq[313101]: exiting on receipt of SIGTERM
Dec 02 10:05:55 np0005541913.localdomain podman[313473]: 2025-12-02 10:05:55.479728831 +0000 UTC m=+0.058291340 container kill 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:05:55 np0005541913.localdomain systemd[1]: libpod-3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14.scope: Deactivated successfully.
Dec 02 10:05:55 np0005541913.localdomain podman[313486]: 2025-12-02 10:05:55.545422049 +0000 UTC m=+0.054915311 container died 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:05:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14-userdata-shm.mount: Deactivated successfully.
Dec 02 10:05:55 np0005541913.localdomain podman[313486]: 2025-12-02 10:05:55.632076987 +0000 UTC m=+0.141570179 container cleanup 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:05:55 np0005541913.localdomain systemd[1]: libpod-conmon-3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14.scope: Deactivated successfully.
Dec 02 10:05:55 np0005541913.localdomain podman[313493]: 2025-12-02 10:05:55.658571146 +0000 UTC m=+0.155812229 container remove 3eb2b5d7417c639d5cc5cb2a310f614b4eb583a4bb54796756c3ba78eaebdd14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dc1b6fff-63f9-4fbd-b22d-9d87141c4454, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.698 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.702 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.712 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.713 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.854 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.855 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.855 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.856 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.856 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:05:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:55.925 263406 INFO neutron.agent.dhcp.agent [None req-9d874735-3153-4d0e-b143-6a7609630514 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 e114: 6 total, 6 up, 6 in
Dec 02 10:05:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:55.961 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:05:56 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:05:56.168 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:05:56 np0005541913.localdomain ceph-mon[298296]: pgmap v199: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Dec 02 10:05:56 np0005541913.localdomain ceph-mon[298296]: osdmap e114: 6 total, 6 up, 6 in
Dec 02 10:05:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:05:56 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2634255040' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.326 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:05:56 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:56Z|00194|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.419 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.432 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.432 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:05:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-68b9a5477488579bfa771c21dc4987fd440d0aad6300b05f1bb3cfe33ee68a44-merged.mount: Deactivated successfully.
Dec 02 10:05:56 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2ddc1b6fff\x2d63f9\x2d4fbd\x2db22d\x2d9d87141c4454.mount: Deactivated successfully.
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.655 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.657 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11287MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.658 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.658 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.762 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.763 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.764 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:05:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:56.821 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:05:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:05:57 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2150290219' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:57.263 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:05:57 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2634255040' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:57.271 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:05:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:57.293 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:05:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:57.318 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:05:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:57.319 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:05:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:05:57 np0005541913.localdomain podman[313560]: 2025-12-02 10:05:57.428024224 +0000 UTC m=+0.069384008 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible)
Dec 02 10:05:57 np0005541913.localdomain podman[313560]: 2025-12-02 10:05:57.441217117 +0000 UTC m=+0.082576891 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=multipathd)
Dec 02 10:05:57 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:05:57 np0005541913.localdomain dnsmasq[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/addn_hosts - 0 addresses
Dec 02 10:05:57 np0005541913.localdomain dnsmasq-dhcp[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/host
Dec 02 10:05:57 np0005541913.localdomain podman[313595]: 2025-12-02 10:05:57.706808382 +0000 UTC m=+0.046447663 container kill e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:05:57 np0005541913.localdomain dnsmasq-dhcp[313382]: read /var/lib/neutron/dhcp/38b12dd1-ff52-416e-8f1c-79f301a7bf32/opts
Dec 02 10:05:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:57Z|00195|binding|INFO|Releasing lport 5624f1cd-ac01-4dc0-b6cb-827f7161ed5c from this chassis (sb_readonly=0)
Dec 02 10:05:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:57.872 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:05:57Z|00196|binding|INFO|Setting lport 5624f1cd-ac01-4dc0-b6cb-827f7161ed5c down in Southbound
Dec 02 10:05:57 np0005541913.localdomain kernel: device tap5624f1cd-ac left promiscuous mode
Dec 02 10:05:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:57.881 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-38b12dd1-ff52-416e-8f1c-79f301a7bf32', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38b12dd1-ff52-416e-8f1c-79f301a7bf32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37e4f8f0e4cd48f5b7b2d1cb4c67377c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29b3a7f0-a12b-42c9-87fc-78534c7005fc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=5624f1cd-ac01-4dc0-b6cb-827f7161ed5c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:05:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:57.884 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 5624f1cd-ac01-4dc0-b6cb-827f7161ed5c in datapath 38b12dd1-ff52-416e-8f1c-79f301a7bf32 unbound from our chassis
Dec 02 10:05:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:57.887 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 38b12dd1-ff52-416e-8f1c-79f301a7bf32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:05:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:05:57.888 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[2cef9cff-93b2-4ca4-9f04-4215348208ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:05:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:57.898 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:05:58 np0005541913.localdomain ceph-mon[298296]: pgmap v201: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 2.4 KiB/s wr, 36 op/s
Dec 02 10:05:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2150290219' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:05:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:58.315 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:05:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:05:59.840 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:00 np0005541913.localdomain ceph-mon[298296]: pgmap v202: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 2.2 KiB/s wr, 33 op/s
Dec 02 10:06:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:00.719 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:00.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:00Z|00197|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:06:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:00.963 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:00.970 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:01 np0005541913.localdomain dnsmasq[313382]: exiting on receipt of SIGTERM
Dec 02 10:06:01 np0005541913.localdomain podman[313634]: 2025-12-02 10:06:01.388735593 +0000 UTC m=+0.063133090 container kill e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:06:01 np0005541913.localdomain systemd[1]: libpod-e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3.scope: Deactivated successfully.
Dec 02 10:06:01 np0005541913.localdomain podman[313647]: 2025-12-02 10:06:01.459204538 +0000 UTC m=+0.056688237 container died e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:06:01 np0005541913.localdomain systemd[1]: tmp-crun.iYeL04.mount: Deactivated successfully.
Dec 02 10:06:01 np0005541913.localdomain podman[313647]: 2025-12-02 10:06:01.497003879 +0000 UTC m=+0.094487518 container cleanup e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 02 10:06:01 np0005541913.localdomain systemd[1]: libpod-conmon-e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3.scope: Deactivated successfully.
Dec 02 10:06:01 np0005541913.localdomain podman[313650]: 2025-12-02 10:06:01.545755614 +0000 UTC m=+0.131753646 container remove e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38b12dd1-ff52-416e-8f1c-79f301a7bf32, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:06:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:01.569 263406 INFO neutron.agent.dhcp.agent [None req-b4145658-4bbe-44ca-ba78-ea4d0c003600 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:06:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:01.641 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:06:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:01.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:01.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:01 np0005541913.localdomain ceph-mon[298296]: pgmap v203: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.9 KiB/s wr, 29 op/s
Dec 02 10:06:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:06:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:06:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-596a1204c6242efa2669558e42ecdbcf1d8d6dced90449cb010cd2b06bf89c11-merged.mount: Deactivated successfully.
Dec 02 10:06:02 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e43830afe28deb1bd8b8fcd14551789f91413e767d6670fc3fe3c8529f743cf3-userdata-shm.mount: Deactivated successfully.
Dec 02 10:06:02 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d38b12dd1\x2dff52\x2d416e\x2d8f1c\x2d79f301a7bf32.mount: Deactivated successfully.
Dec 02 10:06:02 np0005541913.localdomain systemd[1]: tmp-crun.Kubx4N.mount: Deactivated successfully.
Dec 02 10:06:02 np0005541913.localdomain podman[313677]: 2025-12-02 10:06:02.449926163 +0000 UTC m=+0.091113378 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:06:02 np0005541913.localdomain podman[313678]: 2025-12-02 10:06:02.501941045 +0000 UTC m=+0.140180001 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:06:02 np0005541913.localdomain podman[313677]: 2025-12-02 10:06:02.513150855 +0000 UTC m=+0.154338060 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:06:02 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:06:02 np0005541913.localdomain podman[313678]: 2025-12-02 10:06:02.568003413 +0000 UTC m=+0.206242319 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:06:02 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:06:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:03.050 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:06:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:03.050 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:06:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:03.051 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:06:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/209252438' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:06:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:06:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:06:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:06:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:06:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:06:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:06:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:06:04 np0005541913.localdomain ceph-mon[298296]: pgmap v204: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/840405198' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:06:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/568623931' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:06:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/354601536' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:06:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2210170190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:06:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2210170190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:06:05 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:05.630 2 INFO neutron.agent.securitygroups_rpc [None req-616a401d-f858-48e0-bbb1-73e58fa51cbe 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['1e6a52d4-a530-4d1c-b3c3-fd5c65190a35']
Dec 02 10:06:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:05.757 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:05 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:05.895 2 INFO neutron.agent.securitygroups_rpc [None req-7c216765-b201-4648-8f14-301becf47f8c 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['1e6a52d4-a530-4d1c-b3c3-fd5c65190a35']
Dec 02 10:06:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:05.965 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:06:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:06:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:06:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:06:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:06:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18763 "" "Go-http-client/1.1"
Dec 02 10:06:06 np0005541913.localdomain ceph-mon[298296]: pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:06 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:06.718 2 INFO neutron.agent.securitygroups_rpc [None req-008f26a2-2a9a-4275-8fd3-0db0ae3965dc 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:06 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:06.888 2 INFO neutron.agent.securitygroups_rpc [None req-2ee1f36b-3e28-45da-9995-f4334e4d09c3 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:07 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:07.147 2 INFO neutron.agent.securitygroups_rpc [None req-07b24d70-b40e-4b6b-a4d7-126ffe953c74 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:07 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:07.300 2 INFO neutron.agent.securitygroups_rpc [None req-44887692-98ed-4bee-8196-cf2b44c61f3b 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:07 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:07.453 2 INFO neutron.agent.securitygroups_rpc [None req-b501e38f-a705-4a3f-a758-0a1e958e6279 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:07 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:07.593 2 INFO neutron.agent.securitygroups_rpc [None req-7ad226a4-da1c-44ea-8953-97466b6b7a50 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:07 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:07.947 2 INFO neutron.agent.securitygroups_rpc [None req-45db658a-901d-430c-aa11-8109c9f781eb 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:08 np0005541913.localdomain ceph-mon[298296]: pgmap v206: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:08 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:08.149 2 INFO neutron.agent.securitygroups_rpc [None req-525377d0-23a2-43bc-9048-c1bbf6915b2f 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:08 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:08.365 2 INFO neutron.agent.securitygroups_rpc [None req-09e30383-7aff-4fc0-a180-6509153c799d 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:08.411 263406 INFO neutron.agent.linux.ip_lib [None req-42dd99b0-37bd-408c-958e-fcfef47e6cca - - - - - -] Device tap2dbcd8ec-20 cannot be used as it has no MAC address
Dec 02 10:06:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:08.435 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:08 np0005541913.localdomain kernel: device tap2dbcd8ec-20 entered promiscuous mode
Dec 02 10:06:08 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669968.4440] manager: (tap2dbcd8ec-20): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Dec 02 10:06:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:08.444 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:08Z|00198|binding|INFO|Claiming lport 2dbcd8ec-20c4-46b0-aa36-003343647b6f for this chassis.
Dec 02 10:06:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:08Z|00199|binding|INFO|2dbcd8ec-20c4-46b0-aa36-003343647b6f: Claiming unknown
Dec 02 10:06:08 np0005541913.localdomain systemd-udevd[313732]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:06:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:08.458 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-bac36584-1981-4cdd-a84e-a0acd6701163', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bac36584-1981-4cdd-a84e-a0acd6701163', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1273a829a21431083b7acd4fe017c0f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88083dc8-f50d-46ad-8d21-9b887e2c23b6, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=2dbcd8ec-20c4-46b0-aa36-003343647b6f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:08.460 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 2dbcd8ec-20c4-46b0-aa36-003343647b6f in datapath bac36584-1981-4cdd-a84e-a0acd6701163 bound to our chassis
Dec 02 10:06:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:08.462 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4ace7270-8d40-4f6c-9551-b4408b1d4a48 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:06:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:08.463 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bac36584-1981-4cdd-a84e-a0acd6701163, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:06:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:08.464 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[460ce80c-ecd8-445e-b99f-d2a2792cf326]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:08Z|00200|binding|INFO|Setting lport 2dbcd8ec-20c4-46b0-aa36-003343647b6f ovn-installed in OVS
Dec 02 10:06:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:08Z|00201|binding|INFO|Setting lport 2dbcd8ec-20c4-46b0-aa36-003343647b6f up in Southbound
Dec 02 10:06:08 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2dbcd8ec-20: No such device
Dec 02 10:06:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:08.477 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:08 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2dbcd8ec-20: No such device
Dec 02 10:06:08 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2dbcd8ec-20: No such device
Dec 02 10:06:08 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2dbcd8ec-20: No such device
Dec 02 10:06:08 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2dbcd8ec-20: No such device
Dec 02 10:06:08 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2dbcd8ec-20: No such device
Dec 02 10:06:08 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2dbcd8ec-20: No such device
Dec 02 10:06:08 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2dbcd8ec-20: No such device
Dec 02 10:06:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:08.515 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:08 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:08.541 2 INFO neutron.agent.securitygroups_rpc [None req-0d565c71-9045-4969-b270-4f682b986cb3 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['9b89faa6-0a2d-4787-9ca6-c2d15c18c0cd']
Dec 02 10:06:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:08.543 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:09.130 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:09 np0005541913.localdomain podman[313803]: 
Dec 02 10:06:09 np0005541913.localdomain podman[313803]: 2025-12-02 10:06:09.467451672 +0000 UTC m=+0.095535728 container create 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:06:09 np0005541913.localdomain systemd[1]: Started libpod-conmon-745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1.scope.
Dec 02 10:06:09 np0005541913.localdomain systemd[1]: tmp-crun.hAVMYS.mount: Deactivated successfully.
Dec 02 10:06:09 np0005541913.localdomain podman[313803]: 2025-12-02 10:06:09.420713961 +0000 UTC m=+0.048798007 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:06:09 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:06:09 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92d72d12a9f6e625de9b3e59f6b237633fb3ec1fad9bdbceba45fc659a116379/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:06:09 np0005541913.localdomain podman[313803]: 2025-12-02 10:06:09.550307688 +0000 UTC m=+0.178391664 container init 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:06:09 np0005541913.localdomain podman[313803]: 2025-12-02 10:06:09.564723104 +0000 UTC m=+0.192807080 container start 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:06:09 np0005541913.localdomain dnsmasq[313821]: started, version 2.85 cachesize 150
Dec 02 10:06:09 np0005541913.localdomain dnsmasq[313821]: DNS service limited to local subnets
Dec 02 10:06:09 np0005541913.localdomain dnsmasq[313821]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:06:09 np0005541913.localdomain dnsmasq[313821]: warning: no upstream servers configured
Dec 02 10:06:09 np0005541913.localdomain dnsmasq-dhcp[313821]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:06:09 np0005541913.localdomain dnsmasq[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/addn_hosts - 0 addresses
Dec 02 10:06:09 np0005541913.localdomain dnsmasq-dhcp[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/host
Dec 02 10:06:09 np0005541913.localdomain dnsmasq-dhcp[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/opts
Dec 02 10:06:09 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:09.682 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:06:09Z, description=, device_id=60f2c6f6-f230-49f9-b983-bd94d1e33602, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089924f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908992e20>], id=28ecbd52-e8a2-4b21-bab1-ec146a5b1ee7, ip_allocation=immediate, mac_address=fa:16:3e:9c:d8:b4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:06:06Z, description=, dns_domain=, id=bac36584-1981-4cdd-a84e-a0acd6701163, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestJSON-1838791269-network, port_security_enabled=True, project_id=b1273a829a21431083b7acd4fe017c0f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53975, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1088, status=ACTIVE, subnets=['1e0a4610-062f-4e3c-a349-111b09a25932'], tags=[], tenant_id=b1273a829a21431083b7acd4fe017c0f, updated_at=2025-12-02T10:06:06Z, vlan_transparent=None, network_id=bac36584-1981-4cdd-a84e-a0acd6701163, port_security_enabled=False, project_id=b1273a829a21431083b7acd4fe017c0f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1122, status=DOWN, tags=[], tenant_id=b1273a829a21431083b7acd4fe017c0f, updated_at=2025-12-02T10:06:09Z on network bac36584-1981-4cdd-a84e-a0acd6701163
Dec 02 10:06:09 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:09.731 2 INFO neutron.agent.securitygroups_rpc [None req-61a0e9aa-9477-43cf-af07-616f49d4b972 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['94ceebea-e233-4f36-9a23-49456abf3258']
Dec 02 10:06:09 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:09.775 263406 INFO neutron.agent.dhcp.agent [None req-846f888e-dfef-4c03-997c-92e8af513eed - - - - - -] DHCP configuration for ports {'8d7aa2a9-6ad1-4777-a763-cbbab2495c79'} is completed
Dec 02 10:06:09 np0005541913.localdomain dnsmasq[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/addn_hosts - 1 addresses
Dec 02 10:06:09 np0005541913.localdomain podman[313838]: 2025-12-02 10:06:09.912383014 +0000 UTC m=+0.062365559 container kill 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:06:09 np0005541913.localdomain dnsmasq-dhcp[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/host
Dec 02 10:06:09 np0005541913.localdomain dnsmasq-dhcp[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/opts
Dec 02 10:06:10 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:10.156 263406 INFO neutron.agent.dhcp.agent [None req-19b054e6-841b-49ab-b832-241cf2e7ed09 - - - - - -] DHCP configuration for ports {'28ecbd52-e8a2-4b21-bab1-ec146a5b1ee7'} is completed
Dec 02 10:06:10 np0005541913.localdomain ceph-mon[298296]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:10.792 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:10 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:10.822 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:06:09Z, description=, device_id=60f2c6f6-f230-49f9-b983-bd94d1e33602, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908ae50d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a0a2b0>], id=28ecbd52-e8a2-4b21-bab1-ec146a5b1ee7, ip_allocation=immediate, mac_address=fa:16:3e:9c:d8:b4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:06:06Z, description=, dns_domain=, id=bac36584-1981-4cdd-a84e-a0acd6701163, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestJSON-1838791269-network, port_security_enabled=True, project_id=b1273a829a21431083b7acd4fe017c0f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53975, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1088, status=ACTIVE, subnets=['1e0a4610-062f-4e3c-a349-111b09a25932'], tags=[], tenant_id=b1273a829a21431083b7acd4fe017c0f, updated_at=2025-12-02T10:06:06Z, vlan_transparent=None, network_id=bac36584-1981-4cdd-a84e-a0acd6701163, port_security_enabled=False, project_id=b1273a829a21431083b7acd4fe017c0f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1122, status=DOWN, tags=[], tenant_id=b1273a829a21431083b7acd4fe017c0f, updated_at=2025-12-02T10:06:09Z on network bac36584-1981-4cdd-a84e-a0acd6701163
Dec 02 10:06:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:10.967 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:11 np0005541913.localdomain dnsmasq[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/addn_hosts - 1 addresses
Dec 02 10:06:11 np0005541913.localdomain podman[313875]: 2025-12-02 10:06:11.09600255 +0000 UTC m=+0.061561588 container kill 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:06:11 np0005541913.localdomain dnsmasq-dhcp[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/host
Dec 02 10:06:11 np0005541913.localdomain dnsmasq-dhcp[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/opts
Dec 02 10:06:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:11 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:11.324 263406 INFO neutron.agent.dhcp.agent [None req-7e16ee11-e81d-4f19-baf8-a8ed7b0089cd - - - - - -] DHCP configuration for ports {'28ecbd52-e8a2-4b21-bab1-ec146a5b1ee7'} is completed
Dec 02 10:06:11 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:11.483 2 INFO neutron.agent.securitygroups_rpc [None req-a112143f-1afa-4cab-a314-c7a0cf01690b 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['bc246512-f2e7-49c6-b3c6-e51d67208518']
Dec 02 10:06:11 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:11.736 2 INFO neutron.agent.securitygroups_rpc [None req-2651c95b-dd09-4d1c-945d-8112466f351e 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['bc246512-f2e7-49c6-b3c6-e51d67208518']
Dec 02 10:06:11 np0005541913.localdomain ceph-mon[298296]: pgmap v208: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:12 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:12.928 2 INFO neutron.agent.securitygroups_rpc [None req-b3b89eff-a637-4d13-a86c-dcfc347ff722 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['482dba13-8db1-4254-a853-7fa4b3df0a8e']
Dec 02 10:06:13 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:13.105 2 INFO neutron.agent.securitygroups_rpc [None req-19b017d9-1c5d-463c-8fd1-6e01ac942ab6 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['482dba13-8db1-4254-a853-7fa4b3df0a8e']
Dec 02 10:06:13 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:13.582 2 INFO neutron.agent.securitygroups_rpc [None req-fdfabc3e-7f87-43e4-a533-8789992c1455 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['4bfb4e85-1f55-46a0-9d89-e38518cc2b18']
Dec 02 10:06:13 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:13.951 2 INFO neutron.agent.securitygroups_rpc [None req-64456867-c89f-4fe7-8d63-40c9e8a08ded e6f97ef89976422db171867e1c0c59f0 3f0966ca3eec4301b9d84b4543ff9fdf - - default default] Security group member updated ['a857935d-02ea-4e3d-98f4-258f4647959a']
Dec 02 10:06:14 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:14.035 2 INFO neutron.agent.securitygroups_rpc [None req-5a1a208f-7619-41f5-8ff5-57ae95d40ae9 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['4bfb4e85-1f55-46a0-9d89-e38518cc2b18']
Dec 02 10:06:14 np0005541913.localdomain ceph-mon[298296]: pgmap v209: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:14 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:14.235 2 INFO neutron.agent.securitygroups_rpc [None req-ba5eb572-7899-4268-a4d4-30f2b4a8108a 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['4bfb4e85-1f55-46a0-9d89-e38518cc2b18']
Dec 02 10:06:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:06:14 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:14.414 2 INFO neutron.agent.securitygroups_rpc [None req-9ba9d0cb-ece7-40eb-b408-fce3b926db2b e6f97ef89976422db171867e1c0c59f0 3f0966ca3eec4301b9d84b4543ff9fdf - - default default] Security group member updated ['a857935d-02ea-4e3d-98f4-258f4647959a']
Dec 02 10:06:14 np0005541913.localdomain podman[313896]: 2025-12-02 10:06:14.440084833 +0000 UTC m=+0.082869778 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:06:14 np0005541913.localdomain podman[313896]: 2025-12-02 10:06:14.450043769 +0000 UTC m=+0.092828744 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:06:14 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:14.449 2 INFO neutron.agent.securitygroups_rpc [None req-1dee4ff0-8cdb-4950-b86d-3d1f17272691 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['4bfb4e85-1f55-46a0-9d89-e38518cc2b18']
Dec 02 10:06:14 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:06:14 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:14.868 2 INFO neutron.agent.securitygroups_rpc [None req-c68f94c3-1ecf-4886-9e93-85ed57a0441f 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['4bfb4e85-1f55-46a0-9d89-e38518cc2b18']
Dec 02 10:06:15 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:15.566 2 INFO neutron.agent.securitygroups_rpc [None req-301242b3-4b9d-48c8-8e81-d6b06b4fcc41 e6f97ef89976422db171867e1c0c59f0 3f0966ca3eec4301b9d84b4543ff9fdf - - default default] Security group member updated ['a857935d-02ea-4e3d-98f4-258f4647959a']
Dec 02 10:06:15 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:15.600 2 INFO neutron.agent.securitygroups_rpc [None req-296124d6-ee7a-42c7-8fb1-fa352c7e4ccb 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['4bfb4e85-1f55-46a0-9d89-e38518cc2b18']
Dec 02 10:06:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:15.827 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:15.968 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.105 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceph-mon[298296]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.111 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8dc434e-dcce-493a-9b3e-8f9e1cf5db21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.107035', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a3966da-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': '664de14218e71f36c1887f2fec06607b374cddc1f08c57651a5be65ffd8c6bdf'}]}, 'timestamp': '2025-12-02 10:06:16.111983', '_unique_id': '5604fd87106545059086d741f070e4e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.113 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.115 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65f30223-114e-4b8c-bf76-4b72ff7a71e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.115152', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a39fb2c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': '0e34c1ac79cb05a330480108957e65990f9878479a69349b33aad1a96bc3d75c'}]}, 'timestamp': '2025-12-02 10:06:16.115771', '_unique_id': '9a1aac1d00914c26ab5be42f3f4896ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.143 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.143 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29f89972-430c-465b-bb06-864e69a77e9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.118173', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a3e4876-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': '79377979492dbf96d3e85ff1df8c13c55105b72a2a5a2ebe43ee42f78cbea2fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.118173', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a3e5e74-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': 'c7411a9835948f13b04cc4826ce4d4bc2f0e143f28b4c1a89c63cb7a64c7086d'}]}, 'timestamp': '2025-12-02 10:06:16.144462', '_unique_id': '3e56c7d4abea43dc9ee89426e50d63cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.145 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.147 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.147 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6505a724-4c3c-466d-aec0-f1ac792b1893', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.147100', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a3ed926-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': 'e2d93d6d1905d63d823ce9b697c93ae0cc4cef7c099371b2248b8b9f51bd9e00'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.147100', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a3eeb6e-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': 'f07218578cb1ba24e63603e052ff81f859d8ad3158d06c5e83c5439a6b5f2b26'}]}, 'timestamp': '2025-12-02 10:06:16.148097', '_unique_id': 'cfa238b391574207995cc2081afe25b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.150 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.150 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1a97c9a-2894-42ad-a529-7ee8ab47281b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.150419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a3f5b62-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': 'e2b8f43a3e7c173e5cfa91a3735c83bae59fd85e5b8e1c666ec5fbaab7a85bbf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.150419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a3f6bb6-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': '7112aca97c9d816bfed63bc4be5825514961d830ab20c50ccd4c419b1f5b49ed'}]}, 'timestamp': '2025-12-02 10:06:16.151338', '_unique_id': '4064233348724d86a5aa226636db0ef8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.153 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40f95c25-b220-43e3-aaa1-e992c85e8413', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.153676', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a3fd9ca-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': '4a54e05ded664a8e4f27897b1d395779dc31aa7d5a5e1f905044afd4ecff7427'}]}, 'timestamp': '2025-12-02 10:06:16.154188', '_unique_id': '6c038ab2af084d249f75c1b4422f2286'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.156 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fb39997-72ea-4edd-a990-4ba89b04bf96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.156451', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a404590-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': '117dcb3def9957b68dc9bf4990927b9fb489d9312904d2195c439434c01688b0'}]}, 'timestamp': '2025-12-02 10:06:16.156975', '_unique_id': '82da09e24c08429caad55ad1df77b9b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.168 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.169 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30b864c1-06dd-436e-b954-94609ba1508c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.159200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a4224d2-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.378301184, 'message_signature': '4e5e88c015ac8a18bf7806808223aa976f1363f69ea8727554f96cc95ffc098d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.159200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a423922-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.378301184, 'message_signature': '934b743e56e6e4d2391f2bb9143dc4eef947dee8466d3a23fe8aac48144badc2'}]}, 'timestamp': '2025-12-02 10:06:16.169751', '_unique_id': 'baf396672f6343caba9e093f9a631706'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.171 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.172 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.184 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 17080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b074786b-c75f-4ce1-afbc-cb0286ba37be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17080000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:06:16.172510', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8a448434-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.403117728, 'message_signature': '4bf19b91e1f1c06d935a05aec2ca3da04279e1e66aaa7327ca766b72a01008cf'}]}, 'timestamp': '2025-12-02 10:06:16.184840', '_unique_id': '370fa08b0cae43798a6159c762e8cf66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.187 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.187 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02c2a0b0-7464-481e-9ba2-e3a30db041b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:06:16.187424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8a44ff7c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.403117728, 'message_signature': 'd022cc7ca3350c8a6b119663e584eab82b98f09c33a0d49d1c2ebac06e770bfd'}]}, 'timestamp': '2025-12-02 10:06:16.187907', '_unique_id': '14770a9766dc456c91f5c545b1e952e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.190 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67357ef1-f4c8-405c-81c1-30a7681f4f5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.190669', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a458014-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': '14f2fcb25114c5171464d96940286ab29551426d5f7b18d12b1163d5bcfb4f29'}]}, 'timestamp': '2025-12-02 10:06:16.191256', '_unique_id': 'e5080024210145b888dd549ae58ea09d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.192 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.193 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.193 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dd7fd34-e0f4-4b3a-b248-c4a0fbe9fee8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.193426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a45e9b4-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': '3d94821860d3a0c60ea9cb3a0e40289793ee0d26de65daff35193d5596daa0fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.193426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a45fa26-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': '2a92d9afc2c21e86eadd5a71cd59d7783012ff1b7612e96fe88d2100906d5e7f'}]}, 'timestamp': '2025-12-02 10:06:16.194317', '_unique_id': '80d32e33f83140b4b46e12fbdfd2e06f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.195 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a891641-8d36-44e3-9f47-0402e26f35cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.196497', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a4661dc-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': 'e332853b3002f297b7e8681f6c8858e7aa72398a7c009061f265db146c4de062'}]}, 'timestamp': '2025-12-02 10:06:16.196990', '_unique_id': '5c94736b037f4328b44e366e04c1c3b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'deff5126-8e96-4fc5-a953-d5fbaa84acec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.199980', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a46e8f0-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': 'cb64a0ae6cf8c726f29bb6e109a3405d73f37dd8a5057629bb8a6ac4755bdd0b'}]}, 'timestamp': '2025-12-02 10:06:16.200450', '_unique_id': '2d7fe07ebf824183836f49b7b53f7c6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.202 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a080cdb-e0b7-46ed-852c-34c54181f773', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.202751', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a4754c0-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.378301184, 'message_signature': 'fcb44c45d97b3450f2fa9fe8cfdda4e1d08d70cdfb6c0fd3387c68042099782e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.202751', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a4764f6-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.378301184, 'message_signature': '9a55757f4096465058c8f38b4e31ec4dfcc1f78a2a3bfc9245962a9f0e014c9e'}]}, 'timestamp': '2025-12-02 10:06:16.203644', '_unique_id': '6e262f24d7a94f799c3bc6460a744473'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.205 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06f6b79c-9eb5-474a-b55a-21774a403621', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.205810', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a47cc20-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': '782b380354e73a905fbe62185942ea6147024c812457c08dd974292a9a4fa769'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.205810', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a47dcf6-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': 'ffe98361e5d1a936e05aa33138eac0e66e685e37fe1951ba26e99bd5b9c45f53'}]}, 'timestamp': '2025-12-02 10:06:16.206719', '_unique_id': '04f442ff0c5e45718c7e4de814141bf1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.209 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03dca642-ab21-4337-853f-65362af7671d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.209153', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a484f1a-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': '4821f8daceab2eb95f7d07c04fba5b45368fb86df502e9b09f409c8b5b652714'}]}, 'timestamp': '2025-12-02 10:06:16.209655', '_unique_id': '2b8df72cc37843f9842d20b59e231d98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ed1f541-d53b-4eda-a574-b373a5ea2d5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.212143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a48c36e-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.378301184, 'message_signature': 'ca6e337537a1e876c6e2c3491e3206df475808d08f0edd3bfbc9d9351bb4f924'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.212143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a48d520-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.378301184, 'message_signature': '9aa776524eeb2727da4e5e8d975978b77a58d10afdd1848e82fffd5e33ed5b82'}]}, 'timestamp': '2025-12-02 10:06:16.213021', '_unique_id': '07c3536820e349d280ad32f22cb366b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.214 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bca26634-4ec8-4e06-9f9a-58601f831bf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:06:16.214941', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8a492d90-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': '022d90360ebb026d833f0a9c8400dad83beb9080b0863dbf6fe5ce2b4cca2078'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:06:16.214941', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8a49374a-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.337273047, 'message_signature': '075af2b68cc51ba493f5066cd1f7e01cdc42d4d6c9fe388b159c13f5a3253276'}]}, 'timestamp': '2025-12-02 10:06:16.215455', '_unique_id': '2c31d7544b35409daaecbc7fb6482ad1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.216 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d7b858b-7979-4389-b319-5eb627089e52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.216779', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a49757a-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': '58822057fdddc3d3eddf908515fa11526ae1fa9374b4d115bb796b1471e5844a'}]}, 'timestamp': '2025-12-02 10:06:16.217064', '_unique_id': '84c93caf711d459ca1dfe9aa594da345'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.218 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '291f535a-d0a2-4702-b6a3-92f67478513d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:06:16.218352', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '8a49b2ec-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12138.326113138, 'message_signature': 'add63f74528b0413dc717d779895a1a4d48bc8551318cd3785f529bf8f17b090'}]}, 'timestamp': '2025-12-02 10:06:16.218665', '_unique_id': '1053af5e96994cc381117442e3870bdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:06:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:06:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:06:17 np0005541913.localdomain dnsmasq[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/addn_hosts - 0 addresses
Dec 02 10:06:17 np0005541913.localdomain dnsmasq-dhcp[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/host
Dec 02 10:06:17 np0005541913.localdomain dnsmasq-dhcp[313821]: read /var/lib/neutron/dhcp/bac36584-1981-4cdd-a84e-a0acd6701163/opts
Dec 02 10:06:17 np0005541913.localdomain podman[313932]: 2025-12-02 10:06:17.05544866 +0000 UTC m=+0.066166911 container kill 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:06:17 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:17.247 263406 INFO neutron.agent.linux.ip_lib [None req-d41693c2-6454-415e-b4b8-a89e8f4d47b0 - - - - - -] Device tap9452ee28-23 cannot be used as it has no MAC address
Dec 02 10:06:17 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:17.264 2 INFO neutron.agent.securitygroups_rpc [None req-bb5bb253-d3fa-4182-a08a-6c77d73857f6 07e8b8b380b44de1bc08a311f30e4dd1 4ae019f3db6248368641f9ff9e7acce4 - - default default] Security group rule updated ['dc8aaaaf-7a11-4a4d-8334-5511e0a6c147']
Dec 02 10:06:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:17.273 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:17 np0005541913.localdomain kernel: device tap9452ee28-23 entered promiscuous mode
Dec 02 10:06:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:17.278 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:17 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669977.2792] manager: (tap9452ee28-23): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Dec 02 10:06:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:17Z|00202|binding|INFO|Claiming lport 9452ee28-2385-4409-9420-aba511c252a5 for this chassis.
Dec 02 10:06:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:17Z|00203|binding|INFO|9452ee28-2385-4409-9420-aba511c252a5: Claiming unknown
Dec 02 10:06:17 np0005541913.localdomain systemd-udevd[313963]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.286 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.243/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-45b393b4-6935-41d7-9b33-e0a50bae89a0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45b393b4-6935-41d7-9b33-e0a50bae89a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d4d3b48096450197194eed29ad68df', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1555f2d-4a9f-4453-a712-6d4c971353c9, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=9452ee28-2385-4409-9420-aba511c252a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.287 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 9452ee28-2385-4409-9420-aba511c252a5 in datapath 45b393b4-6935-41d7-9b33-e0a50bae89a0 bound to our chassis
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.290 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 144a91f0-4a98-4ccb-bad3-9780bb2aa0f5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.290 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45b393b4-6935-41d7-9b33-e0a50bae89a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.291 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3329a6-cc03-4f4d-a456-97d89cc8d5b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:17Z|00204|binding|INFO|Setting lport 9452ee28-2385-4409-9420-aba511c252a5 ovn-installed in OVS
Dec 02 10:06:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:17Z|00205|binding|INFO|Setting lport 9452ee28-2385-4409-9420-aba511c252a5 up in Southbound
Dec 02 10:06:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:17.334 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:17Z|00206|binding|INFO|Releasing lport 2dbcd8ec-20c4-46b0-aa36-003343647b6f from this chassis (sb_readonly=0)
Dec 02 10:06:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:17Z|00207|binding|INFO|Setting lport 2dbcd8ec-20c4-46b0-aa36-003343647b6f down in Southbound
Dec 02 10:06:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:17.343 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.351 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-bac36584-1981-4cdd-a84e-a0acd6701163', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bac36584-1981-4cdd-a84e-a0acd6701163', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1273a829a21431083b7acd4fe017c0f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88083dc8-f50d-46ad-8d21-9b887e2c23b6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=2dbcd8ec-20c4-46b0-aa36-003343647b6f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:17 np0005541913.localdomain kernel: device tap2dbcd8ec-20 left promiscuous mode
Dec 02 10:06:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:17.353 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.354 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 2dbcd8ec-20c4-46b0-aa36-003343647b6f in datapath bac36584-1981-4cdd-a84e-a0acd6701163 unbound from our chassis
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.356 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bac36584-1981-4cdd-a84e-a0acd6701163, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:06:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:17.357 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[e660699b-d4ff-4321-bf9c-216723c70bbf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:17.366 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:17.371 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:17.389 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:17 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:17.419 2 INFO neutron.agent.securitygroups_rpc [None req-c73fb2ed-db0d-4633-bf8b-3646a66fbf65 e6f97ef89976422db171867e1c0c59f0 3f0966ca3eec4301b9d84b4543ff9fdf - - default default] Security group member updated ['a857935d-02ea-4e3d-98f4-258f4647959a']
Dec 02 10:06:18 np0005541913.localdomain ceph-mon[298296]: pgmap v211: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:18 np0005541913.localdomain podman[314019]: 
Dec 02 10:06:18 np0005541913.localdomain podman[314019]: 2025-12-02 10:06:18.218589437 +0000 UTC m=+0.092379272 container create 99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45b393b4-6935-41d7-9b33-e0a50bae89a0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:06:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:06:18 np0005541913.localdomain systemd[1]: Started libpod-conmon-99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818.scope.
Dec 02 10:06:18 np0005541913.localdomain podman[314019]: 2025-12-02 10:06:18.171703874 +0000 UTC m=+0.045493739 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:06:18 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:06:18 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1bc83d9e759505b4e4211d0f2aa7f5dccd534965c1725514fdb16919fad32d11/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:06:18 np0005541913.localdomain podman[314019]: 2025-12-02 10:06:18.358380067 +0000 UTC m=+0.232169892 container init 99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45b393b4-6935-41d7-9b33-e0a50bae89a0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:06:18 np0005541913.localdomain podman[314019]: 2025-12-02 10:06:18.37079935 +0000 UTC m=+0.244589175 container start 99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45b393b4-6935-41d7-9b33-e0a50bae89a0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:06:18 np0005541913.localdomain dnsmasq[314056]: started, version 2.85 cachesize 150
Dec 02 10:06:18 np0005541913.localdomain dnsmasq[314056]: DNS service limited to local subnets
Dec 02 10:06:18 np0005541913.localdomain dnsmasq[314056]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:06:18 np0005541913.localdomain dnsmasq[314056]: warning: no upstream servers configured
Dec 02 10:06:18 np0005541913.localdomain dnsmasq-dhcp[314056]: DHCP, static leases only on 10.100.255.240, lease time 1d
Dec 02 10:06:18 np0005541913.localdomain dnsmasq[314056]: read /var/lib/neutron/dhcp/45b393b4-6935-41d7-9b33-e0a50bae89a0/addn_hosts - 0 addresses
Dec 02 10:06:18 np0005541913.localdomain dnsmasq-dhcp[314056]: read /var/lib/neutron/dhcp/45b393b4-6935-41d7-9b33-e0a50bae89a0/host
Dec 02 10:06:18 np0005541913.localdomain dnsmasq-dhcp[314056]: read /var/lib/neutron/dhcp/45b393b4-6935-41d7-9b33-e0a50bae89a0/opts
Dec 02 10:06:18 np0005541913.localdomain podman[314033]: 2025-12-02 10:06:18.356763964 +0000 UTC m=+0.094204061 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:06:18 np0005541913.localdomain podman[314033]: 2025-12-02 10:06:18.444216534 +0000 UTC m=+0.181656621 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:06:18 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:06:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:18.529 263406 INFO neutron.agent.dhcp.agent [None req-8f1cfe6e-4a40-4fe0-af18-ad87f9287eb7 - - - - - -] DHCP configuration for ports {'d64539fc-1ec9-4bb1-9f63-20bffe6b7020'} is completed
Dec 02 10:06:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:19Z|00208|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:06:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:19.680 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:20 np0005541913.localdomain dnsmasq[313821]: exiting on receipt of SIGTERM
Dec 02 10:06:20 np0005541913.localdomain podman[314074]: 2025-12-02 10:06:20.122741319 +0000 UTC m=+0.059009970 container kill 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: libpod-745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1.scope: Deactivated successfully.
Dec 02 10:06:20 np0005541913.localdomain ceph-mon[298296]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:20 np0005541913.localdomain podman[314094]: 2025-12-02 10:06:20.186579367 +0000 UTC m=+0.036379574 container died 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: tmp-crun.OeU36P.mount: Deactivated successfully.
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-92d72d12a9f6e625de9b3e59f6b237633fb3ec1fad9bdbceba45fc659a116379-merged.mount: Deactivated successfully.
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1-userdata-shm.mount: Deactivated successfully.
Dec 02 10:06:20 np0005541913.localdomain podman[314094]: 2025-12-02 10:06:20.23640892 +0000 UTC m=+0.086209117 container remove 745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bac36584-1981-4cdd-a84e-a0acd6701163, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: libpod-conmon-745bd6ed97c2a0850c3eb5ac0df955d5f986d06ceaa8165ec71dc925a2aa2aa1.scope: Deactivated successfully.
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dbac36584\x2d1981\x2d4cdd\x2da84e\x2da0acd6701163.mount: Deactivated successfully.
Dec 02 10:06:20 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:20.262 263406 INFO neutron.agent.dhcp.agent [None req-3f8b7e1e-35d8-46c4-8281-4a5160107d7f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:06:20 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:20.266 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:06:20 np0005541913.localdomain podman[314095]: 2025-12-02 10:06:20.297433862 +0000 UTC m=+0.145923455 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 10:06:20 np0005541913.localdomain podman[314095]: 2025-12-02 10:06:20.333695122 +0000 UTC m=+0.182184795 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter)
Dec 02 10:06:20 np0005541913.localdomain podman[314097]: 2025-12-02 10:06:20.347740618 +0000 UTC m=+0.192883080 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:06:20 np0005541913.localdomain podman[314097]: 2025-12-02 10:06:20.360024197 +0000 UTC m=+0.205166719 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:06:20 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:06:20 np0005541913.localdomain sudo[314156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:06:20 np0005541913.localdomain sudo[314156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:06:20 np0005541913.localdomain sudo[314156]: pam_unix(sudo:session): session closed for user root
Dec 02 10:06:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:20.830 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:20 np0005541913.localdomain sudo[314174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:06:20 np0005541913.localdomain sudo[314174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:06:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:20.969 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:21 np0005541913.localdomain sudo[314174]: pam_unix(sudo:session): session closed for user root
Dec 02 10:06:21 np0005541913.localdomain sudo[314223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:06:21 np0005541913.localdomain sudo[314223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:06:21 np0005541913.localdomain sudo[314223]: pam_unix(sudo:session): session closed for user root
Dec 02 10:06:21 np0005541913.localdomain ceph-mon[298296]: pgmap v213: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:06:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:06:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:06:21 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:06:23 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:06:24 np0005541913.localdomain ceph-mon[298296]: pgmap v214: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:24 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:24.205 263406 INFO neutron.agent.linux.ip_lib [None req-bfa62cba-e49b-4769-89da-990f37989974 - - - - - -] Device tap2ac07a58-29 cannot be used as it has no MAC address
Dec 02 10:06:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:24.231 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:24 np0005541913.localdomain kernel: device tap2ac07a58-29 entered promiscuous mode
Dec 02 10:06:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:24Z|00209|binding|INFO|Claiming lport 2ac07a58-2990-406d-9523-fa045a2131dc for this chassis.
Dec 02 10:06:24 np0005541913.localdomain NetworkManager[5965]: <info>  [1764669984.2417] manager: (tap2ac07a58-29): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Dec 02 10:06:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:24.241 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:24Z|00210|binding|INFO|2ac07a58-2990-406d-9523-fa045a2131dc: Claiming unknown
Dec 02 10:06:24 np0005541913.localdomain systemd-udevd[314251]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:06:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2ac07a58-29: No such device
Dec 02 10:06:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:24Z|00211|binding|INFO|Setting lport 2ac07a58-2990-406d-9523-fa045a2131dc ovn-installed in OVS
Dec 02 10:06:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:24Z|00212|binding|INFO|Setting lport 2ac07a58-2990-406d-9523-fa045a2131dc up in Southbound
Dec 02 10:06:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2ac07a58-29: No such device
Dec 02 10:06:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:24.326 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-fe22b18d-d633-497b-bdda-39e8c539a772', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe22b18d-d633-497b-bdda-39e8c539a772', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d4d3b48096450197194eed29ad68df', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af4132dc-7b0a-4214-baa9-f7da22b4f1e1, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=2ac07a58-2990-406d-9523-fa045a2131dc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:24.325 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:24.327 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 2ac07a58-2990-406d-9523-fa045a2131dc in datapath fe22b18d-d633-497b-bdda-39e8c539a772 bound to our chassis
Dec 02 10:06:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:24.328 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fe22b18d-d633-497b-bdda-39e8c539a772 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:06:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:24.329 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[d99fde58-3075-4469-8df5-6b2ffb4ef2db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2ac07a58-29: No such device
Dec 02 10:06:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2ac07a58-29: No such device
Dec 02 10:06:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2ac07a58-29: No such device
Dec 02 10:06:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2ac07a58-29: No such device
Dec 02 10:06:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:24.345 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2ac07a58-29: No such device
Dec 02 10:06:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap2ac07a58-29: No such device
Dec 02 10:06:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:24.365 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:25 np0005541913.localdomain podman[314321]: 
Dec 02 10:06:25 np0005541913.localdomain podman[314321]: 2025-12-02 10:06:25.114551314 +0000 UTC m=+0.085077546 container create 1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe22b18d-d633-497b-bdda-39e8c539a772, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:06:25 np0005541913.localdomain systemd[1]: Started libpod-conmon-1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf.scope.
Dec 02 10:06:25 np0005541913.localdomain podman[314321]: 2025-12-02 10:06:25.072196961 +0000 UTC m=+0.042723233 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:06:25 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:25Z|00213|binding|INFO|Removing iface tap2ac07a58-29 ovn-installed in OVS
Dec 02 10:06:25 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:25Z|00214|binding|INFO|Removing lport 2ac07a58-2990-406d-9523-fa045a2131dc ovn-installed in OVS
Dec 02 10:06:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:25.178 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:25 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:25.176 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 001be18f-4e9c-45ce-91c6-140e331db5ae with type ""
Dec 02 10:06:25 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:25.181 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-fe22b18d-d633-497b-bdda-39e8c539a772', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe22b18d-d633-497b-bdda-39e8c539a772', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d4d3b48096450197194eed29ad68df', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af4132dc-7b0a-4214-baa9-f7da22b4f1e1, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=2ac07a58-2990-406d-9523-fa045a2131dc) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:25.185 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:25 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:25.186 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 2ac07a58-2990-406d-9523-fa045a2131dc in datapath fe22b18d-d633-497b-bdda-39e8c539a772 unbound from our chassis
Dec 02 10:06:25 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:25.188 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fe22b18d-d633-497b-bdda-39e8c539a772 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:06:25 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:25.189 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[127ce47f-100e-4618-a994-a6c2345ca6ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:25 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:06:25 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41f4087dc34d8c0d48ea5d346f07ce2299945782a2fd8fa7e387242ee05d14e4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:06:25 np0005541913.localdomain podman[314321]: 2025-12-02 10:06:25.209352461 +0000 UTC m=+0.179878693 container init 1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe22b18d-d633-497b-bdda-39e8c539a772, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:06:25 np0005541913.localdomain podman[314321]: 2025-12-02 10:06:25.21870075 +0000 UTC m=+0.189226992 container start 1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe22b18d-d633-497b-bdda-39e8c539a772, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:06:25 np0005541913.localdomain dnsmasq[314339]: started, version 2.85 cachesize 150
Dec 02 10:06:25 np0005541913.localdomain dnsmasq[314339]: DNS service limited to local subnets
Dec 02 10:06:25 np0005541913.localdomain dnsmasq[314339]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:06:25 np0005541913.localdomain dnsmasq[314339]: warning: no upstream servers configured
Dec 02 10:06:25 np0005541913.localdomain dnsmasq-dhcp[314339]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:06:25 np0005541913.localdomain dnsmasq[314339]: read /var/lib/neutron/dhcp/fe22b18d-d633-497b-bdda-39e8c539a772/addn_hosts - 0 addresses
Dec 02 10:06:25 np0005541913.localdomain dnsmasq-dhcp[314339]: read /var/lib/neutron/dhcp/fe22b18d-d633-497b-bdda-39e8c539a772/host
Dec 02 10:06:25 np0005541913.localdomain dnsmasq-dhcp[314339]: read /var/lib/neutron/dhcp/fe22b18d-d633-497b-bdda-39e8c539a772/opts
Dec 02 10:06:25 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:25Z|00215|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:06:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:25.375 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:25.402 263406 INFO neutron.agent.dhcp.agent [None req-9c835e05-0daa-4f97-808f-ddaedf82ed46 - - - - - -] DHCP configuration for ports {'a54abf5d-e342-4c56-9ad6-39e109e70c55'} is completed
Dec 02 10:06:25 np0005541913.localdomain systemd[1]: tmp-crun.zl9R65.mount: Deactivated successfully.
Dec 02 10:06:25 np0005541913.localdomain dnsmasq[314339]: exiting on receipt of SIGTERM
Dec 02 10:06:25 np0005541913.localdomain podman[314356]: 2025-12-02 10:06:25.591162425 +0000 UTC m=+0.068581565 container kill 1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe22b18d-d633-497b-bdda-39e8c539a772, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 10:06:25 np0005541913.localdomain systemd[1]: libpod-1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf.scope: Deactivated successfully.
Dec 02 10:06:25 np0005541913.localdomain podman[314368]: 2025-12-02 10:06:25.659645228 +0000 UTC m=+0.054059428 container died 1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe22b18d-d633-497b-bdda-39e8c539a772, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:06:25 np0005541913.localdomain podman[314368]: 2025-12-02 10:06:25.690180454 +0000 UTC m=+0.084594594 container cleanup 1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe22b18d-d633-497b-bdda-39e8c539a772, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:06:25 np0005541913.localdomain systemd[1]: libpod-conmon-1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf.scope: Deactivated successfully.
Dec 02 10:06:25 np0005541913.localdomain podman[314370]: 2025-12-02 10:06:25.750023885 +0000 UTC m=+0.136493013 container remove 1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe22b18d-d633-497b-bdda-39e8c539a772, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:06:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:06:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 2144 writes, 23K keys, 2144 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.07 MB/s
                                                           Cumulative WAL: 2144 writes, 2144 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2144 writes, 23K keys, 2144 commit groups, 1.0 writes per commit group, ingest: 41.34 MB, 0.07 MB/s
                                                           Interval WAL: 2144 writes, 2144 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    113.2      0.26              0.08         9    0.029       0      0       0.0       0.0
                                                             L6      1/0   15.87 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5    180.7    163.6      0.81              0.33         8    0.101     98K   3974       0.0       0.0
                                                            Sum      1/0   15.87 MB   0.0      0.1     0.0      0.1       0.2      0.0       0.0   5.5    136.4    151.2      1.07              0.41        17    0.063     98K   3974       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.2      0.0       0.0   5.5    138.9    154.0      1.05              0.41        16    0.066     98K   3974       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    180.7    163.6      0.81              0.33         8    0.101     98K   3974       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    122.2      0.24              0.08         8    0.030       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.019       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.029, interval 0.029
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.16 GB write, 0.27 MB/s write, 0.14 GB read, 0.24 MB/s read, 1.1 seconds
                                                           Interval compaction: 0.16 GB write, 0.27 MB/s write, 0.14 GB read, 0.24 MB/s read, 1.1 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x563183c47350#2 capacity: 308.00 MB usage: 29.19 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.0003 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1959,28.51 MB,9.25684%) FilterBlock(17,302.48 KB,0.0959074%) IndexBlock(17,388.86 KB,0.123294%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 02 10:06:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:25.762 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:25 np0005541913.localdomain kernel: device tap2ac07a58-29 left promiscuous mode
Dec 02 10:06:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:25.777 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:25.798 263406 INFO neutron.agent.dhcp.agent [None req-f6b26f45-c4f5-43f1-922f-9dd07a2c45a0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:06:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:25.799 263406 INFO neutron.agent.dhcp.agent [None req-f6b26f45-c4f5-43f1-922f-9dd07a2c45a0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:06:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:25.831 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:25.971 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-41f4087dc34d8c0d48ea5d346f07ce2299945782a2fd8fa7e387242ee05d14e4-merged.mount: Deactivated successfully.
Dec 02 10:06:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b3f857c124e3f98309a9f8e87b2b956861e7ec8434d2c564a7929dbf05455bf-userdata-shm.mount: Deactivated successfully.
Dec 02 10:06:26 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dfe22b18d\x2dd633\x2d497b\x2dbdda\x2d39e8c539a772.mount: Deactivated successfully.
Dec 02 10:06:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:26 np0005541913.localdomain ceph-mon[298296]: pgmap v215: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:26 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:26.457 2 INFO neutron.agent.securitygroups_rpc [None req-55db991b-3b52-42dd-b07a-80ab3fefc470 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['72de153d-340c-4642-ae21-72dcd91d8ceb']
Dec 02 10:06:26 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:26.625 2 INFO neutron.agent.securitygroups_rpc [None req-3231b6f4-a459-453f-8b66-50672725d94d ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['72de153d-340c-4642-ae21-72dcd91d8ceb']
Dec 02 10:06:27 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:27.732 2 INFO neutron.agent.securitygroups_rpc [None req-c8b7833a-90ae-4c85-a76b-1cc28b8de3de ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:28 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:28.064 2 INFO neutron.agent.securitygroups_rpc [None req-7e69b2ff-a5bb-4d6e-b782-47cf6529c7b2 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:28 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:28.202 2 INFO neutron.agent.securitygroups_rpc [None req-01786b72-4967-4664-b0a8-c04c5ee043aa ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:06:28 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:28.411 2 INFO neutron.agent.securitygroups_rpc [None req-660a054c-2d13-4375-bdeb-e4f5c9661010 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:28 np0005541913.localdomain ceph-mon[298296]: pgmap v216: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:28 np0005541913.localdomain podman[314397]: 2025-12-02 10:06:28.45531884 +0000 UTC m=+0.090932164 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:06:28 np0005541913.localdomain podman[314397]: 2025-12-02 10:06:28.494991291 +0000 UTC m=+0.130604625 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:06:28 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:06:28 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:28.601 2 INFO neutron.agent.securitygroups_rpc [None req-62dae022-f3c9-4be5-80ba-7d0557eedc03 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:28 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:28.890 2 INFO neutron.agent.securitygroups_rpc [None req-ec75e4da-10e7-4730-8e6b-71c48e471048 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:29 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:29.257 2 INFO neutron.agent.securitygroups_rpc [None req-405af18e-31af-407d-8854-f380b293accc ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:29 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:29.526 2 INFO neutron.agent.securitygroups_rpc [None req-f8e3c35c-c137-4121-a943-a4b83494d8a2 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:30 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:30.179 2 INFO neutron.agent.securitygroups_rpc [None req-9f2bf60d-db80-42ad-806a-1445118d8a03 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:30 np0005541913.localdomain ceph-mon[298296]: pgmap v217: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:30 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:30.785 2 INFO neutron.agent.securitygroups_rpc [None req-9a1f4b78-6988-4ca6-b0f0-b52a2438af33 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['df2c54da-38ba-4edc-acd1-4c6b2da63f7a']
Dec 02 10:06:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:30.833 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:30.973 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:31 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:31.764 2 INFO neutron.agent.securitygroups_rpc [None req-1c27ab60-bacc-4e9e-b1c9-d6f9cf0e1b32 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['8ebd5526-cfd6-4dd0-8888-3d40098feb1a']
Dec 02 10:06:32 np0005541913.localdomain ceph-mon[298296]: pgmap v218: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:33 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/705508170' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:06:33 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/705508170' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:06:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:06:33 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:33.365 2 INFO neutron.agent.securitygroups_rpc [None req-ec9364dd-dd89-44e2-a668-097e6474f1a7 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['f835d0d9-69c7-416b-b19f-71e98abbea19']
Dec 02 10:06:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:06:33 np0005541913.localdomain podman[314415]: 2025-12-02 10:06:33.448407369 +0000 UTC m=+0.073143637 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 02 10:06:33 np0005541913.localdomain podman[314414]: 2025-12-02 10:06:33.501272794 +0000 UTC m=+0.127542254 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:06:33 np0005541913.localdomain podman[314415]: 2025-12-02 10:06:33.518017412 +0000 UTC m=+0.142753620 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:06:33 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:06:33 np0005541913.localdomain podman[314414]: 2025-12-02 10:06:33.539167087 +0000 UTC m=+0.165436567 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:06:33 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:06:33 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:33.625 2 INFO neutron.agent.securitygroups_rpc [None req-37778547-7b0e-4196-bc87-08fdc55b8adf ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['f835d0d9-69c7-416b-b19f-71e98abbea19']
Dec 02 10:06:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:33.900 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:33.901 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:06:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:33.908 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:06:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:06:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:06:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:06:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:06:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:06:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:06:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:06:34 np0005541913.localdomain ceph-mon[298296]: pgmap v219: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:35.837 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:35.975 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:06:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:06:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:06:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156111 "" "Go-http-client/1.1"
Dec 02 10:06:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:06:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19237 "" "Go-http-client/1.1"
Dec 02 10:06:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:36.119 2 INFO neutron.agent.securitygroups_rpc [None req-4d405741-5ff5-4de3-bee3-cffdae397b25 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['a13484c0-648a-48f0-a8cb-29cdca97e066']
Dec 02 10:06:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:36 np0005541913.localdomain ceph-mon[298296]: pgmap v220: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:36.314 2 INFO neutron.agent.securitygroups_rpc [None req-b1eefb6d-0b2e-4576-bbe4-d313eb7d9799 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['a13484c0-648a-48f0-a8cb-29cdca97e066']
Dec 02 10:06:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:36.902 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:06:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:36.983 2 INFO neutron.agent.securitygroups_rpc [None req-9e362cac-bb61-4369-ad00-9f073d908c17 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['0d1aa800-00f4-4e0d-be41-caba26c873bd']
Dec 02 10:06:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e115 e115: 6 total, 6 up, 6 in
Dec 02 10:06:37 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:37.404 2 INFO neutron.agent.securitygroups_rpc [None req-24128e00-40c9-486f-b5be-6d4fcff90c40 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['0d1aa800-00f4-4e0d-be41-caba26c873bd']
Dec 02 10:06:37 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:37.944 2 INFO neutron.agent.securitygroups_rpc [None req-c72c5dd9-5995-4df5-a6ad-e067a8fdaf10 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['0d1aa800-00f4-4e0d-be41-caba26c873bd']
Dec 02 10:06:38 np0005541913.localdomain ceph-mon[298296]: pgmap v221: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:06:38 np0005541913.localdomain ceph-mon[298296]: osdmap e115: 6 total, 6 up, 6 in
Dec 02 10:06:38 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:38.315 2 INFO neutron.agent.securitygroups_rpc [None req-1079cc4d-64e8-4687-8a8b-22fa9980bbe7 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['0d1aa800-00f4-4e0d-be41-caba26c873bd']
Dec 02 10:06:38 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:38.599 2 INFO neutron.agent.securitygroups_rpc [None req-37425f6d-3678-4ce3-8643-3e657bae2eff ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['0d1aa800-00f4-4e0d-be41-caba26c873bd']
Dec 02 10:06:38 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:38.920 2 INFO neutron.agent.securitygroups_rpc [None req-5dbcc1aa-f594-4336-8b87-6f8ec002ed0b ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['0d1aa800-00f4-4e0d-be41-caba26c873bd']
Dec 02 10:06:39 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e116 e116: 6 total, 6 up, 6 in
Dec 02 10:06:40 np0005541913.localdomain ceph-mon[298296]: pgmap v223: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 1.6 KiB/s wr, 14 op/s
Dec 02 10:06:40 np0005541913.localdomain ceph-mon[298296]: osdmap e116: 6 total, 6 up, 6 in
Dec 02 10:06:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:40.841 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:40 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:06:40.868 2 INFO neutron.agent.securitygroups_rpc [None req-bbaaa1af-170e-49f9-ab51-fca299624b09 ed3b4dffeb0d4a4f93cbf0470d2fba06 1512807ff8de4caaab2cbe4666784e7d - - default default] Security group rule updated ['ef31c17a-e7e0-47e3-9c93-83c68ae18a93']
Dec 02 10:06:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:40.976 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:42 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e117 e117: 6 total, 6 up, 6 in
Dec 02 10:06:42 np0005541913.localdomain ceph-mon[298296]: pgmap v225: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Dec 02 10:06:43 np0005541913.localdomain ceph-mon[298296]: osdmap e117: 6 total, 6 up, 6 in
Dec 02 10:06:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:43.233 263406 INFO neutron.agent.linux.ip_lib [None req-a7dd135d-703c-48ef-ab14-fbb77957ada6 - - - - - -] Device tap9a3b1032-b9 cannot be used as it has no MAC address
Dec 02 10:06:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:43.255 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:43 np0005541913.localdomain kernel: device tap9a3b1032-b9 entered promiscuous mode
Dec 02 10:06:43 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670003.2643] manager: (tap9a3b1032-b9): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Dec 02 10:06:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:43.263 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:43Z|00216|binding|INFO|Claiming lport 9a3b1032-b966-42f6-bd53-33e478407e4c for this chassis.
Dec 02 10:06:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:43Z|00217|binding|INFO|9a3b1032-b966-42f6-bd53-33e478407e4c: Claiming unknown
Dec 02 10:06:43 np0005541913.localdomain systemd-udevd[314473]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:06:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:43.279 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-0add6e7f-ff15-4bda-be2f-1e656e720929', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0add6e7f-ff15-4bda-be2f-1e656e720929', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d4d3b48096450197194eed29ad68df', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd99a7fe-e968-4c3f-96c7-184bb6138ed4, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=9a3b1032-b966-42f6-bd53-33e478407e4c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:43.280 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 9a3b1032-b966-42f6-bd53-33e478407e4c in datapath 0add6e7f-ff15-4bda-be2f-1e656e720929 bound to our chassis
Dec 02 10:06:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:43.281 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0add6e7f-ff15-4bda-be2f-1e656e720929 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:06:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:43.282 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ef29d26c-0e09-4af3-9935-79a405885c23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9a3b1032-b9: No such device
Dec 02 10:06:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:43.295 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:43Z|00218|binding|INFO|Setting lport 9a3b1032-b966-42f6-bd53-33e478407e4c ovn-installed in OVS
Dec 02 10:06:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:43Z|00219|binding|INFO|Setting lport 9a3b1032-b966-42f6-bd53-33e478407e4c up in Southbound
Dec 02 10:06:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:43.301 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9a3b1032-b9: No such device
Dec 02 10:06:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:43.304 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9a3b1032-b9: No such device
Dec 02 10:06:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9a3b1032-b9: No such device
Dec 02 10:06:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9a3b1032-b9: No such device
Dec 02 10:06:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9a3b1032-b9: No such device
Dec 02 10:06:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9a3b1032-b9: No such device
Dec 02 10:06:43 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9a3b1032-b9: No such device
Dec 02 10:06:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:43.339 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:43.363 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e118 e118: 6 total, 6 up, 6 in
Dec 02 10:06:44 np0005541913.localdomain ceph-mon[298296]: pgmap v227: 177 pgs: 177 active+clean; 209 MiB data, 922 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 11 MiB/s wr, 136 op/s
Dec 02 10:06:45 np0005541913.localdomain podman[314542]: 
Dec 02 10:06:45 np0005541913.localdomain podman[314542]: 2025-12-02 10:06:45.053012835 +0000 UTC m=+0.075170851 container create 783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:06:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:06:45 np0005541913.localdomain systemd[1]: Started libpod-conmon-783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba.scope.
Dec 02 10:06:45 np0005541913.localdomain systemd[1]: tmp-crun.V9Ac3E.mount: Deactivated successfully.
Dec 02 10:06:45 np0005541913.localdomain podman[314542]: 2025-12-02 10:06:45.006468691 +0000 UTC m=+0.028626717 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:06:45 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:06:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b1be98159917b52c833b727e30d0192370db67304a645d168e02eafe4efdd29/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:06:45 np0005541913.localdomain ceph-mon[298296]: osdmap e118: 6 total, 6 up, 6 in
Dec 02 10:06:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e119 e119: 6 total, 6 up, 6 in
Dec 02 10:06:45 np0005541913.localdomain podman[314556]: 2025-12-02 10:06:45.18250161 +0000 UTC m=+0.094963561 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 02 10:06:45 np0005541913.localdomain podman[314542]: 2025-12-02 10:06:45.190754521 +0000 UTC m=+0.212912537 container init 783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:06:45 np0005541913.localdomain podman[314556]: 2025-12-02 10:06:45.195143718 +0000 UTC m=+0.107605719 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:06:45 np0005541913.localdomain dnsmasq[314580]: started, version 2.85 cachesize 150
Dec 02 10:06:45 np0005541913.localdomain dnsmasq[314580]: DNS service limited to local subnets
Dec 02 10:06:45 np0005541913.localdomain dnsmasq[314580]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:06:45 np0005541913.localdomain dnsmasq[314580]: warning: no upstream servers configured
Dec 02 10:06:45 np0005541913.localdomain dnsmasq-dhcp[314580]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:06:45 np0005541913.localdomain dnsmasq[314580]: read /var/lib/neutron/dhcp/0add6e7f-ff15-4bda-be2f-1e656e720929/addn_hosts - 0 addresses
Dec 02 10:06:45 np0005541913.localdomain dnsmasq-dhcp[314580]: read /var/lib/neutron/dhcp/0add6e7f-ff15-4bda-be2f-1e656e720929/host
Dec 02 10:06:45 np0005541913.localdomain dnsmasq-dhcp[314580]: read /var/lib/neutron/dhcp/0add6e7f-ff15-4bda-be2f-1e656e720929/opts
Dec 02 10:06:45 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:06:45 np0005541913.localdomain podman[314542]: 2025-12-02 10:06:45.251704401 +0000 UTC m=+0.273862417 container start 783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:06:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:45.474 263406 INFO neutron.agent.dhcp.agent [None req-e150396d-fe2a-47f4-bd57-c8391360238b - - - - - -] DHCP configuration for ports {'0e2e4efc-b473-4e89-b805-dc238578b32c'} is completed
Dec 02 10:06:45 np0005541913.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 02 10:06:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:45Z|00220|binding|INFO|Removing iface tap9a3b1032-b9 ovn-installed in OVS
Dec 02 10:06:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:45.858 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port da2c3546-7ee1-4f37-b792-e7fd0de3d136 with type ""
Dec 02 10:06:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:45Z|00221|binding|INFO|Removing lport 9a3b1032-b966-42f6-bd53-33e478407e4c ovn-installed in OVS
Dec 02 10:06:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:45.860 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-0add6e7f-ff15-4bda-be2f-1e656e720929', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0add6e7f-ff15-4bda-be2f-1e656e720929', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d4d3b48096450197194eed29ad68df', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd99a7fe-e968-4c3f-96c7-184bb6138ed4, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=9a3b1032-b966-42f6-bd53-33e478407e4c) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:45.862 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 9a3b1032-b966-42f6-bd53-33e478407e4c in datapath 0add6e7f-ff15-4bda-be2f-1e656e720929 unbound from our chassis
Dec 02 10:06:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:45.866 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0add6e7f-ff15-4bda-be2f-1e656e720929, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:06:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:45.867 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4ad27c-2092-4399-8f34-4743a7a1eae7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:45 np0005541913.localdomain kernel: device tap9a3b1032-b9 left promiscuous mode
Dec 02 10:06:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:45.879 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:45.896 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:45.977 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:46 np0005541913.localdomain ceph-mon[298296]: pgmap v229: 177 pgs: 177 active+clean; 209 MiB data, 922 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 11 MiB/s wr, 111 op/s
Dec 02 10:06:46 np0005541913.localdomain ceph-mon[298296]: osdmap e119: 6 total, 6 up, 6 in
Dec 02 10:06:46 np0005541913.localdomain dnsmasq[314580]: read /var/lib/neutron/dhcp/0add6e7f-ff15-4bda-be2f-1e656e720929/addn_hosts - 0 addresses
Dec 02 10:06:46 np0005541913.localdomain dnsmasq-dhcp[314580]: read /var/lib/neutron/dhcp/0add6e7f-ff15-4bda-be2f-1e656e720929/host
Dec 02 10:06:46 np0005541913.localdomain dnsmasq-dhcp[314580]: read /var/lib/neutron/dhcp/0add6e7f-ff15-4bda-be2f-1e656e720929/opts
Dec 02 10:06:46 np0005541913.localdomain podman[314600]: 2025-12-02 10:06:46.46238403 +0000 UTC m=+0.056429421 container kill 783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent [None req-fb7cfb80-ee60-4bb4-bfb4-e8433251b8a3 - - - - - -] Unable to reload_allocations dhcp for 0add6e7f-ff15-4bda-be2f-1e656e720929.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap9a3b1032-b9 not found in namespace qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929.
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap9a3b1032-b9 not found in namespace qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929.
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.485 263406 ERROR neutron.agent.dhcp.agent 
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.495 263406 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Dec 02 10:06:46 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:46Z|00222|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:06:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:46.917 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.973 263406 INFO neutron.agent.dhcp.agent [None req-c86fd9c9-8c92-4391-b127-f78692fda812 - - - - - -] All active networks have been fetched through RPC.
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.974 263406 INFO neutron.agent.dhcp.agent [-] Starting network 0add6e7f-ff15-4bda-be2f-1e656e720929 dhcp configuration
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.974 263406 INFO neutron.agent.dhcp.agent [-] Finished network 0add6e7f-ff15-4bda-be2f-1e656e720929 dhcp configuration
Dec 02 10:06:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:46.975 263406 INFO neutron.agent.dhcp.agent [None req-c86fd9c9-8c92-4391-b127-f78692fda812 - - - - - -] Synchronizing state complete
Dec 02 10:06:47 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e120 e120: 6 total, 6 up, 6 in
Dec 02 10:06:47 np0005541913.localdomain systemd[1]: tmp-crun.Qur7bZ.mount: Deactivated successfully.
Dec 02 10:06:47 np0005541913.localdomain dnsmasq[314580]: exiting on receipt of SIGTERM
Dec 02 10:06:47 np0005541913.localdomain podman[314630]: 2025-12-02 10:06:47.201392061 +0000 UTC m=+0.078875661 container kill 783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:06:47 np0005541913.localdomain systemd[1]: libpod-783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba.scope: Deactivated successfully.
Dec 02 10:06:47 np0005541913.localdomain podman[314644]: 2025-12-02 10:06:47.273160791 +0000 UTC m=+0.051138359 container died 783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:06:47 np0005541913.localdomain systemd[1]: tmp-crun.KNQrpf.mount: Deactivated successfully.
Dec 02 10:06:47 np0005541913.localdomain podman[314644]: 2025-12-02 10:06:47.389950656 +0000 UTC m=+0.167928174 container remove 783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0add6e7f-ff15-4bda-be2f-1e656e720929, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:06:47 np0005541913.localdomain systemd[1]: libpod-conmon-783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba.scope: Deactivated successfully.
Dec 02 10:06:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:47.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:48 np0005541913.localdomain ceph-mon[298296]: pgmap v231: 177 pgs: 177 active+clean; 209 MiB data, 922 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 11 MiB/s wr, 111 op/s
Dec 02 10:06:48 np0005541913.localdomain ceph-mon[298296]: osdmap e120: 6 total, 6 up, 6 in
Dec 02 10:06:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4b1be98159917b52c833b727e30d0192370db67304a645d168e02eafe4efdd29-merged.mount: Deactivated successfully.
Dec 02 10:06:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-783a875b8c2c04f3c57936e1c3177f445ef57df7b3da156745996c243a97e3ba-userdata-shm.mount: Deactivated successfully.
Dec 02 10:06:48 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d0add6e7f\x2dff15\x2d4bda\x2dbe2f\x2d1e656e720929.mount: Deactivated successfully.
Dec 02 10:06:48 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e121 e121: 6 total, 6 up, 6 in
Dec 02 10:06:49 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e122 e122: 6 total, 6 up, 6 in
Dec 02 10:06:49 np0005541913.localdomain ceph-mon[298296]: osdmap e121: 6 total, 6 up, 6 in
Dec 02 10:06:49 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:06:49 np0005541913.localdomain podman[314672]: 2025-12-02 10:06:49.442545978 +0000 UTC m=+0.084855701 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 02 10:06:49 np0005541913.localdomain podman[314672]: 2025-12-02 10:06:49.477165425 +0000 UTC m=+0.119475158 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 02 10:06:49 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:06:50 np0005541913.localdomain ceph-mon[298296]: pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 922 MiB used, 41 GiB / 42 GiB avail; 152 KiB/s rd, 9.8 MiB/s wr, 216 op/s
Dec 02 10:06:50 np0005541913.localdomain ceph-mon[298296]: osdmap e122: 6 total, 6 up, 6 in
Dec 02 10:06:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e123 e123: 6 total, 6 up, 6 in
Dec 02 10:06:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:50.881 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:50.979 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e124 e124: 6 total, 6 up, 6 in
Dec 02 10:06:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:51 np0005541913.localdomain ceph-mon[298296]: osdmap e123: 6 total, 6 up, 6 in
Dec 02 10:06:51 np0005541913.localdomain ceph-mon[298296]: osdmap e124: 6 total, 6 up, 6 in
Dec 02 10:06:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:06:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:06:51 np0005541913.localdomain podman[314690]: 2025-12-02 10:06:51.437644274 +0000 UTC m=+0.078454560 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 10:06:51 np0005541913.localdomain podman[314690]: 2025-12-02 10:06:51.449877852 +0000 UTC m=+0.090688138 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 10:06:51 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:06:51 np0005541913.localdomain podman[314691]: 2025-12-02 10:06:51.502085848 +0000 UTC m=+0.135979258 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:06:51 np0005541913.localdomain podman[314691]: 2025-12-02 10:06:51.534863006 +0000 UTC m=+0.168756436 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:06:51 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:06:52 np0005541913.localdomain ceph-mon[298296]: pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 922 MiB used, 41 GiB / 42 GiB avail; 196 KiB/s rd, 13 MiB/s wr, 277 op/s
Dec 02 10:06:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e125 e125: 6 total, 6 up, 6 in
Dec 02 10:06:53 np0005541913.localdomain ceph-mon[298296]: osdmap e125: 6 total, 6 up, 6 in
Dec 02 10:06:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:53.435 263406 INFO neutron.agent.linux.ip_lib [None req-bd00e9ce-f091-424f-abf7-918223fa7a24 - - - - - -] Device tapa4104ef1-2f cannot be used as it has no MAC address
Dec 02 10:06:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:53.495 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:53 np0005541913.localdomain kernel: device tapa4104ef1-2f entered promiscuous mode
Dec 02 10:06:53 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670013.5028] manager: (tapa4104ef1-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Dec 02 10:06:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:53.503 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:53Z|00223|binding|INFO|Claiming lport a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b for this chassis.
Dec 02 10:06:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:53Z|00224|binding|INFO|a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b: Claiming unknown
Dec 02 10:06:53 np0005541913.localdomain systemd-udevd[314742]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:06:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:53.512 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-c6404ebe-76b1-43df-87e7-66f2b8167fb2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6404ebe-76b1-43df-87e7-66f2b8167fb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d4d3b48096450197194eed29ad68df', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c964373-75d0-46a8-9c74-4755190151c0, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:53.513 160221 INFO neutron.agent.ovn.metadata.agent [-] Port a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b in datapath c6404ebe-76b1-43df-87e7-66f2b8167fb2 bound to our chassis
Dec 02 10:06:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:53.513 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c6404ebe-76b1-43df-87e7-66f2b8167fb2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:06:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:53.514 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1b9698-87ce-4f66-bc51-d256188385e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:53 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa4104ef1-2f: No such device
Dec 02 10:06:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:53.530 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:53 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa4104ef1-2f: No such device
Dec 02 10:06:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:53Z|00225|binding|INFO|Setting lport a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b ovn-installed in OVS
Dec 02 10:06:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:53Z|00226|binding|INFO|Setting lport a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b up in Southbound
Dec 02 10:06:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:53.536 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:53.538 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:53 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa4104ef1-2f: No such device
Dec 02 10:06:53 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa4104ef1-2f: No such device
Dec 02 10:06:53 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa4104ef1-2f: No such device
Dec 02 10:06:53 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa4104ef1-2f: No such device
Dec 02 10:06:53 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa4104ef1-2f: No such device
Dec 02 10:06:53 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa4104ef1-2f: No such device
Dec 02 10:06:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:53.565 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:53.591 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:54 np0005541913.localdomain ceph-mon[298296]: pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 87 KiB/s rd, 5.5 KiB/s wr, 118 op/s
Dec 02 10:06:54 np0005541913.localdomain podman[314813]: 
Dec 02 10:06:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e126 e126: 6 total, 6 up, 6 in
Dec 02 10:06:54 np0005541913.localdomain podman[314813]: 2025-12-02 10:06:54.389529696 +0000 UTC m=+0.080242107 container create f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6404ebe-76b1-43df-87e7-66f2b8167fb2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:06:54 np0005541913.localdomain systemd[1]: Started libpod-conmon-f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4.scope.
Dec 02 10:06:54 np0005541913.localdomain systemd[1]: tmp-crun.2mI1Ly.mount: Deactivated successfully.
Dec 02 10:06:54 np0005541913.localdomain podman[314813]: 2025-12-02 10:06:54.352582728 +0000 UTC m=+0.043295149 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:06:54 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:06:54 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d598670f9239e3b6596f34e730fcc2feb27320ff94456532e197a96679a822a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:06:54 np0005541913.localdomain podman[314813]: 2025-12-02 10:06:54.475445775 +0000 UTC m=+0.166158206 container init f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6404ebe-76b1-43df-87e7-66f2b8167fb2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:06:54 np0005541913.localdomain podman[314813]: 2025-12-02 10:06:54.486049159 +0000 UTC m=+0.176761570 container start f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6404ebe-76b1-43df-87e7-66f2b8167fb2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:06:54 np0005541913.localdomain dnsmasq[314832]: started, version 2.85 cachesize 150
Dec 02 10:06:54 np0005541913.localdomain dnsmasq[314832]: DNS service limited to local subnets
Dec 02 10:06:54 np0005541913.localdomain dnsmasq[314832]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:06:54 np0005541913.localdomain dnsmasq[314832]: warning: no upstream servers configured
Dec 02 10:06:54 np0005541913.localdomain dnsmasq-dhcp[314832]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:06:54 np0005541913.localdomain dnsmasq[314832]: read /var/lib/neutron/dhcp/c6404ebe-76b1-43df-87e7-66f2b8167fb2/addn_hosts - 0 addresses
Dec 02 10:06:54 np0005541913.localdomain dnsmasq-dhcp[314832]: read /var/lib/neutron/dhcp/c6404ebe-76b1-43df-87e7-66f2b8167fb2/host
Dec 02 10:06:54 np0005541913.localdomain dnsmasq-dhcp[314832]: read /var/lib/neutron/dhcp/c6404ebe-76b1-43df-87e7-66f2b8167fb2/opts
Dec 02 10:06:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:54Z|00227|binding|INFO|Removing iface tapa4104ef1-2f ovn-installed in OVS
Dec 02 10:06:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:54.630 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 688cb983-65f3-4a37-8227-7785be0bfe50 with type ""
Dec 02 10:06:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:54.632 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-c6404ebe-76b1-43df-87e7-66f2b8167fb2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6404ebe-76b1-43df-87e7-66f2b8167fb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d4d3b48096450197194eed29ad68df', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c964373-75d0-46a8-9c74-4755190151c0, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:06:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:54.634 160221 INFO neutron.agent.ovn.metadata.agent [-] Port a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b in datapath c6404ebe-76b1-43df-87e7-66f2b8167fb2 unbound from our chassis
Dec 02 10:06:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:54Z|00228|binding|INFO|Removing lport a4104ef1-2fd9-417a-9b3a-16b3d9d44a5b ovn-installed in OVS
Dec 02 10:06:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:54.635 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c6404ebe-76b1-43df-87e7-66f2b8167fb2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:06:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:06:54.670 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[745a4d76-aeed-4bf9-b5ac-7824fe9c83f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:06:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:54.671 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:54.673 263406 INFO neutron.agent.dhcp.agent [None req-27eab582-0bb0-44af-abee-65f8b3c16f57 - - - - - -] DHCP configuration for ports {'99741eee-a4cf-401b-bedc-bcdbd3907469'} is completed
Dec 02 10:06:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:06:54Z|00229|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:06:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:54.820 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:54 np0005541913.localdomain dnsmasq[314832]: exiting on receipt of SIGTERM
Dec 02 10:06:54 np0005541913.localdomain systemd[1]: libpod-f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4.scope: Deactivated successfully.
Dec 02 10:06:54 np0005541913.localdomain podman[314850]: 2025-12-02 10:06:54.848860285 +0000 UTC m=+0.070615860 container kill f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6404ebe-76b1-43df-87e7-66f2b8167fb2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:06:54 np0005541913.localdomain podman[314864]: 2025-12-02 10:06:54.930720044 +0000 UTC m=+0.067887997 container died f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6404ebe-76b1-43df-87e7-66f2b8167fb2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:06:54 np0005541913.localdomain podman[314864]: 2025-12-02 10:06:54.961133758 +0000 UTC m=+0.098301621 container cleanup f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6404ebe-76b1-43df-87e7-66f2b8167fb2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:06:54 np0005541913.localdomain systemd[1]: libpod-conmon-f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4.scope: Deactivated successfully.
Dec 02 10:06:55 np0005541913.localdomain podman[314866]: 2025-12-02 10:06:55.012000359 +0000 UTC m=+0.142818512 container remove f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6404ebe-76b1-43df-87e7-66f2b8167fb2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:06:55 np0005541913.localdomain kernel: device tapa4104ef1-2f left promiscuous mode
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.026 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.051 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:55.073 263406 INFO neutron.agent.dhcp.agent [None req-e768fa79-8492-4015-a261-3893c4758a69 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:06:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:06:55.073 263406 INFO neutron.agent.dhcp.agent [None req-e768fa79-8492-4015-a261-3893c4758a69 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:06:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0d598670f9239e3b6596f34e730fcc2feb27320ff94456532e197a96679a822a-merged.mount: Deactivated successfully.
Dec 02 10:06:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f42a51183e3cd63134ff68027ab168b27c9435d67c22030f26e0286bac929fb4-userdata-shm.mount: Deactivated successfully.
Dec 02 10:06:55 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dc6404ebe\x2d76b1\x2d43df\x2d87e7\x2d66f2b8167fb2.mount: Deactivated successfully.
Dec 02 10:06:55 np0005541913.localdomain ceph-mon[298296]: osdmap e126: 6 total, 6 up, 6 in
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.917 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.933 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.933 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.933 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.934 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:06:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:55.981 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:06:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e127 e127: 6 total, 6 up, 6 in
Dec 02 10:06:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:06:56 np0005541913.localdomain ceph-mon[298296]: pgmap v242: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 4.6 KiB/s wr, 99 op/s
Dec 02 10:06:56 np0005541913.localdomain ceph-mon[298296]: osdmap e127: 6 total, 6 up, 6 in
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.635 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.652 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.653 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.653 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.654 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.654 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.670 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.671 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.671 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.672 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:06:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:56.672 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:06:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:06:57 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2616537687' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.132 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.213 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.214 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:06:57 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2616537687' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:06:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e128 e128: 6 total, 6 up, 6 in
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.464 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.465 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11278MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.466 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.466 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.591 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.591 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.592 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:06:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:57.654 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:06:58 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:06:58 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3420145848' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:06:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:58.079 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:06:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:58.086 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:06:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:58.108 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:06:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:58.111 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:06:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:58.111 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:06:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:06:58.285 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:06:58 np0005541913.localdomain ceph-mon[298296]: pgmap v244: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 3.7 KiB/s wr, 78 op/s
Dec 02 10:06:58 np0005541913.localdomain ceph-mon[298296]: osdmap e128: 6 total, 6 up, 6 in
Dec 02 10:06:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3420145848' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:06:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:06:59 np0005541913.localdomain podman[314939]: 2025-12-02 10:06:59.455843296 +0000 UTC m=+0.088658533 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 02 10:06:59 np0005541913.localdomain podman[314939]: 2025-12-02 10:06:59.474336431 +0000 UTC m=+0.107151678 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 02 10:06:59 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:06:59 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e129 e129: 6 total, 6 up, 6 in
Dec 02 10:07:00 np0005541913.localdomain ceph-mon[298296]: pgmap v246: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.7 KiB/s wr, 56 op/s
Dec 02 10:07:00 np0005541913.localdomain ceph-mon[298296]: osdmap e129: 6 total, 6 up, 6 in
Dec 02 10:07:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:00.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:07:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:00.921 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:00.983 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e130 e130: 6 total, 6 up, 6 in
Dec 02 10:07:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:02 np0005541913.localdomain ceph-mon[298296]: pgmap v248: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.7 KiB/s wr, 56 op/s
Dec 02 10:07:02 np0005541913.localdomain ceph-mon[298296]: osdmap e130: 6 total, 6 up, 6 in
Dec 02 10:07:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:02.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:07:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:02.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:07:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:03.050 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:07:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:03.051 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:07:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:03.052 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:07:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e131 e131: 6 total, 6 up, 6 in
Dec 02 10:07:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:03.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:07:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:07:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:07:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:07:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:07:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:07:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:07:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:07:04 np0005541913.localdomain ceph-mon[298296]: pgmap v250: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 67 KiB/s rd, 4.2 KiB/s wr, 91 op/s
Dec 02 10:07:04 np0005541913.localdomain ceph-mon[298296]: osdmap e131: 6 total, 6 up, 6 in
Dec 02 10:07:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/7662486' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:07:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:07:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:07:04 np0005541913.localdomain podman[314959]: 2025-12-02 10:07:04.432763225 +0000 UTC m=+0.078790198 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:07:04 np0005541913.localdomain podman[314959]: 2025-12-02 10:07:04.470107665 +0000 UTC m=+0.116134588 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:07:04 np0005541913.localdomain podman[314960]: 2025-12-02 10:07:04.486108953 +0000 UTC m=+0.125685824 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Dec 02 10:07:04 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:07:04 np0005541913.localdomain podman[314960]: 2025-12-02 10:07:04.52710614 +0000 UTC m=+0.166683071 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:07:04 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:07:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2367667588' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:07:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/38028727' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:07:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/38028727' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:07:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2800528967' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:07:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3029616690' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:07:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e132 e132: 6 total, 6 up, 6 in
Dec 02 10:07:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:05.923 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:05.984 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:07:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:07:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:07:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156111 "" "Go-http-client/1.1"
Dec 02 10:07:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:07:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19247 "" "Go-http-client/1.1"
Dec 02 10:07:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:06 np0005541913.localdomain ceph-mon[298296]: pgmap v252: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.5 KiB/s wr, 34 op/s
Dec 02 10:07:06 np0005541913.localdomain ceph-mon[298296]: osdmap e132: 6 total, 6 up, 6 in
Dec 02 10:07:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/786304527' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:07:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/786304527' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:07:08 np0005541913.localdomain ceph-mon[298296]: pgmap v254: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.5 KiB/s wr, 34 op/s
Dec 02 10:07:10 np0005541913.localdomain ceph-mon[298296]: pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 46 KiB/s rd, 2.4 KiB/s wr, 61 op/s
Dec 02 10:07:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:07:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 8033 writes, 34K keys, 8033 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 8033 writes, 1906 syncs, 4.21 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3169 writes, 12K keys, 3169 commit groups, 1.0 writes per commit group, ingest: 12.35 MB, 0.02 MB/s
                                                          Interval WAL: 3169 writes, 1306 syncs, 2.43 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 10:07:10 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:10.773 2 INFO neutron.agent.securitygroups_rpc [None req-f9cb0719-6f1e-4498-ae5c-3d1490c7cf9b c04b0c1b682647b3a235292b9ca1b605 2b57b1fad39449b49cbbffbb5c62906d - - default default] Security group member updated ['dba82d8e-ac81-4899-ab61-fcab2136c60b']
Dec 02 10:07:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:10.926 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:10.985 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 e133: 6 total, 6 up, 6 in
Dec 02 10:07:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:11 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:11.353 2 INFO neutron.agent.securitygroups_rpc [None req-0dd50eb4-1108-4728-8f57-13a5878ac244 c04b0c1b682647b3a235292b9ca1b605 2b57b1fad39449b49cbbffbb5c62906d - - default default] Security group member updated ['dba82d8e-ac81-4899-ab61-fcab2136c60b']
Dec 02 10:07:11 np0005541913.localdomain dnsmasq[314056]: exiting on receipt of SIGTERM
Dec 02 10:07:11 np0005541913.localdomain podman[315022]: 2025-12-02 10:07:11.798205142 +0000 UTC m=+0.066610393 container kill 99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45b393b4-6935-41d7-9b33-e0a50bae89a0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:11 np0005541913.localdomain systemd[1]: libpod-99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818.scope: Deactivated successfully.
Dec 02 10:07:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:11.826 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 144a91f0-4a98-4ccb-bad3-9780bb2aa0f5 with type ""
Dec 02 10:07:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:11Z|00230|binding|INFO|Removing iface tap9452ee28-23 ovn-installed in OVS
Dec 02 10:07:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:11Z|00231|binding|INFO|Removing lport 9452ee28-2385-4409-9420-aba511c252a5 ovn-installed in OVS
Dec 02 10:07:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:11.828 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.243/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-45b393b4-6935-41d7-9b33-e0a50bae89a0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45b393b4-6935-41d7-9b33-e0a50bae89a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d4d3b48096450197194eed29ad68df', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1555f2d-4a9f-4453-a712-6d4c971353c9, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=9452ee28-2385-4409-9420-aba511c252a5) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:11.828 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:11.831 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 9452ee28-2385-4409-9420-aba511c252a5 in datapath 45b393b4-6935-41d7-9b33-e0a50bae89a0 unbound from our chassis
Dec 02 10:07:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:11.833 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45b393b4-6935-41d7-9b33-e0a50bae89a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:07:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:11.835 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:11.835 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[d7104cf0-2bf7-4791-8e8e-7c79f99ae902]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:11 np0005541913.localdomain podman[315036]: 2025-12-02 10:07:11.861832564 +0000 UTC m=+0.052069383 container died 99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45b393b4-6935-41d7-9b33-e0a50bae89a0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:07:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818-userdata-shm.mount: Deactivated successfully.
Dec 02 10:07:11 np0005541913.localdomain podman[315036]: 2025-12-02 10:07:11.944600979 +0000 UTC m=+0.134837798 container cleanup 99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45b393b4-6935-41d7-9b33-e0a50bae89a0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:11 np0005541913.localdomain systemd[1]: libpod-conmon-99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818.scope: Deactivated successfully.
Dec 02 10:07:11 np0005541913.localdomain podman[315043]: 2025-12-02 10:07:11.96928234 +0000 UTC m=+0.139095943 container remove 99f3b0be960c9abe405b7cae3b1b11db238865a75f90ba42cfe676a03242a818 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45b393b4-6935-41d7-9b33-e0a50bae89a0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:12.024 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:12 np0005541913.localdomain kernel: device tap9452ee28-23 left promiscuous mode
Dec 02 10:07:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:12.046 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:12.061 263406 INFO neutron.agent.dhcp.agent [None req-c1de1da9-7936-44ad-a0fd-32d4afbddc6f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:12.062 263406 INFO neutron.agent.dhcp.agent [None req-c1de1da9-7936-44ad-a0fd-32d4afbddc6f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:12 np0005541913.localdomain ceph-mon[298296]: pgmap v256: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.2 KiB/s wr, 35 op/s
Dec 02 10:07:12 np0005541913.localdomain ceph-mon[298296]: osdmap e133: 6 total, 6 up, 6 in
Dec 02 10:07:12 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:12Z|00232|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:07:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:12.195 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:12 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-1bc83d9e759505b4e4211d0f2aa7f5dccd534965c1725514fdb16919fad32d11-merged.mount: Deactivated successfully.
Dec 02 10:07:12 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d45b393b4\x2d6935\x2d41d7\x2d9b33\x2de0a50bae89a0.mount: Deactivated successfully.
Dec 02 10:07:14 np0005541913.localdomain ceph-mon[298296]: pgmap v258: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.2 KiB/s wr, 35 op/s
Dec 02 10:07:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:07:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.2 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2705 syncs, 3.78 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4296 writes, 14K keys, 4296 commit groups, 1.0 writes per commit group, ingest: 13.86 MB, 0.02 MB/s
                                                          Interval WAL: 4296 writes, 1841 syncs, 2.33 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 10:07:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:07:15 np0005541913.localdomain podman[315065]: 2025-12-02 10:07:15.446895514 +0000 UTC m=+0.086681450 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:15 np0005541913.localdomain podman[315065]: 2025-12-02 10:07:15.465672216 +0000 UTC m=+0.105458183 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:07:15 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:07:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:15.929 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:15.987 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:16 np0005541913.localdomain ceph-mon[298296]: pgmap v259: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 1.0 KiB/s wr, 28 op/s
Dec 02 10:07:18 np0005541913.localdomain ceph-mon[298296]: pgmap v260: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1023 B/s wr, 28 op/s
Dec 02 10:07:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:07:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3564427763' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:07:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:07:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3564427763' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:07:20 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:20.070 2 INFO neutron.agent.securitygroups_rpc [None req-0ca3d260-f659-40d6-b699-f106118e6211 f6abbbfcc7d54e81b5693b2401a25e09 5ea39db037534e2087a54e8a052ad24e - - default default] Security group member updated ['377ae0fe-81df-41e0-8ef6-1afd307f6beb']
Dec 02 10:07:20 np0005541913.localdomain ceph-mon[298296]: pgmap v261: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:07:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3564427763' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:07:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3564427763' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:07:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:07:20 np0005541913.localdomain podman[315086]: 2025-12-02 10:07:20.438669298 +0000 UTC m=+0.076076696 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 02 10:07:20 np0005541913.localdomain podman[315086]: 2025-12-02 10:07:20.449246061 +0000 UTC m=+0.086653399 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:07:20 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:07:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:20.934 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:21.021 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:21 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:21.634 2 INFO neutron.agent.securitygroups_rpc [None req-9c96eaf7-e1a3-4805-a32c-c883041fe7ca f6abbbfcc7d54e81b5693b2401a25e09 5ea39db037534e2087a54e8a052ad24e - - default default] Security group member updated ['377ae0fe-81df-41e0-8ef6-1afd307f6beb']
Dec 02 10:07:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:07:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:07:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:21.758 263406 INFO neutron.agent.linux.ip_lib [None req-3234796a-2d9a-434c-a89f-a4edb2007520 - - - - - -] Device tap308729f2-5c cannot be used as it has no MAC address
Dec 02 10:07:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:21.790 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain systemd[1]: tmp-crun.TXZXEP.mount: Deactivated successfully.
Dec 02 10:07:21 np0005541913.localdomain podman[315106]: 2025-12-02 10:07:21.800795849 +0000 UTC m=+0.101564108 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:07:21 np0005541913.localdomain kernel: device tap308729f2-5c entered promiscuous mode
Dec 02 10:07:21 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:21Z|00233|binding|INFO|Claiming lport 308729f2-5cef-4da6-a8d1-12678a8ce24b for this chassis.
Dec 02 10:07:21 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670041.8101] manager: (tap308729f2-5c): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Dec 02 10:07:21 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:21Z|00234|binding|INFO|308729f2-5cef-4da6-a8d1-12678a8ce24b: Claiming unknown
Dec 02 10:07:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:21.809 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:21.821 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27cf39916c5c4bc1833487052acaa22a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c696e62-8d3a-4321-b6e9-84ecff5ee056, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=308729f2-5cef-4da6-a8d1-12678a8ce24b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:21.823 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 308729f2-5cef-4da6-a8d1-12678a8ce24b in datapath a69b1d7f-7b6a-4e57-97d2-3e13016a1afd bound to our chassis
Dec 02 10:07:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:21.825 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a69b1d7f-7b6a-4e57-97d2-3e13016a1afd or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:21 np0005541913.localdomain systemd-udevd[315145]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:07:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:21.826 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[028e13ff-efea-4a9a-91ef-dc64d382f765]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:21 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:21Z|00235|binding|INFO|Setting lport 308729f2-5cef-4da6-a8d1-12678a8ce24b ovn-installed in OVS
Dec 02 10:07:21 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:21Z|00236|binding|INFO|Setting lport 308729f2-5cef-4da6-a8d1-12678a8ce24b up in Southbound
Dec 02 10:07:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:21.834 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:21.844 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap308729f2-5c: No such device
Dec 02 10:07:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:21.855 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap308729f2-5c: No such device
Dec 02 10:07:21 np0005541913.localdomain systemd[1]: tmp-crun.nZ2C9J.mount: Deactivated successfully.
Dec 02 10:07:21 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap308729f2-5c: No such device
Dec 02 10:07:21 np0005541913.localdomain podman[315105]: 2025-12-02 10:07:21.872674202 +0000 UTC m=+0.178588599 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Dec 02 10:07:21 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap308729f2-5c: No such device
Dec 02 10:07:21 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap308729f2-5c: No such device
Dec 02 10:07:21 np0005541913.localdomain podman[315106]: 2025-12-02 10:07:21.883371159 +0000 UTC m=+0.184139378 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:07:21 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap308729f2-5c: No such device
Dec 02 10:07:21 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap308729f2-5c: No such device
Dec 02 10:07:21 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap308729f2-5c: No such device
Dec 02 10:07:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:21.897 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:07:21 np0005541913.localdomain podman[315105]: 2025-12-02 10:07:21.910462423 +0000 UTC m=+0.216376820 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Dec 02 10:07:21 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:07:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:21.937 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:21 np0005541913.localdomain sudo[315178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:07:21 np0005541913.localdomain sudo[315178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:07:21 np0005541913.localdomain sudo[315178]: pam_unix(sudo:session): session closed for user root
Dec 02 10:07:22 np0005541913.localdomain sudo[315201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:07:22 np0005541913.localdomain sudo[315201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:07:22 np0005541913.localdomain ceph-mon[298296]: pgmap v262: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:07:22 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1857178396' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:07:22 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1857178396' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:07:22 np0005541913.localdomain sudo[315201]: pam_unix(sudo:session): session closed for user root
Dec 02 10:07:22 np0005541913.localdomain podman[315295]: 
Dec 02 10:07:22 np0005541913.localdomain podman[315295]: 2025-12-02 10:07:22.869020637 +0000 UTC m=+0.098517877 container create 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:07:22 np0005541913.localdomain podman[315295]: 2025-12-02 10:07:22.821767633 +0000 UTC m=+0.051264923 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:07:22 np0005541913.localdomain systemd[1]: Started libpod-conmon-2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f.scope.
Dec 02 10:07:22 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:07:22 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55241c631c38010f44af375554574eb4384ea2395685ed6f7e3376d51f08e8a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:07:22 np0005541913.localdomain podman[315295]: 2025-12-02 10:07:22.958564562 +0000 UTC m=+0.188061802 container init 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:07:22 np0005541913.localdomain podman[315295]: 2025-12-02 10:07:22.968387326 +0000 UTC m=+0.197884576 container start 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:07:22 np0005541913.localdomain dnsmasq[315330]: started, version 2.85 cachesize 150
Dec 02 10:07:22 np0005541913.localdomain dnsmasq[315330]: DNS service limited to local subnets
Dec 02 10:07:22 np0005541913.localdomain dnsmasq[315330]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:07:22 np0005541913.localdomain dnsmasq[315330]: warning: no upstream servers configured
Dec 02 10:07:22 np0005541913.localdomain dnsmasq-dhcp[315330]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:07:22 np0005541913.localdomain dnsmasq[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/addn_hosts - 0 addresses
Dec 02 10:07:22 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/host
Dec 02 10:07:22 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/opts
Dec 02 10:07:22 np0005541913.localdomain sudo[315309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:07:22 np0005541913.localdomain sudo[315309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:07:22 np0005541913.localdomain sudo[315309]: pam_unix(sudo:session): session closed for user root
Dec 02 10:07:23 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:07:23 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:07:23 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:07:23 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:07:23 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:23.163 263406 INFO neutron.agent.dhcp.agent [None req-89d967dc-0fa0-4d0c-80ef-94d7aaebaba3 - - - - - -] DHCP configuration for ports {'33a72976-31c7-4593-980e-42e36905a8a1'} is completed
Dec 02 10:07:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:23.476 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:24 np0005541913.localdomain ceph-mon[298296]: pgmap v263: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 938 B/s wr, 15 op/s
Dec 02 10:07:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:25.937 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:26.023 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:26 np0005541913.localdomain ceph-mon[298296]: pgmap v264: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 938 B/s wr, 15 op/s
Dec 02 10:07:26 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3870090034' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:07:26 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3870090034' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:07:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:26 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:26.984 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:28 np0005541913.localdomain ceph-mon[298296]: pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 938 B/s wr, 15 op/s
Dec 02 10:07:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:07:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:28.334 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:28.611 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:28Z, description=, device_id=d23c300d-2106-463f-ba69-eebcc6860c57, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00160>], id=084a8cd7-7670-4ca5-8e64-7cf76391b695, ip_allocation=immediate, mac_address=fa:16:3e:41:3f:64, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:19Z, description=, dns_domain=, id=a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-994176625, port_security_enabled=True, project_id=27cf39916c5c4bc1833487052acaa22a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64351, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1587, status=ACTIVE, subnets=['21bb9eae-69df-40d9-8e2c-fcdc977cf0ec'], tags=[], tenant_id=27cf39916c5c4bc1833487052acaa22a, updated_at=2025-12-02T10:07:20Z, vlan_transparent=None, network_id=a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, port_security_enabled=False, project_id=27cf39916c5c4bc1833487052acaa22a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1661, status=DOWN, tags=[], tenant_id=27cf39916c5c4bc1833487052acaa22a, updated_at=2025-12-02T10:07:28Z on network a69b1d7f-7b6a-4e57-97d2-3e13016a1afd
Dec 02 10:07:28 np0005541913.localdomain dnsmasq[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/addn_hosts - 1 addresses
Dec 02 10:07:28 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/host
Dec 02 10:07:28 np0005541913.localdomain podman[315348]: 2025-12-02 10:07:28.857884677 +0000 UTC m=+0.070226050 container kill 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:07:28 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/opts
Dec 02 10:07:28 np0005541913.localdomain systemd[1]: tmp-crun.y1r30i.mount: Deactivated successfully.
Dec 02 10:07:29 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:29.049 263406 INFO neutron.agent.dhcp.agent [None req-a9fcff92-5fbd-4c4d-9439-73c8a59bdda5 - - - - - -] DHCP configuration for ports {'084a8cd7-7670-4ca5-8e64-7cf76391b695'} is completed
Dec 02 10:07:29 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1103887549' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:07:29 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1103887549' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:07:30 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:30.099 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:28Z, description=, device_id=d23c300d-2106-463f-ba69-eebcc6860c57, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990924a670>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990924a430>], id=084a8cd7-7670-4ca5-8e64-7cf76391b695, ip_allocation=immediate, mac_address=fa:16:3e:41:3f:64, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:19Z, description=, dns_domain=, id=a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-994176625, port_security_enabled=True, project_id=27cf39916c5c4bc1833487052acaa22a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64351, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1587, status=ACTIVE, subnets=['21bb9eae-69df-40d9-8e2c-fcdc977cf0ec'], tags=[], tenant_id=27cf39916c5c4bc1833487052acaa22a, updated_at=2025-12-02T10:07:20Z, vlan_transparent=None, network_id=a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, port_security_enabled=False, project_id=27cf39916c5c4bc1833487052acaa22a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1661, status=DOWN, tags=[], tenant_id=27cf39916c5c4bc1833487052acaa22a, updated_at=2025-12-02T10:07:28Z on network a69b1d7f-7b6a-4e57-97d2-3e13016a1afd
Dec 02 10:07:30 np0005541913.localdomain ceph-mon[298296]: pgmap v266: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.5 KiB/s wr, 29 op/s
Dec 02 10:07:30 np0005541913.localdomain podman[315387]: 2025-12-02 10:07:30.310038847 +0000 UTC m=+0.059033511 container kill 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:07:30 np0005541913.localdomain dnsmasq[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/addn_hosts - 1 addresses
Dec 02 10:07:30 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/host
Dec 02 10:07:30 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/opts
Dec 02 10:07:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:07:30 np0005541913.localdomain podman[315401]: 2025-12-02 10:07:30.426404699 +0000 UTC m=+0.090998485 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:07:30 np0005541913.localdomain podman[315401]: 2025-12-02 10:07:30.467440437 +0000 UTC m=+0.132034253 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:07:30 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:07:30 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:30Z|00237|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:07:30 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:30.615 263406 INFO neutron.agent.dhcp.agent [None req-34b364a2-5ef4-4684-a915-b750ca049285 - - - - - -] DHCP configuration for ports {'084a8cd7-7670-4ca5-8e64-7cf76391b695'} is completed
Dec 02 10:07:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:30.615 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:30.941 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:31.025 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:32 np0005541913.localdomain ceph-mon[298296]: pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.5 KiB/s wr, 29 op/s
Dec 02 10:07:32 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:32.906 2 INFO neutron.agent.securitygroups_rpc [None req-4959ff7d-aca7-46d0-9143-48c9e561106c c695c8d7887d4f5d99397fbd9a108bd7 27cf39916c5c4bc1833487052acaa22a - - default default] Security group member updated ['202778bd-7cc5-43e0-846c-ad0385193194']
Dec 02 10:07:33 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:33.373 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:32Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089924c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908992370>], id=11cf6714-735d-4d40-b8a7-a3e1e579243a, ip_allocation=immediate, mac_address=fa:16:3e:1e:20:fe, name=tempest-FloatingIPAdminTestJSON-2021538240, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:19Z, description=, dns_domain=, id=a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-994176625, port_security_enabled=True, project_id=27cf39916c5c4bc1833487052acaa22a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64351, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1587, status=ACTIVE, subnets=['21bb9eae-69df-40d9-8e2c-fcdc977cf0ec'], tags=[], tenant_id=27cf39916c5c4bc1833487052acaa22a, updated_at=2025-12-02T10:07:20Z, vlan_transparent=None, network_id=a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, port_security_enabled=True, project_id=27cf39916c5c4bc1833487052acaa22a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['202778bd-7cc5-43e0-846c-ad0385193194'], standard_attr_id=1687, status=DOWN, tags=[], tenant_id=27cf39916c5c4bc1833487052acaa22a, updated_at=2025-12-02T10:07:32Z on network a69b1d7f-7b6a-4e57-97d2-3e13016a1afd
Dec 02 10:07:33 np0005541913.localdomain dnsmasq[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/addn_hosts - 2 addresses
Dec 02 10:07:33 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/host
Dec 02 10:07:33 np0005541913.localdomain podman[315446]: 2025-12-02 10:07:33.682454287 +0000 UTC m=+0.059973955 container kill 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:07:33 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/opts
Dec 02 10:07:33 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:33Z|00238|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:07:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:33.849 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:34.018 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:34.020 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:07:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:34.025 263406 INFO neutron.agent.dhcp.agent [None req-e0392596-821e-43c4-80e0-e18c24f994c3 - - - - - -] DHCP configuration for ports {'11cf6714-735d-4d40-b8a7-a3e1e579243a'} is completed
Dec 02 10:07:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:07:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:07:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:07:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:07:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:07:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:07:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:07:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:07:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:34.059 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain ceph-mon[298296]: pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 1.7 KiB/s wr, 43 op/s
Dec 02 10:07:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:34.579 263406 INFO neutron.agent.linux.ip_lib [None req-c4a37e31-b281-45ef-9781-e6100274d33c - - - - - -] Device tap0f8026b8-d6 cannot be used as it has no MAC address
Dec 02 10:07:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:34.602 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain kernel: device tap0f8026b8-d6 entered promiscuous mode
Dec 02 10:07:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:34.612 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670054.6143] manager: (tap0f8026b8-d6): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Dec 02 10:07:34 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:34Z|00239|binding|INFO|Claiming lport 0f8026b8-d62d-493f-8190-d2e80ee812a4 for this chassis.
Dec 02 10:07:34 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:34Z|00240|binding|INFO|0f8026b8-d62d-493f-8190-d2e80ee812a4: Claiming unknown
Dec 02 10:07:34 np0005541913.localdomain systemd-udevd[315477]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:07:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:34.630 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0f8026b8-d62d-493f-8190-d2e80ee812a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:34.631 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0f8026b8-d62d-493f-8190-d2e80ee812a4 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:07:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:34.632 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:34.633 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[e8053f1e-783f-4d1c-9453-6e9c3ffa9959]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:07:34 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0f8026b8-d6: No such device
Dec 02 10:07:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:07:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:34.656 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:34Z|00241|binding|INFO|Setting lport 0f8026b8-d62d-493f-8190-d2e80ee812a4 ovn-installed in OVS
Dec 02 10:07:34 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:34Z|00242|binding|INFO|Setting lport 0f8026b8-d62d-493f-8190-d2e80ee812a4 up in Southbound
Dec 02 10:07:34 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0f8026b8-d6: No such device
Dec 02 10:07:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:34.662 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:34.668 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0f8026b8-d6: No such device
Dec 02 10:07:34 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0f8026b8-d6: No such device
Dec 02 10:07:34 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0f8026b8-d6: No such device
Dec 02 10:07:34 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0f8026b8-d6: No such device
Dec 02 10:07:34 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0f8026b8-d6: No such device
Dec 02 10:07:34 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0f8026b8-d6: No such device
Dec 02 10:07:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:34.696 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain podman[315482]: 2025-12-02 10:07:34.713641816 +0000 UTC m=+0.055724312 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:07:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:34.719 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:34 np0005541913.localdomain podman[315482]: 2025-12-02 10:07:34.727936458 +0000 UTC m=+0.070018954 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 10:07:34 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:07:34 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:34.798 2 INFO neutron.agent.securitygroups_rpc [None req-7ec4eb97-1d35-4cb1-ad23-be6283df01c3 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:34 np0005541913.localdomain podman[315484]: 2025-12-02 10:07:34.795533116 +0000 UTC m=+0.131802287 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:07:34 np0005541913.localdomain podman[315484]: 2025-12-02 10:07:34.854533715 +0000 UTC m=+0.190802866 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 10:07:34 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:07:35 np0005541913.localdomain podman[315596]: 
Dec 02 10:07:35 np0005541913.localdomain podman[315596]: 2025-12-02 10:07:35.538078081 +0000 UTC m=+0.075763177 container create 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:35 np0005541913.localdomain systemd[1]: Started libpod-conmon-8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79.scope.
Dec 02 10:07:35 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:07:35 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/064d138bd0413126e8f09e8b434779c28a6d201de1de908638c213d1831aaab5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:07:35 np0005541913.localdomain podman[315596]: 2025-12-02 10:07:35.502089388 +0000 UTC m=+0.039774484 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:07:35 np0005541913.localdomain podman[315596]: 2025-12-02 10:07:35.613850078 +0000 UTC m=+0.151535154 container init 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:07:35 np0005541913.localdomain podman[315596]: 2025-12-02 10:07:35.620080175 +0000 UTC m=+0.157765251 container start 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:07:35 np0005541913.localdomain dnsmasq[315615]: started, version 2.85 cachesize 150
Dec 02 10:07:35 np0005541913.localdomain dnsmasq[315615]: DNS service limited to local subnets
Dec 02 10:07:35 np0005541913.localdomain dnsmasq[315615]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:07:35 np0005541913.localdomain dnsmasq[315615]: warning: no upstream servers configured
Dec 02 10:07:35 np0005541913.localdomain dnsmasq-dhcp[315615]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:07:35 np0005541913.localdomain dnsmasq[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:35 np0005541913.localdomain dnsmasq-dhcp[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:35 np0005541913.localdomain dnsmasq-dhcp[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:35.678 263406 INFO neutron.agent.dhcp.agent [None req-c4a37e31-b281-45ef-9781-e6100274d33c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:33Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088b0a30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088b0ac0>], id=c16bd636-0682-4847-ab96-1785d25e2f0c, ip_allocation=immediate, mac_address=fa:16:3e:30:e1:e6, name=tempest-NetworksTestDHCPv6-1739025647, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['5b8ccefb-7e5f-4954-b69b-9a64408e7e8c'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:32Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=1691, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:34Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:07:35 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:35.728 2 INFO neutron.agent.securitygroups_rpc [None req-87d4804a-2e84-429a-b45c-6794fadb1faa 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:35.794 263406 INFO neutron.agent.dhcp.agent [None req-471b32fd-095e-4de9-bd9e-ab58bb9e7131 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:07:35 np0005541913.localdomain dnsmasq[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:07:35 np0005541913.localdomain dnsmasq-dhcp[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:35 np0005541913.localdomain dnsmasq-dhcp[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:35 np0005541913.localdomain podman[315634]: 2025-12-02 10:07:35.84379207 +0000 UTC m=+0.062246516 container kill 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:07:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:35.943 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:36.027 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:07:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:07:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:07:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157924 "" "Go-http-client/1.1"
Dec 02 10:07:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:07:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19702 "" "Go-http-client/1.1"
Dec 02 10:07:36 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:36.113 263406 INFO neutron.agent.dhcp.agent [None req-091756a7-1138-4199-913a-fd3b6800d700 - - - - - -] DHCP configuration for ports {'c16bd636-0682-4847-ab96-1785d25e2f0c'} is completed
Dec 02 10:07:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:36 np0005541913.localdomain systemd[1]: tmp-crun.DNSUVa.mount: Deactivated successfully.
Dec 02 10:07:36 np0005541913.localdomain podman[315669]: 2025-12-02 10:07:36.192803917 +0000 UTC m=+0.044701837 container kill 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:07:36 np0005541913.localdomain dnsmasq[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:36 np0005541913.localdomain dnsmasq-dhcp[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:36 np0005541913.localdomain dnsmasq-dhcp[315615]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:36 np0005541913.localdomain ceph-mon[298296]: pgmap v269: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 852 B/s wr, 27 op/s
Dec 02 10:07:36 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/792357925' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:07:36 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/792357925' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:07:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:36.385 2 INFO neutron.agent.securitygroups_rpc [None req-d47733e1-0ad6-43a1-b5b9-ff46ea82484c 71c1ab73f6584cdc8a5ac07abc1165b6 c83c01183aba40c080a7dde4126b2e3b - - default default] Security group member updated ['8d157c15-6c1c-467c-9dbb-a97c83d265b6']
Dec 02 10:07:36 np0005541913.localdomain dnsmasq[315615]: exiting on receipt of SIGTERM
Dec 02 10:07:36 np0005541913.localdomain podman[315708]: 2025-12-02 10:07:36.631123423 +0000 UTC m=+0.063050157 container kill 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:07:36 np0005541913.localdomain systemd[1]: libpod-8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79.scope: Deactivated successfully.
Dec 02 10:07:36 np0005541913.localdomain podman[315721]: 2025-12-02 10:07:36.710109796 +0000 UTC m=+0.063656164 container died 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:07:36 np0005541913.localdomain podman[315721]: 2025-12-02 10:07:36.740082178 +0000 UTC m=+0.093628536 container cleanup 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:07:36 np0005541913.localdomain systemd[1]: libpod-conmon-8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79.scope: Deactivated successfully.
Dec 02 10:07:36 np0005541913.localdomain podman[315722]: 2025-12-02 10:07:36.780242392 +0000 UTC m=+0.127944113 container remove 8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:07:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:36Z|00243|binding|INFO|Releasing lport 0f8026b8-d62d-493f-8190-d2e80ee812a4 from this chassis (sb_readonly=0)
Dec 02 10:07:36 np0005541913.localdomain kernel: device tap0f8026b8-d6 left promiscuous mode
Dec 02 10:07:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:36Z|00244|binding|INFO|Setting lport 0f8026b8-d62d-493f-8190-d2e80ee812a4 down in Southbound
Dec 02 10:07:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:36.833 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:36.858 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:36.946 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0f8026b8-d62d-493f-8190-d2e80ee812a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:36.948 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0f8026b8-d62d-493f-8190-d2e80ee812a4 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:07:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:36.950 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:36.951 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[feb73755-0eb1-4679-8b57-b91193cb404b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-064d138bd0413126e8f09e8b434779c28a6d201de1de908638c213d1831aaab5-merged.mount: Deactivated successfully.
Dec 02 10:07:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e2345bb7e79b321f7f69de1dea61d247897e9cd12c8e6dccd2a5b72c4a9ce79-userdata-shm.mount: Deactivated successfully.
Dec 02 10:07:37 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:07:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:38.172 263406 INFO neutron.agent.linux.ip_lib [None req-5c1620e9-aa10-42c0-8fb1-7023c95aa6d0 - - - - - -] Device tape072121f-22 cannot be used as it has no MAC address
Dec 02 10:07:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:38.243 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:38 np0005541913.localdomain kernel: device tape072121f-22 entered promiscuous mode
Dec 02 10:07:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:38Z|00245|binding|INFO|Claiming lport e072121f-2250-4257-996f-2505d571a3a6 for this chassis.
Dec 02 10:07:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:38Z|00246|binding|INFO|e072121f-2250-4257-996f-2505d571a3a6: Claiming unknown
Dec 02 10:07:38 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670058.2508] manager: (tape072121f-22): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Dec 02 10:07:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:38.249 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:38 np0005541913.localdomain systemd-udevd[315762]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:07:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:38Z|00247|binding|INFO|Setting lport e072121f-2250-4257-996f-2505d571a3a6 ovn-installed in OVS
Dec 02 10:07:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:38.263 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:38.266 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:38 np0005541913.localdomain ceph-mon[298296]: pgmap v270: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 853 B/s wr, 27 op/s
Dec 02 10:07:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:38.284 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:38.335 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:38.349 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=e072121f-2250-4257-996f-2505d571a3a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:38Z|00248|binding|INFO|Setting lport e072121f-2250-4257-996f-2505d571a3a6 up in Southbound
Dec 02 10:07:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:38.351 160221 INFO neutron.agent.ovn.metadata.agent [-] Port e072121f-2250-4257-996f-2505d571a3a6 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:07:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:38.353 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:38.354 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[46978b01-3263-4b7a-badb-9d3eba150ad7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:38.371 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:38 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:38.978 2 INFO neutron.agent.securitygroups_rpc [None req-c850bb74-fc0e-4a51-908b-f066332bb7ea 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:39 np0005541913.localdomain podman[315833]: 
Dec 02 10:07:39 np0005541913.localdomain podman[315833]: 2025-12-02 10:07:39.205415103 +0000 UTC m=+0.082304192 container create 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:39 np0005541913.localdomain systemd[1]: Started libpod-conmon-0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093.scope.
Dec 02 10:07:39 np0005541913.localdomain podman[315833]: 2025-12-02 10:07:39.169931434 +0000 UTC m=+0.046820503 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:07:39 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:07:39 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf763cfd106b91e614a3bf8217f658b71d6696de983741c29fd7dbb50f4fb61d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:07:39 np0005541913.localdomain podman[315833]: 2025-12-02 10:07:39.305046529 +0000 UTC m=+0.181935608 container init 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:07:39 np0005541913.localdomain podman[315833]: 2025-12-02 10:07:39.316758352 +0000 UTC m=+0.193647431 container start 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:07:39 np0005541913.localdomain dnsmasq[315851]: started, version 2.85 cachesize 150
Dec 02 10:07:39 np0005541913.localdomain dnsmasq[315851]: DNS service limited to local subnets
Dec 02 10:07:39 np0005541913.localdomain dnsmasq[315851]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:07:39 np0005541913.localdomain dnsmasq[315851]: warning: no upstream servers configured
Dec 02 10:07:39 np0005541913.localdomain dnsmasq[315851]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:39.383 263406 INFO neutron.agent.dhcp.agent [None req-5c1620e9-aa10-42c0-8fb1-7023c95aa6d0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:37Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088bc520>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088bcaf0>], id=4c39f2f9-96bf-4f1d-a072-d7adff3db5da, ip_allocation=immediate, mac_address=fa:16:3e:cb:58:35, name=tempest-NetworksTestDHCPv6-848696062, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=4, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['4215fce0-2435-4b41-9600-1b6971be6569'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:36Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=1720, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:37Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:07:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:39.534 263406 INFO neutron.agent.dhcp.agent [None req-466bf31e-a784-4286-911a-e8a5868942f3 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:07:39 np0005541913.localdomain dnsmasq[315851]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:07:39 np0005541913.localdomain podman[315868]: 2025-12-02 10:07:39.580806736 +0000 UTC m=+0.057367676 container kill 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:07:39 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:39Z|00249|binding|INFO|Releasing lport e072121f-2250-4257-996f-2505d571a3a6 from this chassis (sb_readonly=0)
Dec 02 10:07:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:39.748 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:39 np0005541913.localdomain kernel: device tape072121f-22 left promiscuous mode
Dec 02 10:07:39 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:39Z|00250|binding|INFO|Setting lport e072121f-2250-4257-996f-2505d571a3a6 down in Southbound
Dec 02 10:07:39 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:39.765 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=e072121f-2250-4257-996f-2505d571a3a6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:39 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:39.767 160221 INFO neutron.agent.ovn.metadata.agent [-] Port e072121f-2250-4257-996f-2505d571a3a6 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:07:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:39.767 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:39 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:39.768 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:39 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:39.769 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8707e8-bd51-41c5-a279-bdebd2c6581f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:39.797 263406 INFO neutron.agent.dhcp.agent [None req-f791ce28-6d7d-4c3c-9166-50850a11f7b7 - - - - - -] DHCP configuration for ports {'4c39f2f9-96bf-4f1d-a072-d7adff3db5da'} is completed
Dec 02 10:07:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:39.895 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:40.022 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:07:40 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:40.176 2 INFO neutron.agent.securitygroups_rpc [None req-7bcf89da-35a6-4e58-8ea7-d94146bd4928 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:40 np0005541913.localdomain ceph-mon[298296]: pgmap v271: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 1.4 KiB/s wr, 42 op/s
Dec 02 10:07:40 np0005541913.localdomain dnsmasq[315851]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:40 np0005541913.localdomain podman[315908]: 2025-12-02 10:07:40.586126861 +0000 UTC m=+0.053360988 container kill 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 7d517d9d-ba68-4c0f-b344-6c3be9d614a4.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tape072121f-22 not found in namespace qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4.
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tape072121f-22 not found in namespace qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4.
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.615 263406 ERROR neutron.agent.dhcp.agent 
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.618 263406 INFO neutron.agent.dhcp.agent [None req-c86fd9c9-8c92-4391-b127-f78692fda812 - - - - - -] Synchronizing state
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.900 263406 INFO neutron.agent.dhcp.agent [None req-9232755b-b057-437d-b308-8d060aa8cc33 - - - - - -] All active networks have been fetched through RPC.
Dec 02 10:07:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:40.901 263406 INFO neutron.agent.dhcp.agent [-] Starting network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 dhcp configuration
Dec 02 10:07:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:40.946 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:41.030 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:41 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:41.080 2 INFO neutron.agent.securitygroups_rpc [None req-5e11d272-dc8e-4f26-afbf-45da4f1c93dd c695c8d7887d4f5d99397fbd9a108bd7 27cf39916c5c4bc1833487052acaa22a - - default default] Security group member updated ['202778bd-7cc5-43e0-846c-ad0385193194']
Dec 02 10:07:41 np0005541913.localdomain dnsmasq[315851]: exiting on receipt of SIGTERM
Dec 02 10:07:41 np0005541913.localdomain podman[315939]: 2025-12-02 10:07:41.08771452 +0000 UTC m=+0.064288761 container kill 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:07:41 np0005541913.localdomain systemd[1]: libpod-0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093.scope: Deactivated successfully.
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.091003) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670061091064, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1966, "num_deletes": 267, "total_data_size": 2549366, "memory_usage": 2596800, "flush_reason": "Manual Compaction"}
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670061106019, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1653647, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22634, "largest_seqno": 24595, "table_properties": {"data_size": 1646223, "index_size": 4318, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16375, "raw_average_key_size": 20, "raw_value_size": 1630923, "raw_average_value_size": 2054, "num_data_blocks": 189, "num_entries": 794, "num_filter_entries": 794, "num_deletions": 267, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669940, "oldest_key_time": 1764669940, "file_creation_time": 1764670061, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 15105 microseconds, and 7248 cpu microseconds.
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.106100) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1653647 bytes OK
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.106140) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.108100) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.108128) EVENT_LOG_v1 {"time_micros": 1764670061108120, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.108158) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 2540257, prev total WAL file size 2541006, number of live WAL files 2.
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.109084) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303138' seq:72057594037927935, type:22 .. '6C6F676D0034323731' seq:0, type:0; will stop at (end)
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1614KB)], [36(15MB)]
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670061109161, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 18292158, "oldest_snapshot_seqno": -1}
Dec 02 10:07:41 np0005541913.localdomain podman[315951]: 2025-12-02 10:07:41.170532845 +0000 UTC m=+0.068967085 container died 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12678 keys, 17928820 bytes, temperature: kUnknown
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670061218827, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 17928820, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17854048, "index_size": 41967, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31749, "raw_key_size": 339637, "raw_average_key_size": 26, "raw_value_size": 17635555, "raw_average_value_size": 1391, "num_data_blocks": 1601, "num_entries": 12678, "num_filter_entries": 12678, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670061, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.220846) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 17928820 bytes
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.222575) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.6 rd, 163.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 15.9 +0.0 blob) out(17.1 +0.0 blob), read-write-amplify(21.9) write-amplify(10.8) OK, records in: 13224, records dropped: 546 output_compression: NoCompression
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.222640) EVENT_LOG_v1 {"time_micros": 1764670061222594, "job": 20, "event": "compaction_finished", "compaction_time_micros": 109780, "compaction_time_cpu_micros": 45994, "output_level": 6, "num_output_files": 1, "total_output_size": 17928820, "num_input_records": 13224, "num_output_records": 12678, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670061223028, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670061225537, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.108973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.225690) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.225699) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.225702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.225706) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:07:41 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:07:41.225710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:07:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093-userdata-shm.mount: Deactivated successfully.
Dec 02 10:07:41 np0005541913.localdomain podman[315951]: 2025-12-02 10:07:41.251411969 +0000 UTC m=+0.149846159 container cleanup 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:07:41 np0005541913.localdomain systemd[1]: libpod-conmon-0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093.scope: Deactivated successfully.
Dec 02 10:07:41 np0005541913.localdomain podman[315958]: 2025-12-02 10:07:41.275562525 +0000 UTC m=+0.155686676 container remove 0b5b640ed8daf95d1ba27bc3c6769ede223cce7e4045475d3e61ebff0eb58093 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:07:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:41.340 263406 INFO neutron.agent.linux.ip_lib [-] Device tape072121f-22 cannot be used as it has no MAC address
Dec 02 10:07:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:41.359 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:41 np0005541913.localdomain kernel: device tape072121f-22 entered promiscuous mode
Dec 02 10:07:41 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670061.3646] manager: (tape072121f-22): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Dec 02 10:07:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:41.365 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:41 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:41Z|00251|binding|INFO|Claiming lport e072121f-2250-4257-996f-2505d571a3a6 for this chassis.
Dec 02 10:07:41 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:41Z|00252|binding|INFO|e072121f-2250-4257-996f-2505d571a3a6: Claiming unknown
Dec 02 10:07:41 np0005541913.localdomain systemd-udevd[315985]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:07:41 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:41Z|00253|binding|INFO|Setting lport e072121f-2250-4257-996f-2505d571a3a6 ovn-installed in OVS
Dec 02 10:07:41 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:41Z|00254|binding|INFO|Setting lport e072121f-2250-4257-996f-2505d571a3a6 up in Southbound
Dec 02 10:07:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:41.377 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:41.378 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=e072121f-2250-4257-996f-2505d571a3a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:41.379 160221 INFO neutron.agent.ovn.metadata.agent [-] Port e072121f-2250-4257-996f-2505d571a3a6 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:07:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:41.380 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:41.381 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[8de6cea5-f25b-4962-8b9e-51a506de5a3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:41 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:41 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:41.391 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:41 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:41 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:41 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:41 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:41 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:41 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tape072121f-22: No such device
Dec 02 10:07:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:41.427 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:41.449 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:42 np0005541913.localdomain podman[316054]: 
Dec 02 10:07:42 np0005541913.localdomain podman[316054]: 2025-12-02 10:07:42.084659161 +0000 UTC m=+0.096432951 container create 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:07:42 np0005541913.localdomain systemd[1]: Started libpod-conmon-44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db.scope.
Dec 02 10:07:42 np0005541913.localdomain podman[316054]: 2025-12-02 10:07:42.038232849 +0000 UTC m=+0.050006679 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:07:42 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:07:42 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9bfca610369bb0363be82922c5eec6ba87562c0b61ac20065a0acd14246d573/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:07:42 np0005541913.localdomain podman[316054]: 2025-12-02 10:07:42.176030515 +0000 UTC m=+0.187804305 container init 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 10:07:42 np0005541913.localdomain podman[316054]: 2025-12-02 10:07:42.186413663 +0000 UTC m=+0.198187453 container start 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 02 10:07:42 np0005541913.localdomain dnsmasq[316072]: started, version 2.85 cachesize 150
Dec 02 10:07:42 np0005541913.localdomain dnsmasq[316072]: DNS service limited to local subnets
Dec 02 10:07:42 np0005541913.localdomain dnsmasq[316072]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:07:42 np0005541913.localdomain dnsmasq[316072]: warning: no upstream servers configured
Dec 02 10:07:42 np0005541913.localdomain dnsmasq[316072]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bf763cfd106b91e614a3bf8217f658b71d6696de983741c29fd7dbb50f4fb61d-merged.mount: Deactivated successfully.
Dec 02 10:07:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:42.251 263406 INFO neutron.agent.dhcp.agent [-] Finished network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 dhcp configuration
Dec 02 10:07:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:42.252 263406 INFO neutron.agent.dhcp.agent [None req-9232755b-b057-437d-b308-8d060aa8cc33 - - - - - -] Synchronizing state complete
Dec 02 10:07:42 np0005541913.localdomain ceph-mon[298296]: pgmap v272: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 852 B/s wr, 27 op/s
Dec 02 10:07:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:42.468 263406 INFO neutron.agent.dhcp.agent [None req-eb266d90-0b4a-4720-a38e-2fa2f6803911 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', 'e072121f-2250-4257-996f-2505d571a3a6'} is completed
Dec 02 10:07:42 np0005541913.localdomain dnsmasq[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/addn_hosts - 1 addresses
Dec 02 10:07:42 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/host
Dec 02 10:07:42 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/opts
Dec 02 10:07:42 np0005541913.localdomain podman[316104]: 2025-12-02 10:07:42.53467122 +0000 UTC m=+0.065936365 container kill 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:07:42 np0005541913.localdomain podman[316117]: 2025-12-02 10:07:42.58777677 +0000 UTC m=+0.071381410 container kill 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:07:42 np0005541913.localdomain dnsmasq[316072]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:42 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:42.807 2 INFO neutron.agent.securitygroups_rpc [None req-d64f03a9-f848-43c0-ae5d-2c025d3e76ac 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:43.047 263406 INFO neutron.agent.dhcp.agent [None req-832d7563-db5b-47a9-b1b4-9a0866470304 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', 'e072121f-2250-4257-996f-2505d571a3a6'} is completed
Dec 02 10:07:43 np0005541913.localdomain dnsmasq[316072]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:07:43 np0005541913.localdomain podman[316177]: 2025-12-02 10:07:43.083865852 +0000 UTC m=+0.071751060 container kill 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:07:43 np0005541913.localdomain dnsmasq[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/addn_hosts - 0 addresses
Dec 02 10:07:43 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/host
Dec 02 10:07:43 np0005541913.localdomain podman[316192]: 2025-12-02 10:07:43.128057735 +0000 UTC m=+0.077553095 container kill 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:07:43 np0005541913.localdomain dnsmasq-dhcp[315330]: read /var/lib/neutron/dhcp/a69b1d7f-7b6a-4e57-97d2-3e13016a1afd/opts
Dec 02 10:07:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:43.279 263406 INFO neutron.agent.dhcp.agent [None req-a8f71ee9-d37e-445c-b22b-5ca69a7f864a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:42Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089920a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908992f70>], id=94ca77a3-9f26-4682-9bc1-a9f1d339b3ab, ip_allocation=immediate, mac_address=fa:16:3e:55:87:45, name=tempest-NetworksTestDHCPv6-624163590, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=6, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['cdc6da6b-39a6-4f38-b5f6-c65fbcfe2d84'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:41Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=1743, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:42Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:07:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:43Z|00255|binding|INFO|Releasing lport 308729f2-5cef-4da6-a8d1-12678a8ce24b from this chassis (sb_readonly=0)
Dec 02 10:07:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:43Z|00256|binding|INFO|Setting lport 308729f2-5cef-4da6-a8d1-12678a8ce24b down in Southbound
Dec 02 10:07:43 np0005541913.localdomain kernel: device tap308729f2-5c left promiscuous mode
Dec 02 10:07:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:43.490 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:43.498 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27cf39916c5c4bc1833487052acaa22a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c696e62-8d3a-4321-b6e9-84ecff5ee056, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=308729f2-5cef-4da6-a8d1-12678a8ce24b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:43.499 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 308729f2-5cef-4da6-a8d1-12678a8ce24b in datapath a69b1d7f-7b6a-4e57-97d2-3e13016a1afd unbound from our chassis
Dec 02 10:07:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:43.501 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:07:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:43.502 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[2656bcfb-0423-455d-ba39-6070df2e894f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:43.518 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:43 np0005541913.localdomain podman[316242]: 2025-12-02 10:07:43.537456237 +0000 UTC m=+0.116501607 container kill 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:07:43 np0005541913.localdomain systemd[1]: tmp-crun.1KFbZi.mount: Deactivated successfully.
Dec 02 10:07:43 np0005541913.localdomain dnsmasq[316072]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:07:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:43.539 263406 INFO neutron.agent.dhcp.agent [None req-513abf6a-7940-4289-9c91-69d49c5bb937 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', 'e072121f-2250-4257-996f-2505d571a3a6', '94ca77a3-9f26-4682-9bc1-a9f1d339b3ab'} is completed
Dec 02 10:07:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:43.930 263406 INFO neutron.agent.dhcp.agent [None req-d4832c45-2d9c-474f-ba84-3aa26b005a1f - - - - - -] DHCP configuration for ports {'94ca77a3-9f26-4682-9bc1-a9f1d339b3ab'} is completed
Dec 02 10:07:44 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:44.321 2 INFO neutron.agent.securitygroups_rpc [None req-4929d793-e537-4c53-a2f5-ddc0b60f2500 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:44 np0005541913.localdomain ceph-mon[298296]: pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 852 B/s wr, 27 op/s
Dec 02 10:07:44 np0005541913.localdomain podman[316282]: 2025-12-02 10:07:44.573265418 +0000 UTC m=+0.064178448 container kill 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:07:44 np0005541913.localdomain dnsmasq[316072]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:45 np0005541913.localdomain dnsmasq[316072]: exiting on receipt of SIGTERM
Dec 02 10:07:45 np0005541913.localdomain systemd[1]: libpod-44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db.scope: Deactivated successfully.
Dec 02 10:07:45 np0005541913.localdomain podman[316318]: 2025-12-02 10:07:45.300966747 +0000 UTC m=+0.060616073 container kill 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:07:45 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:07:45 np0005541913.localdomain podman[316330]: 2025-12-02 10:07:45.36722615 +0000 UTC m=+0.055369373 container died 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:07:45 np0005541913.localdomain podman[316330]: 2025-12-02 10:07:45.462188939 +0000 UTC m=+0.150332122 container cleanup 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:45 np0005541913.localdomain systemd[1]: libpod-conmon-44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db.scope: Deactivated successfully.
Dec 02 10:07:45 np0005541913.localdomain podman[316337]: 2025-12-02 10:07:45.487740584 +0000 UTC m=+0.163263410 container remove 44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:07:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:45.502 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:45Z|00257|binding|INFO|Releasing lport e072121f-2250-4257-996f-2505d571a3a6 from this chassis (sb_readonly=0)
Dec 02 10:07:45 np0005541913.localdomain kernel: device tape072121f-22 left promiscuous mode
Dec 02 10:07:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:45Z|00258|binding|INFO|Setting lport e072121f-2250-4257-996f-2505d571a3a6 down in Southbound
Dec 02 10:07:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:45.512 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=e072121f-2250-4257-996f-2505d571a3a6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:45.514 160221 INFO neutron.agent.ovn.metadata.agent [-] Port e072121f-2250-4257-996f-2505d571a3a6 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:07:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:45.516 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:45.516 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4759d7-fafa-4b82-a171-c3406f294a6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:45.523 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:45 np0005541913.localdomain systemd[1]: tmp-crun.xK0ej0.mount: Deactivated successfully.
Dec 02 10:07:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b9bfca610369bb0363be82922c5eec6ba87562c0b61ac20065a0acd14246d573-merged.mount: Deactivated successfully.
Dec 02 10:07:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44c1ba813977e8bbff0e0dcb62f4f0487d4aa476c73a4c6dfe1f9e4c716dd3db-userdata-shm.mount: Deactivated successfully.
Dec 02 10:07:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:07:45 np0005541913.localdomain podman[316358]: 2025-12-02 10:07:45.692543653 +0000 UTC m=+0.092297501 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 02 10:07:45 np0005541913.localdomain podman[316358]: 2025-12-02 10:07:45.708359465 +0000 UTC m=+0.108113373 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:07:45 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:07:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:45.949 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:46.032 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:46 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:07:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:46.242 263406 INFO neutron.agent.dhcp.agent [None req-35a03a92-44c4-4711-a86c-b37cca1320f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "57640910-43d5-4fb7-b9cf-1d15a1cbc8ab", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:07:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "57640910-43d5-4fb7-b9cf-1d15a1cbc8ab", "format": "json"}]: dispatch
Dec 02 10:07:46 np0005541913.localdomain ceph-mon[298296]: pgmap v274: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Dec 02 10:07:46 np0005541913.localdomain ceph-mon[298296]: mgrmap e44: np0005541914.lljzmk(active, since 7m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:07:46 np0005541913.localdomain dnsmasq[315330]: exiting on receipt of SIGTERM
Dec 02 10:07:46 np0005541913.localdomain podman[316397]: 2025-12-02 10:07:46.726308869 +0000 UTC m=+0.063311745 container kill 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:07:46 np0005541913.localdomain systemd[1]: tmp-crun.MlY6G6.mount: Deactivated successfully.
Dec 02 10:07:46 np0005541913.localdomain systemd[1]: libpod-2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f.scope: Deactivated successfully.
Dec 02 10:07:46 np0005541913.localdomain podman[316411]: 2025-12-02 10:07:46.790936178 +0000 UTC m=+0.048918630 container died 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:07:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f-userdata-shm.mount: Deactivated successfully.
Dec 02 10:07:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-55241c631c38010f44af375554574eb4384ea2395685ed6f7e3376d51f08e8a7-merged.mount: Deactivated successfully.
Dec 02 10:07:46 np0005541913.localdomain systemd-journald[47611]: Data hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 75.0 (53725 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Dec 02 10:07:46 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 10:07:46 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 10:07:46 np0005541913.localdomain podman[316411]: 2025-12-02 10:07:46.894353175 +0000 UTC m=+0.152335547 container remove 2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a69b1d7f-7b6a-4e57-97d2-3e13016a1afd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:07:46 np0005541913.localdomain systemd[1]: libpod-conmon-2622ee810d8e0767daa7057fe51a609ee033814717882cdb0e83d8745d102a7f.scope: Deactivated successfully.
Dec 02 10:07:46 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 10:07:46 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:46.988 2 INFO neutron.agent.securitygroups_rpc [None req-771b159d-2ba2-4111-b13e-47ca58a8e2e2 71c1ab73f6584cdc8a5ac07abc1165b6 c83c01183aba40c080a7dde4126b2e3b - - default default] Security group member updated ['8d157c15-6c1c-467c-9dbb-a97c83d265b6']
Dec 02 10:07:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:47.162 263406 INFO neutron.agent.linux.ip_lib [None req-2aa4fae5-cf1c-4a3b-b0ca-d4485ba1cf08 - - - - - -] Device tapc0620a36-cb cannot be used as it has no MAC address
Dec 02 10:07:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:47.188 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:47 np0005541913.localdomain kernel: device tapc0620a36-cb entered promiscuous mode
Dec 02 10:07:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:47Z|00259|binding|INFO|Claiming lport c0620a36-cb6e-4025-9457-fbbe48b68e1f for this chassis.
Dec 02 10:07:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:47.197 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:47Z|00260|binding|INFO|c0620a36-cb6e-4025-9457-fbbe48b68e1f: Claiming unknown
Dec 02 10:07:47 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670067.2000] manager: (tapc0620a36-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Dec 02 10:07:47 np0005541913.localdomain systemd-udevd[316448]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:07:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:47.205 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c0620a36-cb6e-4025-9457-fbbe48b68e1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:47.208 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c0620a36-cb6e-4025-9457-fbbe48b68e1f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:07:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:47.209 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:47.210 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[cd99d2e6-316d-414d-8a72-1b62f05fee9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:47Z|00261|binding|INFO|Setting lport c0620a36-cb6e-4025-9457-fbbe48b68e1f up in Southbound
Dec 02 10:07:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:47Z|00262|binding|INFO|Setting lport c0620a36-cb6e-4025-9457-fbbe48b68e1f ovn-installed in OVS
Dec 02 10:07:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:47.214 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:47.221 263406 INFO neutron.agent.dhcp.agent [None req-b201acc6-3060-4c23-b832-8173b49a7048 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:47.222 263406 INFO neutron.agent.dhcp.agent [None req-b201acc6-3060-4c23-b832-8173b49a7048 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc0620a36-cb: No such device
Dec 02 10:07:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:47.231 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc0620a36-cb: No such device
Dec 02 10:07:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc0620a36-cb: No such device
Dec 02 10:07:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc0620a36-cb: No such device
Dec 02 10:07:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc0620a36-cb: No such device
Dec 02 10:07:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc0620a36-cb: No such device
Dec 02 10:07:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc0620a36-cb: No such device
Dec 02 10:07:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc0620a36-cb: No such device
Dec 02 10:07:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:47.269 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:47.297 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:47 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:47.428 2 INFO neutron.agent.securitygroups_rpc [None req-fa6ee8ca-ed1b-4c8f-b78c-b44d9f9936bd 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:47 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2da69b1d7f\x2d7b6a\x2d4e57\x2d97d2\x2d3e13016a1afd.mount: Deactivated successfully.
Dec 02 10:07:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:47.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:07:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:47.934 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:48 np0005541913.localdomain podman[316519]: 
Dec 02 10:07:48 np0005541913.localdomain podman[316519]: 2025-12-02 10:07:48.24397705 +0000 UTC m=+0.094684893 container create ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:07:48 np0005541913.localdomain podman[316519]: 2025-12-02 10:07:48.198799572 +0000 UTC m=+0.049507485 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:07:48 np0005541913.localdomain systemd[1]: Started libpod-conmon-ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa.scope.
Dec 02 10:07:48 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:07:48 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec160c84537a63a0711c7b78391e0627f8810916382cc0d287179ccc031e3fe6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:07:48 np0005541913.localdomain podman[316519]: 2025-12-02 10:07:48.333238819 +0000 UTC m=+0.183946682 container init ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:07:48 np0005541913.localdomain podman[316519]: 2025-12-02 10:07:48.349567326 +0000 UTC m=+0.200275179 container start ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:07:48 np0005541913.localdomain dnsmasq[316537]: started, version 2.85 cachesize 150
Dec 02 10:07:48 np0005541913.localdomain dnsmasq[316537]: DNS service limited to local subnets
Dec 02 10:07:48 np0005541913.localdomain dnsmasq[316537]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:07:48 np0005541913.localdomain dnsmasq[316537]: warning: no upstream servers configured
Dec 02 10:07:48 np0005541913.localdomain dnsmasq-dhcp[316537]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:07:48 np0005541913.localdomain dnsmasq[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:48 np0005541913.localdomain dnsmasq-dhcp[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:48 np0005541913.localdomain dnsmasq-dhcp[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:48 np0005541913.localdomain ceph-mon[298296]: pgmap v275: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Dec 02 10:07:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:48.425 263406 INFO neutron.agent.dhcp.agent [None req-2aa4fae5-cf1c-4a3b-b0ca-d4485ba1cf08 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:46Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99092516d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a1d610>], id=98bfbb74-6d65-460c-be9e-916f678993ba, ip_allocation=immediate, mac_address=fa:16:3e:18:08:18, name=tempest-NetworksTestDHCPv6-1751893532, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['b7420c97-3129-4103-b655-a67cf1a8fa15'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:45Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=1752, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:47Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:07:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:48Z|00263|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:07:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:48.450 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:48.545 263406 INFO neutron.agent.dhcp.agent [None req-4632cfa4-197c-403e-8cf9-a9a8ad15ea84 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:07:48 np0005541913.localdomain dnsmasq[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:07:48 np0005541913.localdomain dnsmasq-dhcp[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:48 np0005541913.localdomain podman[316556]: 2025-12-02 10:07:48.637072338 +0000 UTC m=+0.068036902 container kill ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:07:48 np0005541913.localdomain dnsmasq-dhcp[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:48.894 263406 INFO neutron.agent.dhcp.agent [None req-fb7e9bec-af17-42f8-a709-f60cc4353cef - - - - - -] DHCP configuration for ports {'98bfbb74-6d65-460c-be9e-916f678993ba'} is completed
Dec 02 10:07:49 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:49.111 2 INFO neutron.agent.securitygroups_rpc [None req-1c456783-7537-497f-860f-91e236f22124 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:49 np0005541913.localdomain dnsmasq[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:49 np0005541913.localdomain dnsmasq-dhcp[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:49 np0005541913.localdomain dnsmasq-dhcp[316537]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:49 np0005541913.localdomain podman[316593]: 2025-12-02 10:07:49.359079203 +0000 UTC m=+0.064255629 container kill ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:07:50 np0005541913.localdomain dnsmasq[316537]: exiting on receipt of SIGTERM
Dec 02 10:07:50 np0005541913.localdomain systemd[1]: tmp-crun.Hp5gM6.mount: Deactivated successfully.
Dec 02 10:07:50 np0005541913.localdomain podman[316632]: 2025-12-02 10:07:50.383080949 +0000 UTC m=+0.080629148 container kill ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 02 10:07:50 np0005541913.localdomain systemd[1]: libpod-ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa.scope: Deactivated successfully.
Dec 02 10:07:50 np0005541913.localdomain ceph-mon[298296]: pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 3.2 KiB/s wr, 15 op/s
Dec 02 10:07:50 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "57640910-43d5-4fb7-b9cf-1d15a1cbc8ab", "format": "json"}]: dispatch
Dec 02 10:07:50 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "57640910-43d5-4fb7-b9cf-1d15a1cbc8ab", "force": true, "format": "json"}]: dispatch
Dec 02 10:07:50 np0005541913.localdomain podman[316648]: 2025-12-02 10:07:50.457193341 +0000 UTC m=+0.052711940 container died ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 02 10:07:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:07:50 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa-userdata-shm.mount: Deactivated successfully.
Dec 02 10:07:50 np0005541913.localdomain podman[316648]: 2025-12-02 10:07:50.527787081 +0000 UTC m=+0.123305630 container remove ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:07:50 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:50Z|00264|binding|INFO|Releasing lport c0620a36-cb6e-4025-9457-fbbe48b68e1f from this chassis (sb_readonly=0)
Dec 02 10:07:50 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:50Z|00265|binding|INFO|Setting lport c0620a36-cb6e-4025-9457-fbbe48b68e1f down in Southbound
Dec 02 10:07:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:50.541 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:50 np0005541913.localdomain kernel: device tapc0620a36-cb left promiscuous mode
Dec 02 10:07:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:50.548 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:50.554 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c0620a36-cb6e-4025-9457-fbbe48b68e1f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:50.555 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c0620a36-cb6e-4025-9457-fbbe48b68e1f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:07:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:50.556 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:50 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:50.557 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[28b79a89-4c3f-423e-9ccc-1f7bc88eb0d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:50.565 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:50 np0005541913.localdomain systemd[1]: libpod-conmon-ec0bb76357444177b973f7c93ec0e73a0338ba982d33a484b1a97bd516990cfa.scope: Deactivated successfully.
Dec 02 10:07:50 np0005541913.localdomain podman[316672]: 2025-12-02 10:07:50.57749442 +0000 UTC m=+0.082603841 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 10:07:50 np0005541913.localdomain podman[316672]: 2025-12-02 10:07:50.583947223 +0000 UTC m=+0.089056604 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 02 10:07:50 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:07:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:50.951 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:51.034 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ec160c84537a63a0711c7b78391e0627f8810916382cc0d287179ccc031e3fe6-merged.mount: Deactivated successfully.
Dec 02 10:07:51 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:07:51 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:51.565 2 INFO neutron.agent.securitygroups_rpc [None req-3a04bc57-bfc9-42ed-a239-801c8326e405 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:51.997 263406 INFO neutron.agent.linux.ip_lib [None req-e1b16a83-8994-4fa7-9c8e-59123f9fe5a9 - - - - - -] Device tap6c308b19-30 cannot be used as it has no MAC address
Dec 02 10:07:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:52.021 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:52 np0005541913.localdomain kernel: device tap6c308b19-30 entered promiscuous mode
Dec 02 10:07:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:52.029 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:52 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670072.0298] manager: (tap6c308b19-30): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Dec 02 10:07:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:52Z|00266|binding|INFO|Claiming lport 6c308b19-30ab-4052-98ab-e96747c0ae90 for this chassis.
Dec 02 10:07:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:52Z|00267|binding|INFO|6c308b19-30ab-4052-98ab-e96747c0ae90: Claiming unknown
Dec 02 10:07:52 np0005541913.localdomain systemd-udevd[316701]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:07:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:52Z|00268|binding|INFO|Setting lport 6c308b19-30ab-4052-98ab-e96747c0ae90 ovn-installed in OVS
Dec 02 10:07:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:52.047 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:52Z|00269|binding|INFO|Setting lport 6c308b19-30ab-4052-98ab-e96747c0ae90 up in Southbound
Dec 02 10:07:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:52.049 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=6c308b19-30ab-4052-98ab-e96747c0ae90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:52.052 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 6c308b19-30ab-4052-98ab-e96747c0ae90 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:07:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:52.054 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:52.056 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6cdd26b8-1b5f-4514-821c-2e776881d07d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:52.063 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:52.071 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:07:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:07:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:52.111 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:52.145 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:52 np0005541913.localdomain podman[316705]: 2025-12-02 10:07:52.181734837 +0000 UTC m=+0.087340727 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 10:07:52 np0005541913.localdomain podman[316705]: 2025-12-02 10:07:52.200342785 +0000 UTC m=+0.105948715 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Dec 02 10:07:52 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:07:52 np0005541913.localdomain systemd[1]: tmp-crun.kTOOP1.mount: Deactivated successfully.
Dec 02 10:07:52 np0005541913.localdomain podman[316706]: 2025-12-02 10:07:52.302384175 +0000 UTC m=+0.206318441 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:07:52 np0005541913.localdomain podman[316706]: 2025-12-02 10:07:52.336138458 +0000 UTC m=+0.240072664 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:07:52 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:07:52 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:52.379 2 INFO neutron.agent.securitygroups_rpc [None req-b88c63e0-efad-4ee2-bdbd-ab6bd93ed0e7 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:52 np0005541913.localdomain ceph-mon[298296]: pgmap v277: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 2.6 KiB/s wr, 0 op/s
Dec 02 10:07:52 np0005541913.localdomain ceph-mon[298296]: mgrmap e45: np0005541914.lljzmk(active, since 7m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:07:52 np0005541913.localdomain podman[316796]: 
Dec 02 10:07:52 np0005541913.localdomain podman[316796]: 2025-12-02 10:07:52.926164813 +0000 UTC m=+0.071236767 container create a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:07:52 np0005541913.localdomain systemd[1]: Started libpod-conmon-a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727.scope.
Dec 02 10:07:52 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:07:52 np0005541913.localdomain podman[316796]: 2025-12-02 10:07:52.896602972 +0000 UTC m=+0.041674886 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:07:53 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb7bccef66fc5b60a58054e9bad7944e15989fa315bde06cc6460ab768eba099/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:07:53 np0005541913.localdomain podman[316796]: 2025-12-02 10:07:53.014459885 +0000 UTC m=+0.159531859 container init a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:53 np0005541913.localdomain podman[316796]: 2025-12-02 10:07:53.023720223 +0000 UTC m=+0.168792177 container start a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:07:53 np0005541913.localdomain dnsmasq[316814]: started, version 2.85 cachesize 150
Dec 02 10:07:53 np0005541913.localdomain dnsmasq[316814]: DNS service limited to local subnets
Dec 02 10:07:53 np0005541913.localdomain dnsmasq[316814]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:07:53 np0005541913.localdomain dnsmasq[316814]: warning: no upstream servers configured
Dec 02 10:07:53 np0005541913.localdomain dnsmasq[316814]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:53.208 263406 INFO neutron.agent.dhcp.agent [None req-aeacfdd5-7f5f-44aa-9dc8-e786852ac3fb - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:07:53 np0005541913.localdomain dnsmasq[316814]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:53 np0005541913.localdomain podman[316830]: 2025-12-02 10:07:53.39404508 +0000 UTC m=+0.065050171 container kill a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:07:53 np0005541913.localdomain systemd[1]: tmp-crun.jC4VEj.mount: Deactivated successfully.
Dec 02 10:07:53 np0005541913.localdomain dnsmasq[316814]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:53 np0005541913.localdomain podman[316867]: 2025-12-02 10:07:53.840793581 +0000 UTC m=+0.058656830 container kill a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:07:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:53.961 263406 INFO neutron.agent.dhcp.agent [None req-b1773246-6ce2-4630-92b3-1885b6ba0b08 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '6c308b19-30ab-4052-98ab-e96747c0ae90'} is completed
Dec 02 10:07:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:53.981 263406 INFO neutron.agent.dhcp.agent [None req-371e7bc4-1bb3-41d7-ba22-5dd81d317899 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:50Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990924a9d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990924a9a0>], id=26fecc46-bb27-4178-a075-902cfd5a6c9d, ip_allocation=immediate, mac_address=fa:16:3e:14:a4:c7, name=tempest-NetworksTestDHCPv6-1634238789, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=10, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['6726367e-635c-4301-a591-316ca0795570'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:50Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=1776, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:51Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:07:54 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:54.022 2 INFO neutron.agent.securitygroups_rpc [None req-1b0d4d6e-60bd-47bc-abae-8825e5c440ec 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:54 np0005541913.localdomain dnsmasq[316814]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:07:54 np0005541913.localdomain podman[316905]: 2025-12-02 10:07:54.156105397 +0000 UTC m=+0.059249326 container kill a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:07:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:54.258 263406 INFO neutron.agent.dhcp.agent [None req-6c17c48d-430e-4f80-8763-874abc010649 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '6c308b19-30ab-4052-98ab-e96747c0ae90'} is completed
Dec 02 10:07:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:54.304 263406 INFO neutron.agent.dhcp.agent [None req-371e7bc4-1bb3-41d7-ba22-5dd81d317899 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:53Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99092396a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9909239820>], id=862d39f9-3328-4142-b9e7-3246db70a1ad, ip_allocation=immediate, mac_address=fa:16:3e:92:92:e1, name=tempest-NetworksTestDHCPv6-274562090, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['08c27b5f-e79f-4e4b-9074-ee591cce28a9'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:53Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=1786, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:53Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:07:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:54.380 263406 INFO neutron.agent.dhcp.agent [None req-5362e9db-c9c9-4caf-b78b-92df89c876c1 - - - - - -] DHCP configuration for ports {'26fecc46-bb27-4178-a075-902cfd5a6c9d'} is completed
Dec 02 10:07:54 np0005541913.localdomain ceph-mon[298296]: pgmap v278: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 5.8 KiB/s wr, 2 op/s
Dec 02 10:07:54 np0005541913.localdomain dnsmasq[316814]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:07:54 np0005541913.localdomain podman[316943]: 2025-12-02 10:07:54.482531799 +0000 UTC m=+0.060646252 container kill a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:07:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:54.724 263406 INFO neutron.agent.dhcp.agent [None req-60247a16-660f-4e76-be9f-05c973ed23e2 - - - - - -] DHCP configuration for ports {'862d39f9-3328-4142-b9e7-3246db70a1ad'} is completed
Dec 02 10:07:54 np0005541913.localdomain dnsmasq[316814]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:07:54 np0005541913.localdomain systemd[1]: tmp-crun.t5kWew.mount: Deactivated successfully.
Dec 02 10:07:54 np0005541913.localdomain podman[316982]: 2025-12-02 10:07:54.790368705 +0000 UTC m=+0.050496972 container kill a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:07:54 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:54.810 2 INFO neutron.agent.securitygroups_rpc [None req-f2f34722-a858-417d-bb9f-16583a7fb9bc 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:55 np0005541913.localdomain dnsmasq[316814]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:55 np0005541913.localdomain podman[317020]: 2025-12-02 10:07:55.103943954 +0000 UTC m=+0.069478970 container kill a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:07:55 np0005541913.localdomain dnsmasq[316814]: exiting on receipt of SIGTERM
Dec 02 10:07:55 np0005541913.localdomain podman[317058]: 2025-12-02 10:07:55.662192399 +0000 UTC m=+0.058132196 container kill a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:07:55 np0005541913.localdomain systemd[1]: libpod-a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727.scope: Deactivated successfully.
Dec 02 10:07:55 np0005541913.localdomain podman[317076]: 2025-12-02 10:07:55.732341547 +0000 UTC m=+0.049373673 container died a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 02 10:07:55 np0005541913.localdomain systemd[1]: tmp-crun.jTu23E.mount: Deactivated successfully.
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:07:55 np0005541913.localdomain podman[317076]: 2025-12-02 10:07:55.834437237 +0000 UTC m=+0.151469313 container remove a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:07:55 np0005541913.localdomain systemd[1]: libpod-conmon-a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727.scope: Deactivated successfully.
Dec 02 10:07:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:55Z|00270|binding|INFO|Releasing lport 6c308b19-30ab-4052-98ab-e96747c0ae90 from this chassis (sb_readonly=0)
Dec 02 10:07:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:55Z|00271|binding|INFO|Setting lport 6c308b19-30ab-4052-98ab-e96747c0ae90 down in Southbound
Dec 02 10:07:55 np0005541913.localdomain kernel: device tap6c308b19-30 left promiscuous mode
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.849 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.850 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.850 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.850 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.851 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:07:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:55.854 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=6c308b19-30ab-4052-98ab-e96747c0ae90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:55.856 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 6c308b19-30ab-4052-98ab-e96747c0ae90 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:07:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:55.857 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:55.858 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[34be7a7e-3f4f-4c23-b1ab-91a98f348ab1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.869 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:55.954 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.036 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:56 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:56.070 263406 INFO neutron.agent.dhcp.agent [None req-fbf7cabc-b893-4cd2-bf1e-11bd5f6844a3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:07:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:07:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:07:56 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4256956907' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.272 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.370 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.370 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:07:56 np0005541913.localdomain ceph-mon[298296]: pgmap v279: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 5.8 KiB/s wr, 2 op/s
Dec 02 10:07:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/4256956907' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.605 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.606 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11277MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.607 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.607 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:07:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bb7bccef66fc5b60a58054e9bad7944e15989fa315bde06cc6460ab768eba099-merged.mount: Deactivated successfully.
Dec 02 10:07:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a41e0e52d32c1b89548a11cf02d386351972898f22d0631e2368bd0bb155a727-userdata-shm.mount: Deactivated successfully.
Dec 02 10:07:56 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.932 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.933 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:07:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:56.933 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.227 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:07:57 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:57.525 263406 INFO neutron.agent.linux.ip_lib [None req-e080462e-3f44-4e89-ba5a-ed3b2061c837 - - - - - -] Device tap1374f02b-78 cannot be used as it has no MAC address
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.551 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:57 np0005541913.localdomain kernel: device tap1374f02b-78 entered promiscuous mode
Dec 02 10:07:57 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670077.5602] manager: (tap1374f02b-78): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.564 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:57 np0005541913.localdomain systemd-udevd[317151]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:07:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:57Z|00272|binding|INFO|Claiming lport 1374f02b-78ae-4718-ac96-c95d5911f385 for this chassis.
Dec 02 10:07:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:57Z|00273|binding|INFO|1374f02b-78ae-4718-ac96-c95d5911f385: Claiming unknown
Dec 02 10:07:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:57.584 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=1374f02b-78ae-4718-ac96-c95d5911f385) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:07:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:57.586 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 1374f02b-78ae-4718-ac96-c95d5911f385 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:07:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:57.588 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:07:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:07:57.589 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca94d17-cec6-45b6-a671-c9e949974133]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:07:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:57Z|00274|binding|INFO|Setting lport 1374f02b-78ae-4718-ac96-c95d5911f385 ovn-installed in OVS
Dec 02 10:07:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:07:57Z|00275|binding|INFO|Setting lport 1374f02b-78ae-4718-ac96-c95d5911f385 up in Southbound
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.592 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.606 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.641 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.669 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:07:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:07:57 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1390269581' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.774 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.780 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.803 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.806 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:07:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:57.807 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:07:57 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:57.905 2 INFO neutron.agent.securitygroups_rpc [None req-2105c7c1-c7a9-4dc4-9a73-811f6d407872 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:58 np0005541913.localdomain ceph-mon[298296]: pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 5.8 KiB/s wr, 2 op/s
Dec 02 10:07:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1390269581' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:07:58 np0005541913.localdomain podman[317208]: 
Dec 02 10:07:58 np0005541913.localdomain podman[317208]: 2025-12-02 10:07:58.555426591 +0000 UTC m=+0.100993582 container create ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:07:58 np0005541913.localdomain podman[317208]: 2025-12-02 10:07:58.507603532 +0000 UTC m=+0.053170463 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:07:58 np0005541913.localdomain systemd[1]: Started libpod-conmon-ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9.scope.
Dec 02 10:07:58 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:07:58 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d62d032cda511062320377806f59ee00027d3131c76817008c45ca8736e10/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:07:58 np0005541913.localdomain podman[317208]: 2025-12-02 10:07:58.639835619 +0000 UTC m=+0.185402540 container init ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:07:58 np0005541913.localdomain podman[317208]: 2025-12-02 10:07:58.650433113 +0000 UTC m=+0.196000034 container start ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:07:58 np0005541913.localdomain dnsmasq[317226]: started, version 2.85 cachesize 150
Dec 02 10:07:58 np0005541913.localdomain dnsmasq[317226]: DNS service limited to local subnets
Dec 02 10:07:58 np0005541913.localdomain dnsmasq[317226]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:07:58 np0005541913.localdomain dnsmasq[317226]: warning: no upstream servers configured
Dec 02 10:07:58 np0005541913.localdomain dnsmasq-dhcp[317226]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:07:58 np0005541913.localdomain dnsmasq[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:58 np0005541913.localdomain dnsmasq-dhcp[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:58 np0005541913.localdomain dnsmasq-dhcp[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:58 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:58.714 263406 INFO neutron.agent.dhcp.agent [None req-e080462e-3f44-4e89-ba5a-ed3b2061c837 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:07:57Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088b0df0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088b0be0>], id=fd805fb2-3db5-4232-89a5-c0a4f0358d1c, ip_allocation=immediate, mac_address=fa:16:3e:c7:34:a5, name=tempest-NetworksTestDHCPv6-395330461, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=14, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['6b7f0265-c40a-4328-926b-3221114b8b73'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:56Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=1829, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:07:57Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:07:58 np0005541913.localdomain dnsmasq[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:07:58 np0005541913.localdomain dnsmasq-dhcp[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:58 np0005541913.localdomain dnsmasq-dhcp[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:58 np0005541913.localdomain podman[317245]: 2025-12-02 10:07:58.907385698 +0000 UTC m=+0.059344739 container kill ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:07:58 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:58.917 263406 INFO neutron.agent.dhcp.agent [None req-7e132dcc-b1ce-4a63-b4f6-6a8c309fa147 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:07:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:07:59.277 263406 INFO neutron.agent.dhcp.agent [None req-0bec8322-6a2f-45f9-b6fa-7909f652b8fa - - - - - -] DHCP configuration for ports {'fd805fb2-3db5-4232-89a5-c0a4f0358d1c'} is completed
Dec 02 10:07:59 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:07:59.457 2 INFO neutron.agent.securitygroups_rpc [None req-abbcf6d8-096d-46e2-96f3-3a8543ab77e7 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:07:59 np0005541913.localdomain systemd[1]: tmp-crun.B96ZiP.mount: Deactivated successfully.
Dec 02 10:07:59 np0005541913.localdomain dnsmasq[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:07:59 np0005541913.localdomain dnsmasq-dhcp[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:07:59 np0005541913.localdomain podman[317281]: 2025-12-02 10:07:59.676040041 +0000 UTC m=+0.072856100 container kill ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:07:59 np0005541913.localdomain dnsmasq-dhcp[317226]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:07:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:59.808 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:07:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:59.808 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:07:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:07:59.809 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:08:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:00.313 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:08:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:00.314 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:08:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:00.314 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:08:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:00.314 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:08:00 np0005541913.localdomain dnsmasq[317226]: exiting on receipt of SIGTERM
Dec 02 10:08:00 np0005541913.localdomain podman[317318]: 2025-12-02 10:08:00.435744006 +0000 UTC m=+0.062428291 container kill ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:08:00 np0005541913.localdomain systemd[1]: libpod-ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9.scope: Deactivated successfully.
Dec 02 10:08:00 np0005541913.localdomain ceph-mon[298296]: pgmap v281: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 6.2 KiB/s wr, 2 op/s
Dec 02 10:08:00 np0005541913.localdomain podman[317330]: 2025-12-02 10:08:00.524686005 +0000 UTC m=+0.074810191 container died ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:08:00 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:08:00 np0005541913.localdomain podman[317330]: 2025-12-02 10:08:00.561027848 +0000 UTC m=+0.111151984 container cleanup ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 02 10:08:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-916d62d032cda511062320377806f59ee00027d3131c76817008c45ca8736e10-merged.mount: Deactivated successfully.
Dec 02 10:08:00 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:00 np0005541913.localdomain systemd[1]: libpod-conmon-ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9.scope: Deactivated successfully.
Dec 02 10:08:00 np0005541913.localdomain podman[317332]: 2025-12-02 10:08:00.598924422 +0000 UTC m=+0.142018261 container remove ae9444f9a54a06ee96e3610b624615322c84b4fa1a226fce64ecd09c57b5bea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:08:00 np0005541913.localdomain kernel: device tap1374f02b-78 left promiscuous mode
Dec 02 10:08:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:00Z|00276|binding|INFO|Releasing lport 1374f02b-78ae-4718-ac96-c95d5911f385 from this chassis (sb_readonly=0)
Dec 02 10:08:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:00Z|00277|binding|INFO|Setting lport 1374f02b-78ae-4718-ac96-c95d5911f385 down in Southbound
Dec 02 10:08:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:00.610 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:00.619 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=1374f02b-78ae-4718-ac96-c95d5911f385) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:00.621 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 1374f02b-78ae-4718-ac96-c95d5911f385 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:00.622 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:00.623 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6003d9-6e45-4899-bcdc-d66a60e1fa56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:00.632 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:00 np0005541913.localdomain systemd[1]: tmp-crun.EzPhgK.mount: Deactivated successfully.
Dec 02 10:08:00 np0005541913.localdomain podman[317358]: 2025-12-02 10:08:00.646972957 +0000 UTC m=+0.088943631 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd)
Dec 02 10:08:00 np0005541913.localdomain podman[317358]: 2025-12-02 10:08:00.688975211 +0000 UTC m=+0.130945875 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd)
Dec 02 10:08:00 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:08:00 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:08:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:00.958 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.038 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.156 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:08:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.189 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.189 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.190 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:01.728 263406 INFO neutron.agent.linux.ip_lib [None req-36be447b-ea0e-4f8b-8fba-2471c99f5eb4 - - - - - -] Device tapb4564215-e5 cannot be used as it has no MAC address
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.796 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:01 np0005541913.localdomain kernel: device tapb4564215-e5 entered promiscuous mode
Dec 02 10:08:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:01Z|00278|binding|INFO|Claiming lport b4564215-e5ad-45ee-8436-ae119e1d9e06 for this chassis.
Dec 02 10:08:01 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670081.8051] manager: (tapb4564215-e5): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Dec 02 10:08:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:01Z|00279|binding|INFO|b4564215-e5ad-45ee-8436-ae119e1d9e06: Claiming unknown
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.805 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:01 np0005541913.localdomain systemd-udevd[317391]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:08:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:01Z|00280|binding|INFO|Setting lport b4564215-e5ad-45ee-8436-ae119e1d9e06 up in Southbound
Dec 02 10:08:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:01Z|00281|binding|INFO|Setting lport b4564215-e5ad-45ee-8436-ae119e1d9e06 ovn-installed in OVS
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.817 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:01.815 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=b4564215-e5ad-45ee-8436-ae119e1d9e06) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:01.818 160221 INFO neutron.agent.ovn.metadata.agent [-] Port b4564215-e5ad-45ee-8436-ae119e1d9e06 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:01.820 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:01 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:01.821 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ec247fca-43d6-4b02-bcbd-a5581ff0ec04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.837 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:01 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4564215-e5: No such device
Dec 02 10:08:01 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4564215-e5: No such device
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.844 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:01 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4564215-e5: No such device
Dec 02 10:08:01 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4564215-e5: No such device
Dec 02 10:08:01 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4564215-e5: No such device
Dec 02 10:08:01 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4564215-e5: No such device
Dec 02 10:08:01 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4564215-e5: No such device
Dec 02 10:08:01 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4564215-e5: No such device
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.882 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:01.920 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e134 e134: 6 total, 6 up, 6 in
Dec 02 10:08:02 np0005541913.localdomain ceph-mon[298296]: pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 3.7 KiB/s wr, 1 op/s
Dec 02 10:08:02 np0005541913.localdomain podman[317462]: 
Dec 02 10:08:02 np0005541913.localdomain podman[317462]: 2025-12-02 10:08:02.822828418 +0000 UTC m=+0.085952111 container create 9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:02.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:02.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:02.847 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:02.848 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:02.848 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 02 10:08:02 np0005541913.localdomain systemd[1]: Started libpod-conmon-9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a.scope.
Dec 02 10:08:02 np0005541913.localdomain podman[317462]: 2025-12-02 10:08:02.782716634 +0000 UTC m=+0.045840337 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:02 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:02 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6549eae79605d48e71d9eff56b3544cee41554d4da99836d1204d64a2125610/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:02 np0005541913.localdomain podman[317462]: 2025-12-02 10:08:02.902001226 +0000 UTC m=+0.165124929 container init 9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:02 np0005541913.localdomain podman[317462]: 2025-12-02 10:08:02.914641434 +0000 UTC m=+0.177765137 container start 9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:02 np0005541913.localdomain dnsmasq[317479]: started, version 2.85 cachesize 150
Dec 02 10:08:02 np0005541913.localdomain dnsmasq[317479]: DNS service limited to local subnets
Dec 02 10:08:02 np0005541913.localdomain dnsmasq[317479]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:02 np0005541913.localdomain dnsmasq[317479]: warning: no upstream servers configured
Dec 02 10:08:02 np0005541913.localdomain dnsmasq-dhcp[317479]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:08:02 np0005541913.localdomain dnsmasq[317479]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:02 np0005541913.localdomain dnsmasq-dhcp[317479]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:02 np0005541913.localdomain dnsmasq-dhcp[317479]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:03.051 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:08:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:03.052 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:08:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:03.053 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:08:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:03.076 263406 INFO neutron.agent.dhcp.agent [None req-7ca79e0d-3e91-40ab-8fe5-95ee712c3a95 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:08:03 np0005541913.localdomain dnsmasq[317479]: exiting on receipt of SIGTERM
Dec 02 10:08:03 np0005541913.localdomain podman[317498]: 2025-12-02 10:08:03.26177628 +0000 UTC m=+0.063697504 container kill 9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:08:03 np0005541913.localdomain systemd[1]: libpod-9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a.scope: Deactivated successfully.
Dec 02 10:08:03 np0005541913.localdomain podman[317511]: 2025-12-02 10:08:03.342349836 +0000 UTC m=+0.065988517 container died 9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 02 10:08:03 np0005541913.localdomain podman[317511]: 2025-12-02 10:08:03.373289144 +0000 UTC m=+0.096927785 container cleanup 9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:08:03 np0005541913.localdomain systemd[1]: libpod-conmon-9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a.scope: Deactivated successfully.
Dec 02 10:08:03 np0005541913.localdomain podman[317513]: 2025-12-02 10:08:03.417878167 +0000 UTC m=+0.131103969 container remove 9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:03Z|00282|binding|INFO|Releasing lport b4564215-e5ad-45ee-8436-ae119e1d9e06 from this chassis (sb_readonly=0)
Dec 02 10:08:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:03Z|00283|binding|INFO|Setting lport b4564215-e5ad-45ee-8436-ae119e1d9e06 down in Southbound
Dec 02 10:08:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:03.430 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:03 np0005541913.localdomain kernel: device tapb4564215-e5 left promiscuous mode
Dec 02 10:08:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:03.438 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=b4564215-e5ad-45ee-8436-ae119e1d9e06) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:03.440 160221 INFO neutron.agent.ovn.metadata.agent [-] Port b4564215-e5ad-45ee-8436-ae119e1d9e06 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:03.441 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:03.442 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[575d9f4f-32f0-41ff-b5e6-9d0d16982ae4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:03.454 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e135 e135: 6 total, 6 up, 6 in
Dec 02 10:08:03 np0005541913.localdomain ceph-mon[298296]: osdmap e134: 6 total, 6 up, 6 in
Dec 02 10:08:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e6549eae79605d48e71d9eff56b3544cee41554d4da99836d1204d64a2125610-merged.mount: Deactivated successfully.
Dec 02 10:08:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c1991448b05466eceb902376daedc49a59dd72caeac87c5632d5f49d7c6b05a-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:03 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:08:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:08:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:08:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:08:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:08:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:08:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:08:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:08:04 np0005541913.localdomain ceph-mon[298296]: pgmap v284: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 3.0 KiB/s wr, 20 op/s
Dec 02 10:08:04 np0005541913.localdomain ceph-mon[298296]: osdmap e135: 6 total, 6 up, 6 in
Dec 02 10:08:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2242023877' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:08:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2808362731' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:08:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2808362731' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:08:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:04.633 263406 INFO neutron.agent.linux.ip_lib [None req-c75d3854-3022-40bb-a2c1-5e68a7c768e5 - - - - - -] Device tap679c782c-2b cannot be used as it has no MAC address
Dec 02 10:08:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e136 e136: 6 total, 6 up, 6 in
Dec 02 10:08:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:04.662 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:04 np0005541913.localdomain kernel: device tap679c782c-2b entered promiscuous mode
Dec 02 10:08:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:04Z|00284|binding|INFO|Claiming lport 679c782c-2b17-40db-9e68-0b3c95332c3f for this chassis.
Dec 02 10:08:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:04Z|00285|binding|INFO|679c782c-2b17-40db-9e68-0b3c95332c3f: Claiming unknown
Dec 02 10:08:04 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670084.6710] manager: (tap679c782c-2b): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Dec 02 10:08:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:04.670 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:04Z|00286|binding|INFO|Setting lport 679c782c-2b17-40db-9e68-0b3c95332c3f ovn-installed in OVS
Dec 02 10:08:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:04Z|00287|binding|INFO|Setting lport 679c782c-2b17-40db-9e68-0b3c95332c3f up in Southbound
Dec 02 10:08:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:04.687 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=679c782c-2b17-40db-9e68-0b3c95332c3f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:04.687 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:04.690 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 679c782c-2b17-40db-9e68-0b3c95332c3f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:04.692 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:04.694 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[4d600042-eb86-4f7a-8cc2-27e5cc48631e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:04 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap679c782c-2b: No such device
Dec 02 10:08:04 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap679c782c-2b: No such device
Dec 02 10:08:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:04.708 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:04 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap679c782c-2b: No such device
Dec 02 10:08:04 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap679c782c-2b: No such device
Dec 02 10:08:04 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap679c782c-2b: No such device
Dec 02 10:08:04 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap679c782c-2b: No such device
Dec 02 10:08:04 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap679c782c-2b: No such device
Dec 02 10:08:04 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap679c782c-2b: No such device
Dec 02 10:08:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:04.746 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:04 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:04.777 2 INFO neutron.agent.securitygroups_rpc [None req-22f3ee62-f7aa-4000-8792-d140ffb54960 ea09fd599b014976b4b6d101bd660615 64d30b95640d4bc4991756da49cb0163 - - default default] Security group member updated ['e4e82d11-7ddc-4424-b13a-044ca8b63239']
Dec 02 10:08:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:04.781 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:04.839 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:04.839 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:08:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:08:05 np0005541913.localdomain systemd[1]: tmp-crun.fJcQ2O.mount: Deactivated successfully.
Dec 02 10:08:05 np0005541913.localdomain podman[317601]: 2025-12-02 10:08:05.449842207 +0000 UTC m=+0.085043565 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller)
Dec 02 10:08:05 np0005541913.localdomain podman[317600]: 2025-12-02 10:08:05.426826092 +0000 UTC m=+0.069253604 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:08:05 np0005541913.localdomain podman[317601]: 2025-12-02 10:08:05.504984403 +0000 UTC m=+0.140185701 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:05 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:08:05 np0005541913.localdomain podman[317600]: 2025-12-02 10:08:05.560598591 +0000 UTC m=+0.203026023 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:08:05 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:08:05 np0005541913.localdomain ceph-mon[298296]: osdmap e136: 6 total, 6 up, 6 in
Dec 02 10:08:05 np0005541913.localdomain ceph-mon[298296]: pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 4.2 KiB/s wr, 33 op/s
Dec 02 10:08:05 np0005541913.localdomain podman[317668]: 2025-12-02 10:08:05.729679794 +0000 UTC m=+0.091822928 container create 79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95.scope.
Dec 02 10:08:05 np0005541913.localdomain podman[317668]: 2025-12-02 10:08:05.680942541 +0000 UTC m=+0.043085695 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082f904709c06aad040c93827c634ea3c26e9324146f3ba6d2bffe5cb021019d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:05 np0005541913.localdomain podman[317668]: 2025-12-02 10:08:05.803060477 +0000 UTC m=+0.165203621 container init 79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:08:05 np0005541913.localdomain podman[317668]: 2025-12-02 10:08:05.814180975 +0000 UTC m=+0.176324109 container start 79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:08:05 np0005541913.localdomain dnsmasq[317686]: started, version 2.85 cachesize 150
Dec 02 10:08:05 np0005541913.localdomain dnsmasq[317686]: DNS service limited to local subnets
Dec 02 10:08:05 np0005541913.localdomain dnsmasq[317686]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:05 np0005541913.localdomain dnsmasq[317686]: warning: no upstream servers configured
Dec 02 10:08:05 np0005541913.localdomain dnsmasq-dhcp[317686]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:08:05 np0005541913.localdomain dnsmasq[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:05 np0005541913.localdomain dnsmasq-dhcp[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:05 np0005541913.localdomain dnsmasq-dhcp[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:05 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:05.914 263406 INFO neutron.agent.dhcp.agent [None req-c75d3854-3022-40bb-a2c1-5e68a7c768e5 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:04Z, description=, device_id=5d9ffd1e-177f-41cc-b69e-dee4698e0c88, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908976910>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99092d3ca0>], id=40102fd5-d7e2-4d44-b447-185810364f71, ip_allocation=immediate, mac_address=fa:16:3e:08:88:92, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['660e51c9-d82a-4643-a274-dee902233c50'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:03Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=False, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1883, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:04Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:08:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:05.961 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:08:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:08:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:06.039 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.049 263406 INFO neutron.agent.dhcp.agent [None req-9ac91f9e-c570-45fd-8942-8e5b7e1e60ee - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:08:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:08:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156100 "" "Go-http-client/1.1"
Dec 02 10:08:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:08:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19237 "" "Go-http-client/1.1"
Dec 02 10:08:06 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:06.102 2 INFO neutron.agent.securitygroups_rpc [None req-c8dc5996-311b-454a-bef8-be44e05069d7 ea09fd599b014976b4b6d101bd660615 64d30b95640d4bc4991756da49cb0163 - - default default] Security group member updated ['e4e82d11-7ddc-4424-b13a-044ca8b63239']
Dec 02 10:08:06 np0005541913.localdomain dnsmasq[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:08:06 np0005541913.localdomain dnsmasq-dhcp[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:06 np0005541913.localdomain dnsmasq-dhcp[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:06 np0005541913.localdomain podman[317704]: 2025-12-02 10:08:06.147596012 +0000 UTC m=+0.096631333 container kill 79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:06 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:06Z|00288|binding|INFO|Releasing lport 679c782c-2b17-40db-9e68-0b3c95332c3f from this chassis (sb_readonly=0)
Dec 02 10:08:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:06.320 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:06 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:06Z|00289|binding|INFO|Setting lport 679c782c-2b17-40db-9e68-0b3c95332c3f down in Southbound
Dec 02 10:08:06 np0005541913.localdomain kernel: device tap679c782c-2b left promiscuous mode
Dec 02 10:08:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:06.342 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:06.498 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=679c782c-2b17-40db-9e68-0b3c95332c3f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:06.501 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 679c782c-2b17-40db-9e68-0b3c95332c3f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:06.502 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:06 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:06.503 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6df68703-dacf-456e-b8d5-70dfa04e8b90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.522 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:04Z, description=, device_id=5d9ffd1e-177f-41cc-b69e-dee4698e0c88, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990923eb50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9909230be0>], id=40102fd5-d7e2-4d44-b447-185810364f71, ip_allocation=immediate, mac_address=fa:16:3e:08:88:92, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['660e51c9-d82a-4643-a274-dee902233c50'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:03Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=False, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1883, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:04Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.563 263406 INFO neutron.agent.dhcp.agent [None req-25f35ce1-83ad-464f-a42a-5cad09666ec5 - - - - - -] DHCP configuration for ports {'40102fd5-d7e2-4d44-b447-185810364f71'} is completed
Dec 02 10:08:06 np0005541913.localdomain dnsmasq[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:08:06 np0005541913.localdomain dnsmasq-dhcp[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:06 np0005541913.localdomain podman[317744]: 2025-12-02 10:08:06.860599752 +0000 UTC m=+0.199186108 container kill 79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:08:06 np0005541913.localdomain dnsmasq-dhcp[317686]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e137 e137: 6 total, 6 up, 6 in
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 7d517d9d-ba68-4c0f-b344-6c3be9d614a4.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap679c782c-2b not found in namespace qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4.
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap679c782c-2b not found in namespace qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4.
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.893 263406 ERROR neutron.agent.dhcp.agent 
Dec 02 10:08:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:06.896 263406 INFO neutron.agent.dhcp.agent [None req-9232755b-b057-437d-b308-8d060aa8cc33 - - - - - -] Synchronizing state
Dec 02 10:08:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:07.164 263406 INFO neutron.agent.dhcp.agent [None req-442f8f65-bc7e-44bc-888d-3c567e35c336 - - - - - -] DHCP configuration for ports {'40102fd5-d7e2-4d44-b447-185810364f71'} is completed
Dec 02 10:08:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:07.251 263406 INFO neutron.agent.dhcp.agent [None req-df0a516e-2ba9-46d1-bdd1-2505dc3dca33 - - - - - -] All active networks have been fetched through RPC.
Dec 02 10:08:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:07.252 263406 INFO neutron.agent.dhcp.agent [-] Starting network 207d2359-3afb-4aa2-9836-cfce83873d96 dhcp configuration
Dec 02 10:08:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:07.253 263406 INFO neutron.agent.dhcp.agent [-] Finished network 207d2359-3afb-4aa2-9836-cfce83873d96 dhcp configuration
Dec 02 10:08:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:07.254 263406 INFO neutron.agent.dhcp.agent [-] Starting network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 dhcp configuration
Dec 02 10:08:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2347714614' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:08:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2178363026' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:08:07 np0005541913.localdomain ceph-mon[298296]: osdmap e137: 6 total, 6 up, 6 in
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.439 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:07 np0005541913.localdomain dnsmasq[317686]: exiting on receipt of SIGTERM
Dec 02 10:08:07 np0005541913.localdomain systemd[1]: libpod-79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95.scope: Deactivated successfully.
Dec 02 10:08:07 np0005541913.localdomain podman[317775]: 2025-12-02 10:08:07.480363998 +0000 UTC m=+0.103546871 container kill 79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:07 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:07.528 2 INFO neutron.agent.securitygroups_rpc [None req-99b1e585-32ae-4cc8-9a4d-b88a12900723 b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:08:07 np0005541913.localdomain podman[317791]: 2025-12-02 10:08:07.54911036 +0000 UTC m=+0.052463659 container died 79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:08:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-082f904709c06aad040c93827c634ea3c26e9324146f3ba6d2bffe5cb021019d-merged.mount: Deactivated successfully.
Dec 02 10:08:07 np0005541913.localdomain podman[317791]: 2025-12-02 10:08:07.596079101 +0000 UTC m=+0.099432380 container remove 79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:07.633 263406 INFO neutron.agent.linux.ip_lib [-] Device tap679c782c-2b cannot be used as it has no MAC address
Dec 02 10:08:07 np0005541913.localdomain systemd[1]: libpod-conmon-79185058bbd6d7df07adf4ff88d4db7de0da968b34b710c2de74f2dcbc607d95.scope: Deactivated successfully.
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.653 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:07 np0005541913.localdomain kernel: device tap679c782c-2b entered promiscuous mode
Dec 02 10:08:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:07Z|00290|binding|INFO|Claiming lport 679c782c-2b17-40db-9e68-0b3c95332c3f for this chassis.
Dec 02 10:08:07 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670087.6602] manager: (tap679c782c-2b): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Dec 02 10:08:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:07Z|00291|binding|INFO|679c782c-2b17-40db-9e68-0b3c95332c3f: Claiming unknown
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.660 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:07Z|00292|binding|INFO|Setting lport 679c782c-2b17-40db-9e68-0b3c95332c3f ovn-installed in OVS
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.666 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:07Z|00293|binding|INFO|Setting lport 679c782c-2b17-40db-9e68-0b3c95332c3f up in Southbound
Dec 02 10:08:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:07.669 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=679c782c-2b17-40db-9e68-0b3c95332c3f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:07.671 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 679c782c-2b17-40db-9e68-0b3c95332c3f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.670 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:07.672 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:07.673 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[edc0eae8-77a9-4408-93e9-f30a2cb7097d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.699 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.727 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.757 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 02 10:08:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:07.842 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 02 10:08:08 np0005541913.localdomain ceph-mon[298296]: pgmap v289: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 5.5 KiB/s wr, 44 op/s
Dec 02 10:08:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "60995fd9-a7c9-4e80-ba2f-4e09200b332e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:08:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "60995fd9-a7c9-4e80-ba2f-4e09200b332e", "format": "json"}]: dispatch
Dec 02 10:08:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:08:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2385997552' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:08:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1668397609' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:08:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1668397609' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:08:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e138 e138: 6 total, 6 up, 6 in
Dec 02 10:08:08 np0005541913.localdomain podman[317874]: 
Dec 02 10:08:08 np0005541913.localdomain podman[317874]: 2025-12-02 10:08:08.447973864 +0000 UTC m=+0.073162932 container create bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:08:08 np0005541913.localdomain systemd[1]: Started libpod-conmon-bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d.scope.
Dec 02 10:08:08 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:08 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72541e5e7751300a91567c94e6dac3a39f1f6779176b58ffc5f771b9c7663f69/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:08 np0005541913.localdomain podman[317874]: 2025-12-02 10:08:08.514553897 +0000 UTC m=+0.139742925 container init bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:08 np0005541913.localdomain podman[317874]: 2025-12-02 10:08:08.420135322 +0000 UTC m=+0.045324350 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:08 np0005541913.localdomain podman[317874]: 2025-12-02 10:08:08.52363742 +0000 UTC m=+0.148826448 container start bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:08 np0005541913.localdomain systemd[1]: tmp-crun.TdNtZ8.mount: Deactivated successfully.
Dec 02 10:08:08 np0005541913.localdomain dnsmasq[317892]: started, version 2.85 cachesize 150
Dec 02 10:08:08 np0005541913.localdomain dnsmasq[317892]: DNS service limited to local subnets
Dec 02 10:08:08 np0005541913.localdomain dnsmasq[317892]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:08 np0005541913.localdomain dnsmasq[317892]: warning: no upstream servers configured
Dec 02 10:08:08 np0005541913.localdomain dnsmasq-dhcp[317892]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:08:08 np0005541913.localdomain dnsmasq[317892]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:08:08 np0005541913.localdomain dnsmasq-dhcp[317892]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:08 np0005541913.localdomain dnsmasq-dhcp[317892]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:08.564 263406 INFO neutron.agent.dhcp.agent [-] Finished network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 dhcp configuration
Dec 02 10:08:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:08.565 263406 INFO neutron.agent.dhcp.agent [None req-df0a516e-2ba9-46d1-bdd1-2505dc3dca33 - - - - - -] Synchronizing state complete
Dec 02 10:08:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:08.633 263406 INFO neutron.agent.dhcp.agent [None req-6e7af5c8-b9eb-4254-a30a-6043ff159257 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '679c782c-2b17-40db-9e68-0b3c95332c3f', '40102fd5-d7e2-4d44-b447-185810364f71'} is completed
Dec 02 10:08:08 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:08.756 2 INFO neutron.agent.securitygroups_rpc [None req-c7a539d4-2f79-4a17-aaa4-1046dc1167cd b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:08:08 np0005541913.localdomain dnsmasq[317892]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:08 np0005541913.localdomain dnsmasq-dhcp[317892]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:08 np0005541913.localdomain dnsmasq-dhcp[317892]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:08 np0005541913.localdomain podman[317910]: 2025-12-02 10:08:08.832119951 +0000 UTC m=+0.046015138 container kill bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:09 np0005541913.localdomain ceph-mon[298296]: osdmap e138: 6 total, 6 up, 6 in
Dec 02 10:08:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e139 e139: 6 total, 6 up, 6 in
Dec 02 10:08:09 np0005541913.localdomain dnsmasq[317892]: exiting on receipt of SIGTERM
Dec 02 10:08:09 np0005541913.localdomain podman[317949]: 2025-12-02 10:08:09.83630152 +0000 UTC m=+0.062352812 container kill bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:09 np0005541913.localdomain systemd[1]: libpod-bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d.scope: Deactivated successfully.
Dec 02 10:08:09 np0005541913.localdomain podman[317962]: 2025-12-02 10:08:09.908860394 +0000 UTC m=+0.058883790 container died bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:08:09 np0005541913.localdomain podman[317962]: 2025-12-02 10:08:09.941066283 +0000 UTC m=+0.091089599 container cleanup bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:08:09 np0005541913.localdomain systemd[1]: libpod-conmon-bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d.scope: Deactivated successfully.
Dec 02 10:08:09 np0005541913.localdomain podman[317964]: 2025-12-02 10:08:09.988480275 +0000 UTC m=+0.130729154 container remove bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:08:10 np0005541913.localdomain kernel: device tap679c782c-2b left promiscuous mode
Dec 02 10:08:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:10Z|00294|binding|INFO|Releasing lport 679c782c-2b17-40db-9e68-0b3c95332c3f from this chassis (sb_readonly=0)
Dec 02 10:08:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:10Z|00295|binding|INFO|Setting lport 679c782c-2b17-40db-9e68-0b3c95332c3f down in Southbound
Dec 02 10:08:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:10.047 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:10.070 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:10 np0005541913.localdomain ceph-mon[298296]: pgmap v291: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 18 KiB/s wr, 178 op/s
Dec 02 10:08:10 np0005541913.localdomain ceph-mon[298296]: osdmap e139: 6 total, 6 up, 6 in
Dec 02 10:08:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-72541e5e7751300a91567c94e6dac3a39f1f6779176b58ffc5f771b9c7663f69-merged.mount: Deactivated successfully.
Dec 02 10:08:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd56c53a3dfe32b2fedd6402a9f79047a076a93090fba003a3dfbaefbb78dd5d-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:10.587 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=679c782c-2b17-40db-9e68-0b3c95332c3f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:10.590 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 679c782c-2b17-40db-9e68-0b3c95332c3f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:10.592 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:10.592 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[407eea56-3701-4e23-8403-c8418eb79aa4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:10Z|00296|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:08:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:10.678 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:10 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:08:10 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:10.872 263406 INFO neutron.agent.dhcp.agent [None req-fa5cad18-1374-4582-bb12-ddaed479de5e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:08:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:10.963 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:11.042 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e140 e140: 6 total, 6 up, 6 in
Dec 02 10:08:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2189887501' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:08:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2189887501' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:08:11 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:08:11 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:11.681 263406 INFO neutron.agent.linux.ip_lib [None req-99fbca4b-9d33-4ffb-9300-cf2db7cf92e6 - - - - - -] Device tap0983994a-c5 cannot be used as it has no MAC address
Dec 02 10:08:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:11.697 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:11 np0005541913.localdomain kernel: device tap0983994a-c5 entered promiscuous mode
Dec 02 10:08:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:11Z|00297|binding|INFO|Claiming lport 0983994a-c588-4cb2-8149-c4fb6ecf83ff for this chassis.
Dec 02 10:08:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:11Z|00298|binding|INFO|0983994a-c588-4cb2-8149-c4fb6ecf83ff: Claiming unknown
Dec 02 10:08:11 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670091.7060] manager: (tap0983994a-c5): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Dec 02 10:08:11 np0005541913.localdomain systemd-udevd[318002]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:08:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:11.706 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:11.716 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0983994a-c588-4cb2-8149-c4fb6ecf83ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:11.717 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0983994a-c588-4cb2-8149-c4fb6ecf83ff in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:11.718 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:11.718 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[da0d8e51-92c0-4efb-84a3-da4e5319150a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0983994a-c5: No such device
Dec 02 10:08:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0983994a-c5: No such device
Dec 02 10:08:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:11Z|00299|binding|INFO|Setting lport 0983994a-c588-4cb2-8149-c4fb6ecf83ff ovn-installed in OVS
Dec 02 10:08:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:11Z|00300|binding|INFO|Setting lport 0983994a-c588-4cb2-8149-c4fb6ecf83ff up in Southbound
Dec 02 10:08:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:11.735 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0983994a-c5: No such device
Dec 02 10:08:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0983994a-c5: No such device
Dec 02 10:08:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0983994a-c5: No such device
Dec 02 10:08:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0983994a-c5: No such device
Dec 02 10:08:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0983994a-c5: No such device
Dec 02 10:08:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0983994a-c5: No such device
Dec 02 10:08:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:11.763 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:11.777 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:12 np0005541913.localdomain ceph-mon[298296]: pgmap v293: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 16 KiB/s wr, 163 op/s
Dec 02 10:08:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "aa7c0661-ed15-4711-8a85-f361d992598b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:08:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "aa7c0661-ed15-4711-8a85-f361d992598b", "format": "json"}]: dispatch
Dec 02 10:08:12 np0005541913.localdomain ceph-mon[298296]: osdmap e140: 6 total, 6 up, 6 in
Dec 02 10:08:12 np0005541913.localdomain podman[318073]: 2025-12-02 10:08:12.540810153 +0000 UTC m=+0.101140767 container create d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 02 10:08:12 np0005541913.localdomain systemd[1]: Started libpod-conmon-d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93.scope.
Dec 02 10:08:12 np0005541913.localdomain podman[318073]: 2025-12-02 10:08:12.491701394 +0000 UTC m=+0.052032028 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:12 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a61df44e86f8fb6336ebbdb7745d6d54fbfba671109717a6826455fb31cff4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:12 np0005541913.localdomain podman[318073]: 2025-12-02 10:08:12.618738709 +0000 UTC m=+0.179069323 container init d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:12 np0005541913.localdomain podman[318073]: 2025-12-02 10:08:12.628319734 +0000 UTC m=+0.188650348 container start d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:12 np0005541913.localdomain dnsmasq[318091]: started, version 2.85 cachesize 150
Dec 02 10:08:12 np0005541913.localdomain dnsmasq[318091]: DNS service limited to local subnets
Dec 02 10:08:12 np0005541913.localdomain dnsmasq[318091]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:12 np0005541913.localdomain dnsmasq[318091]: warning: no upstream servers configured
Dec 02 10:08:12 np0005541913.localdomain dnsmasq[318091]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:12.691 263406 INFO neutron.agent.dhcp.agent [None req-99fbca4b-9d33-4ffb-9300-cf2db7cf92e6 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:11Z, description=, device_id=76289ed5-9a62-4040-9fb0-0eb25e7c4d83, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088d41f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088d4e50>], id=064db39f-7363-4009-9530-71abc8becb56, ip_allocation=immediate, mac_address=fa:16:3e:00:89:dd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['4c7ab332-2efb-4efc-8a25-4f4854ae0d48'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:10Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=False, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1914, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:12Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:08:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:12.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:12.848 263406 INFO neutron.agent.dhcp.agent [None req-2ea3aeb4-156e-46ce-8d73-5dd2543297a7 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:08:12 np0005541913.localdomain dnsmasq[318091]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:08:12 np0005541913.localdomain podman[318111]: 2025-12-02 10:08:12.895102334 +0000 UTC m=+0.066110443 container kill d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:13 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:13.155 263406 INFO neutron.agent.dhcp.agent [None req-41bfd9cc-def8-4ac4-9c8c-85b850dbf444 - - - - - -] DHCP configuration for ports {'064db39f-7363-4009-9530-71abc8becb56'} is completed
Dec 02 10:08:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e141 e141: 6 total, 6 up, 6 in
Dec 02 10:08:13 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:13.785 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:11Z, description=, device_id=76289ed5-9a62-4040-9fb0-0eb25e7c4d83, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a05160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a05730>], id=064db39f-7363-4009-9530-71abc8becb56, ip_allocation=immediate, mac_address=fa:16:3e:00:89:dd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['4c7ab332-2efb-4efc-8a25-4f4854ae0d48'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:10Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=False, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1914, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:12Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:08:13 np0005541913.localdomain dnsmasq[318091]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:08:13 np0005541913.localdomain podman[318148]: 2025-12-02 10:08:13.985115802 +0000 UTC m=+0.058079429 container kill d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:14.119 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:14.231 263406 INFO neutron.agent.dhcp.agent [None req-f8508c3a-5080-4c49-972e-c26eaa40eb87 - - - - - -] DHCP configuration for ports {'064db39f-7363-4009-9530-71abc8becb56'} is completed
Dec 02 10:08:14 np0005541913.localdomain ceph-mon[298296]: pgmap v295: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 216 KiB/s rd, 30 KiB/s wr, 298 op/s
Dec 02 10:08:14 np0005541913.localdomain ceph-mon[298296]: osdmap e141: 6 total, 6 up, 6 in
Dec 02 10:08:14 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e142 e142: 6 total, 6 up, 6 in
Dec 02 10:08:15 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:15.326 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:08:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e143 e143: 6 total, 6 up, 6 in
Dec 02 10:08:15 np0005541913.localdomain ceph-mon[298296]: osdmap e142: 6 total, 6 up, 6 in
Dec 02 10:08:15 np0005541913.localdomain dnsmasq[318091]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:15 np0005541913.localdomain podman[318187]: 2025-12-02 10:08:15.663281103 +0000 UTC m=+0.054933424 container kill d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:08:15 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:15Z|00301|binding|INFO|Releasing lport 0983994a-c588-4cb2-8149-c4fb6ecf83ff from this chassis (sb_readonly=0)
Dec 02 10:08:15 np0005541913.localdomain kernel: device tap0983994a-c5 left promiscuous mode
Dec 02 10:08:15 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:15Z|00302|binding|INFO|Setting lport 0983994a-c588-4cb2-8149-c4fb6ecf83ff down in Southbound
Dec 02 10:08:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:15.773 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:15.782 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0983994a-c588-4cb2-8149-c4fb6ecf83ff) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:15.783 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0983994a-c588-4cb2-8149-c4fb6ecf83ff in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:15.784 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:15.785 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[aab57ad9-17e2-4f5e-8a26-dea8614063df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:15.794 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:15.966 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:16.043 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.107 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.113 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c114c25-3520-46f7-b0f6-c4effd1a917e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.109122', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1c055fe-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': '872ee421c2948b6612cc7b9b5d76f1f6c5e4c692cbe85f4a6bc627ed6ac1fb52'}]}, 'timestamp': '2025-12-02 10:08:16.114531', '_unique_id': '007290f3619740349886423117b439d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.116 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.117 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f0e3a9c-ade7-4e48-b6c8-256a2963388d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.117726', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1c0e848-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': 'b7cb9879149761e7c6f6dbd83c09e9effc27106f17484e888836a083855a080b'}]}, 'timestamp': '2025-12-02 10:08:16.118212', '_unique_id': '76820e30731f4c1eb07cadf6229b832e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.119 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.120 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa2c9151-665b-4587-9ffe-e16525a255c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.120368', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1c14ed2-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': '0ed12734316b4e54f4d9c391770ec703bf3dce185c1f57cbaeaba2a87ccc2edb'}]}, 'timestamp': '2025-12-02 10:08:16.120899', '_unique_id': 'b4540c0401e34bcca7ec5292a3a2b1c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.154 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.155 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2df317e6-82e6-40b4-9bd4-e2a163969967', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.123119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1c686ea-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': '67c9b77c9ab465829069f7d9b7e9f95bf5406e0565727c63b7801fdc7df626fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.123119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1c69f54-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': '3dd082df1936f022ed46d4b84eb6a55c0553560695478d2fc6919990d2d5a4b9'}]}, 'timestamp': '2025-12-02 10:08:16.155685', '_unique_id': '16e73cc7923b4e5a961bc02a741f66aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.157 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.158 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20e0cea6-2297-4670-b4c4-d9e04c9f447b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.158565', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1c72726-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': '67f28ccbd27fd18f6957026e1e095b0c2bd3dc8f1cc45e3025d4336c1e98c6ea'}]}, 'timestamp': '2025-12-02 10:08:16.159146', '_unique_id': '290bdc9cf71740fc9ee188ca0b53853a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.160 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.161 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.161 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe210fcf-3096-4251-a44e-88c72c875125', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.161878', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1c7a48a-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': '6a44e9f50cdf9b198dcd1c2b07ce991af1f15db1d3dcd7ed64cdce0a94dfdcac'}]}, 'timestamp': '2025-12-02 10:08:16.162352', '_unique_id': 'bd5b37175ec949ca854f4a5dd64623b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.163 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.165 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.181 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 17710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e144 e144: 6 total, 6 up, 6 in
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e21cca1a-cb0b-441a-841a-cc5ddd3d867a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17710000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:08:16.165137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd1caa946-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.400558434, 'message_signature': 'e778b8ee6248a992f03dee92266666be4e6f60b8988420fd134617bca04c167a'}]}, 'timestamp': '2025-12-02 10:08:16.182129', '_unique_id': 'b39b438da43f43bab93783ecdff508b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.185 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.186 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7df3eb41-37ab-49d6-a4bb-644de48f5ca1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.185225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1cb3d02-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': 'c995f65fdcc63b6d54aefe4c3b7d130a16e6a2d80f4634fa061ebcb981297231'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.185225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1cb5594-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': '191929b07223547304475914de949a25c30b3a63bf6fde7e708c2e4745fb9c93'}]}, 'timestamp': '2025-12-02 10:08:16.186840', '_unique_id': '48922079dd1e4a1493aadb422722800b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.189 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.191 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b040727-d664-4ee3-ac64-a9a5a7aacd8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.189916', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1cbec20-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': '8b56d80ad1b95a38f4c05536807e4a2798c954ed40aadb722d1fc441028e6ffe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.189916', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1cc4f94-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': 'c0242899bf2a5c33fcacc60331aa9efccd71b0b0aac0da0fe1ef967462c94d51'}]}, 'timestamp': '2025-12-02 10:08:16.192970', '_unique_id': '3df8292ff10340baac6630a4859bba70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.194 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbb0bf6e-33c4-4749-81a4-316de4f0b851', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.196082', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1ccdec8-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': '392a5837b917a4873fbea494ccdd60d9de6a68f183f1e8890ad228df46fb73b5'}]}, 'timestamp': '2025-12-02 10:08:16.196638', '_unique_id': 'eae9957e237c47bbb05c27881d892c0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.197 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6a3ca0f-8619-49f2-a22b-4515ad9917f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.199424', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1cd6b2c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': 'bcedd6c17bc08c1b8b97c02b7793329fe810a85311414ea605712807b2975357'}]}, 'timestamp': '2025-12-02 10:08:16.200249', '_unique_id': '40ae01ef305b40508500ee0a49030d87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.202 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24944876-9702-45f6-9108-01f791839e13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.203270', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1cdf57e-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': 'ec57849337a1c3996482f5766e2516a34f8c31363970d4e5276057e4cf40511a'}]}, 'timestamp': '2025-12-02 10:08:16.203811', '_unique_id': '1d2cc18b75784ff8821e46187b504b2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.224 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e422db3-0416-41a7-8f00-168387fdcf57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.206396', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1d12398-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.425471808, 'message_signature': '36d2e17ac303e4a51fd19e2177c6fbd0a5f57172ce5b6f159db0ea54f20f2213'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.206396', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1d13f2c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.425471808, 'message_signature': '158c7ef70c430283f9f7a65d59fccc0438f8d67068c092237a0684050eeb48dc'}]}, 'timestamp': '2025-12-02 10:08:16.225302', '_unique_id': '60ad47422ba34cab9c435fabd71b59a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.228 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.229 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.229 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e78e534-fd57-4dcf-8833-de1025211a66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.228975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1d1e60c-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.425471808, 'message_signature': '7e8d1b22c58a78b51ee0c3b23baf8cb48633c9d89588262c071bbc3a76c0c720'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.228975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1d1fcbe-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.425471808, 'message_signature': '7e8acb9b0a5b92ce670a87f3cf8d256ea4fd8c5ca5556cf4851afe0634e5bed3'}]}, 'timestamp': '2025-12-02 10:08:16.230220', '_unique_id': '6f7a932c30f14f06babde2ab7702c3ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.231 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.233 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.233 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '464f39e1-bcd7-4dc4-a05c-f430c3f3f5ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.233372', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1d29278-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': '7b8907a5473ecd1f4d1f3ebe0acf31fa9cc67883b7a9bf97e9f0d8d46441cde2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.233372', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1d2a754-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': 'e0620e33ff35dd0a768a3fa4525e917a772018f4f091105ff44bdf3db79b8ff8'}]}, 'timestamp': '2025-12-02 10:08:16.234503', '_unique_id': '5b71ff2d78ee4624bdf0399f1c0b5666'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.235 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.237 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.237 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.238 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ce9e7ea-1dc0-489e-a4e1-1b9711576d6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.237913', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1d34268-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': 'd8136e16be7c7795d4ca12a1f51b69032027d0cd8a43537bea53bdb98d6cf033'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.237913', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1d358a2-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': 'd211890d58976032f9d99252c6fadb2531ea05899109b7561e2c8862665b5db6'}]}, 'timestamp': '2025-12-02 10:08:16.239044', '_unique_id': '774b2e2c3dc64369a6d01858efad755c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.240 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.242 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.242 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b00f5ad1-0c82-4a0d-8e45-3b340b8b79b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:08:16.242865', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd1d403f6-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.400558434, 'message_signature': '5cde8290fa6d9cf61111f9ed7aeb62380933ecf0cf8d5a31242cf588863ded6f'}]}, 'timestamp': '2025-12-02 10:08:16.243445', '_unique_id': 'f9673f0c2ad140a7a74e918ce31d75e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.244 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.248 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.248 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '883d0d1c-df24-47da-bdaf-7a596b483244', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.248157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1d4d236-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': '5f02efe489e75a2cecc725f179b6e7c1f27fa41d0b6b9f3b492c322c9a8b0bb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.248157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1d4ea50-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.342204059, 'message_signature': 'cf10c244920a8b7baa4e3e8fb30bc5373745879f4f8dbe1e340066cc51e37c4e'}]}, 'timestamp': '2025-12-02 10:08:16.249326', '_unique_id': '380325129a2e413dae5c94c3981a4106'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.250 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.252 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a116e725-e23c-4d4d-a670-025a3e833059', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.252278', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1d57358-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': 'd20a39640889707bc37c6a0e3acd8207efd36d37958431748aba247713b68d15'}]}, 'timestamp': '2025-12-02 10:08:16.252889', '_unique_id': 'f4df1069a99b40bfaf21349aa93263e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.253 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.255 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce72a5fc-f57e-4c89-a288-40d4a432b938', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:08:16.255916', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'd1d60692-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.328207586, 'message_signature': 'a55ce317a367eec948f024cfdfa552fabd7d09ce1a986b26e51fef5fe7cfaa0a'}]}, 'timestamp': '2025-12-02 10:08:16.256824', '_unique_id': 'd4b82e05dfbb4aecb7126c48221f3fd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.258 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.260 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.261 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.262 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6962c73d-d25b-49f0-8c9a-9e4c4c8e31b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:08:16.261265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd1d6d8ec-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.425471808, 'message_signature': 'b4d9c814480a174896a0a931b89cf4f0b6ab7e4635fe8b6aa9d4ce57645c24a5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:08:16.261265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd1d6f642-cf66-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12258.425471808, 'message_signature': '0d089381456f866af7a6bfdeab4c436e296816e869d7ea645389c8428327ebb4'}]}, 'timestamp': '2025-12-02 10:08:16.262876', '_unique_id': '9ce931a33531495d82023e4e9e9a5c92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:08:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:08:16.264 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:08:16 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:08:16 np0005541913.localdomain dnsmasq[318091]: exiting on receipt of SIGTERM
Dec 02 10:08:16 np0005541913.localdomain podman[318226]: 2025-12-02 10:08:16.374680462 +0000 UTC m=+0.052673085 container kill d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:08:16 np0005541913.localdomain systemd[1]: libpod-d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93.scope: Deactivated successfully.
Dec 02 10:08:16 np0005541913.localdomain systemd[1]: tmp-crun.mw3JIn.mount: Deactivated successfully.
Dec 02 10:08:16 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "aa7c0661-ed15-4711-8a85-f361d992598b", "snap_name": "c562bc6d-afee-44f1-9f1b-5b7fe43288c6", "format": "json"}]: dispatch
Dec 02 10:08:16 np0005541913.localdomain ceph-mon[298296]: pgmap v298: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 102 KiB/s rd, 14 KiB/s wr, 141 op/s
Dec 02 10:08:16 np0005541913.localdomain ceph-mon[298296]: osdmap e143: 6 total, 6 up, 6 in
Dec 02 10:08:16 np0005541913.localdomain ceph-mon[298296]: osdmap e144: 6 total, 6 up, 6 in
Dec 02 10:08:16 np0005541913.localdomain podman[318238]: 2025-12-02 10:08:16.457416937 +0000 UTC m=+0.094153960 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 02 10:08:16 np0005541913.localdomain podman[318238]: 2025-12-02 10:08:16.471993186 +0000 UTC m=+0.108730139 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:08:16 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:08:16 np0005541913.localdomain podman[318245]: 2025-12-02 10:08:16.538724724 +0000 UTC m=+0.153672716 container died d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:16 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:16Z|00303|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:08:16 np0005541913.localdomain podman[318245]: 2025-12-02 10:08:16.573989423 +0000 UTC m=+0.188937365 container cleanup d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:16 np0005541913.localdomain systemd[1]: libpod-conmon-d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93.scope: Deactivated successfully.
Dec 02 10:08:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:16.603 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:16 np0005541913.localdomain podman[318249]: 2025-12-02 10:08:16.611931425 +0000 UTC m=+0.220711223 container remove d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7a61df44e86f8fb6336ebbdb7745d6d54fbfba671109717a6826455fb31cff4a-merged.mount: Deactivated successfully.
Dec 02 10:08:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d112284dad311a13fa2f730fc2c063b3f922917194c190cb70e5e683f47b5b93-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:16 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:16.878 263406 INFO neutron.agent.dhcp.agent [None req-cfe58590-dfd2-4f69-b66e-536d1e688d90 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:08:16 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:08:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e145 e145: 6 total, 6 up, 6 in
Dec 02 10:08:18 np0005541913.localdomain ceph-mon[298296]: pgmap v301: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail
Dec 02 10:08:18 np0005541913.localdomain ceph-mon[298296]: osdmap e145: 6 total, 6 up, 6 in
Dec 02 10:08:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:18.687 263406 INFO neutron.agent.linux.ip_lib [None req-78f3ff14-c88f-46c0-bea2-bcb4c2dabfb5 - - - - - -] Device tap49697408-c0 cannot be used as it has no MAC address
Dec 02 10:08:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:18.735 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:18 np0005541913.localdomain kernel: device tap49697408-c0 entered promiscuous mode
Dec 02 10:08:18 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670098.7439] manager: (tap49697408-c0): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Dec 02 10:08:18 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:18Z|00304|binding|INFO|Claiming lport 49697408-c01c-4e89-b56b-aa2bd5d6b93d for this chassis.
Dec 02 10:08:18 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:18Z|00305|binding|INFO|49697408-c01c-4e89-b56b-aa2bd5d6b93d: Claiming unknown
Dec 02 10:08:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:18.747 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:18 np0005541913.localdomain systemd-udevd[318297]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.753 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee9:26fd/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=49697408-c01c-4e89-b56b-aa2bd5d6b93d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.755 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 49697408-c01c-4e89-b56b-aa2bd5d6b93d in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.757 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port c4648b8c-9385-4d50-be21-eac02960451b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.757 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.759 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a5705038-2821-4c11-8cfc-b3ec0055d50f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:18 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap49697408-c0: No such device
Dec 02 10:08:18 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:18Z|00306|binding|INFO|Setting lport 49697408-c01c-4e89-b56b-aa2bd5d6b93d ovn-installed in OVS
Dec 02 10:08:18 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:18Z|00307|binding|INFO|Setting lport 49697408-c01c-4e89-b56b-aa2bd5d6b93d up in Southbound
Dec 02 10:08:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:18.789 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:18 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap49697408-c0: No such device
Dec 02 10:08:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:18.794 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:18 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap49697408-c0: No such device
Dec 02 10:08:18 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap49697408-c0: No such device
Dec 02 10:08:18 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap49697408-c0: No such device
Dec 02 10:08:18 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap49697408-c0: No such device
Dec 02 10:08:18 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap49697408-c0: No such device
Dec 02 10:08:18 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap49697408-c0: No such device
Dec 02 10:08:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:18.832 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:18.871 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.903 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2 2001:db8::f816:3eff:fee6:1993'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 2001:db8::f816:3eff:fee6:1993'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.905 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.906 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port c4648b8c-9385-4d50-be21-eac02960451b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.906 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:18 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:18.907 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[69020618-7b52-4ebf-8edf-bf6396a7b88b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:19 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:19.296 2 INFO neutron.agent.securitygroups_rpc [None req-2bfb9ebd-1846-44bd-b2e4-1f309ec769c2 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:08:19 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:19.335 2 INFO neutron.agent.securitygroups_rpc [None req-a38d4309-d6ec-4127-b224-040aeb412100 b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:08:19 np0005541913.localdomain podman[318369]: 
Dec 02 10:08:19 np0005541913.localdomain podman[318369]: 2025-12-02 10:08:19.732824574 +0000 UTC m=+0.093363539 container create 9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:08:19 np0005541913.localdomain systemd[1]: Started libpod-conmon-9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d.scope.
Dec 02 10:08:19 np0005541913.localdomain podman[318369]: 2025-12-02 10:08:19.688179884 +0000 UTC m=+0.048718879 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:19 np0005541913.localdomain systemd[1]: tmp-crun.lNYFfz.mount: Deactivated successfully.
Dec 02 10:08:19 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:19 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e491d8d4a71f860433ad283ec12af5d3298ac14eb5025a594468033bcfee7a83/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:19 np0005541913.localdomain podman[318369]: 2025-12-02 10:08:19.810383981 +0000 UTC m=+0.170922936 container init 9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:19 np0005541913.localdomain podman[318369]: 2025-12-02 10:08:19.823662654 +0000 UTC m=+0.184201659 container start 9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:08:19 np0005541913.localdomain dnsmasq[318387]: started, version 2.85 cachesize 150
Dec 02 10:08:19 np0005541913.localdomain dnsmasq[318387]: DNS service limited to local subnets
Dec 02 10:08:19 np0005541913.localdomain dnsmasq[318387]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:19 np0005541913.localdomain dnsmasq[318387]: warning: no upstream servers configured
Dec 02 10:08:19 np0005541913.localdomain dnsmasq[318387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:20 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:20.029 263406 INFO neutron.agent.dhcp.agent [None req-e4596e98-5517-4660-9df8-2ac1df4897cf - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:08:20 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:20.054 2 INFO neutron.agent.securitygroups_rpc [None req-d321651e-4716-4e9e-b955-449cf71fa8bf 11daa5bc8801433f99b71663879a8016 62771fbe049e4d57aae1b3554ed3a36c - - default default] Security group member updated ['e79580ca-0f44-4e36-92d0-a0d65fb01c6b']
Dec 02 10:08:20 np0005541913.localdomain dnsmasq[318387]: exiting on receipt of SIGTERM
Dec 02 10:08:20 np0005541913.localdomain podman[318406]: 2025-12-02 10:08:20.127995225 +0000 UTC m=+0.059653651 container kill 9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:08:20 np0005541913.localdomain systemd[1]: libpod-9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d.scope: Deactivated successfully.
Dec 02 10:08:20 np0005541913.localdomain podman[318419]: 2025-12-02 10:08:20.211657415 +0000 UTC m=+0.071519168 container died 9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:08:20 np0005541913.localdomain podman[318419]: 2025-12-02 10:08:20.336407519 +0000 UTC m=+0.196269232 container cleanup 9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:20 np0005541913.localdomain systemd[1]: libpod-conmon-9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d.scope: Deactivated successfully.
Dec 02 10:08:20 np0005541913.localdomain podman[318426]: 2025-12-02 10:08:20.363811199 +0000 UTC m=+0.206811933 container remove 9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:08:20 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:20.426 2 INFO neutron.agent.securitygroups_rpc [None req-d321651e-4716-4e9e-b955-449cf71fa8bf 11daa5bc8801433f99b71663879a8016 62771fbe049e4d57aae1b3554ed3a36c - - default default] Security group member updated ['e79580ca-0f44-4e36-92d0-a0d65fb01c6b']
Dec 02 10:08:20 np0005541913.localdomain ceph-mon[298296]: pgmap v303: 177 pgs: 177 active+clean; 145 MiB data, 769 MiB used, 41 GiB / 42 GiB avail; 183 KiB/s rd, 21 KiB/s wr, 256 op/s
Dec 02 10:08:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:20.543 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:20 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:20.646 2 INFO neutron.agent.securitygroups_rpc [None req-87212674-2d83-471b-8535-396909b240c7 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:08:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e491d8d4a71f860433ad283ec12af5d3298ac14eb5025a594468033bcfee7a83-merged.mount: Deactivated successfully.
Dec 02 10:08:20 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e721da8ab0c62dd724218239feaad4b5755f53508e7fc0680508b8f2c2e4a6d-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:20 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:08:20 np0005541913.localdomain podman[318448]: 2025-12-02 10:08:20.849991986 +0000 UTC m=+0.081304198 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:20 np0005541913.localdomain podman[318448]: 2025-12-02 10:08:20.861137783 +0000 UTC m=+0.092450035 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 10:08:20 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:08:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:20.970 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:21.045 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e146 e146: 6 total, 6 up, 6 in
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.240178) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670101240254, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 910, "num_deletes": 258, "total_data_size": 1858186, "memory_usage": 1876584, "flush_reason": "Manual Compaction"}
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670101250183, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 1224963, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24600, "largest_seqno": 25505, "table_properties": {"data_size": 1220813, "index_size": 1877, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10165, "raw_average_key_size": 21, "raw_value_size": 1212225, "raw_average_value_size": 2509, "num_data_blocks": 82, "num_entries": 483, "num_filter_entries": 483, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670061, "oldest_key_time": 1764670061, "file_creation_time": 1764670101, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 10057 microseconds, and 4133 cpu microseconds.
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.250238) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 1224963 bytes OK
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.250265) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.251993) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.252014) EVENT_LOG_v1 {"time_micros": 1764670101252007, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.252037) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 1853370, prev total WAL file size 1853370, number of live WAL files 2.
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.252707) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(1196KB)], [39(17MB)]
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670101252758, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 19153783, "oldest_snapshot_seqno": -1}
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12629 keys, 17217424 bytes, temperature: kUnknown
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670101336648, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 17217424, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17143550, "index_size": 41197, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31621, "raw_key_size": 339304, "raw_average_key_size": 26, "raw_value_size": 16926449, "raw_average_value_size": 1340, "num_data_blocks": 1564, "num_entries": 12629, "num_filter_entries": 12629, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670101, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.336930) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 17217424 bytes
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.339264) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 228.1 rd, 205.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 17.1 +0.0 blob) out(16.4 +0.0 blob), read-write-amplify(29.7) write-amplify(14.1) OK, records in: 13161, records dropped: 532 output_compression: NoCompression
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.339293) EVENT_LOG_v1 {"time_micros": 1764670101339280, "job": 22, "event": "compaction_finished", "compaction_time_micros": 83979, "compaction_time_cpu_micros": 39139, "output_level": 6, "num_output_files": 1, "total_output_size": 17217424, "num_input_records": 13161, "num_output_records": 12629, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670101339584, "job": 22, "event": "table_file_deletion", "file_number": 41}
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670101342315, "job": 22, "event": "table_file_deletion", "file_number": 39}
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.252596) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.342418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.342428) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.342432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.342435) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:21 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:21.342438) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:21 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:21.764 2 INFO neutron.agent.securitygroups_rpc [None req-7754a6d4-074d-4d03-86b1-db3804b94ab5 11daa5bc8801433f99b71663879a8016 62771fbe049e4d57aae1b3554ed3a36c - - default default] Security group member updated ['e79580ca-0f44-4e36-92d0-a0d65fb01c6b']
Dec 02 10:08:21 np0005541913.localdomain podman[318518]: 
Dec 02 10:08:21 np0005541913.localdomain podman[318518]: 2025-12-02 10:08:21.849566842 +0000 UTC m=+0.097284292 container create 876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:08:21 np0005541913.localdomain systemd[1]: Started libpod-conmon-876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697.scope.
Dec 02 10:08:21 np0005541913.localdomain podman[318518]: 2025-12-02 10:08:21.797216168 +0000 UTC m=+0.044933658 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:21 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:21 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7994a80ca3ae2362c636511214aa588ff04075f3d1ffa63ed03ba32d12f47496/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:21 np0005541913.localdomain podman[318518]: 2025-12-02 10:08:21.92075241 +0000 UTC m=+0.168469860 container init 876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 10:08:21 np0005541913.localdomain podman[318518]: 2025-12-02 10:08:21.927448668 +0000 UTC m=+0.175166138 container start 876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:08:21 np0005541913.localdomain dnsmasq[318536]: started, version 2.85 cachesize 150
Dec 02 10:08:21 np0005541913.localdomain dnsmasq[318536]: DNS service limited to local subnets
Dec 02 10:08:21 np0005541913.localdomain dnsmasq[318536]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:21 np0005541913.localdomain dnsmasq[318536]: warning: no upstream servers configured
Dec 02 10:08:21 np0005541913.localdomain dnsmasq-dhcp[318536]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:08:21 np0005541913.localdomain dnsmasq[318536]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:08:21 np0005541913.localdomain dnsmasq-dhcp[318536]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:21 np0005541913.localdomain dnsmasq-dhcp[318536]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:22.149 263406 INFO neutron.agent.dhcp.agent [None req-ff1c3e3b-9900-498b-9ff3-5616c8b82e26 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '4309bb90-9cb8-4e1a-9a0b-f5834db52038', '49697408-c01c-4e89-b56b-aa2bd5d6b93d'} is completed
Dec 02 10:08:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "aa7c0661-ed15-4711-8a85-f361d992598b", "snap_name": "c562bc6d-afee-44f1-9f1b-5b7fe43288c6_d6e2b43c-7ed8-4069-8208-0ab8116bd864", "force": true, "format": "json"}]: dispatch
Dec 02 10:08:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "aa7c0661-ed15-4711-8a85-f361d992598b", "snap_name": "c562bc6d-afee-44f1-9f1b-5b7fe43288c6", "force": true, "format": "json"}]: dispatch
Dec 02 10:08:22 np0005541913.localdomain ceph-mon[298296]: pgmap v304: 177 pgs: 177 active+clean; 145 MiB data, 769 MiB used, 41 GiB / 42 GiB avail; 143 KiB/s rd, 16 KiB/s wr, 199 op/s
Dec 02 10:08:22 np0005541913.localdomain ceph-mon[298296]: osdmap e146: 6 total, 6 up, 6 in
Dec 02 10:08:22 np0005541913.localdomain dnsmasq[318536]: exiting on receipt of SIGTERM
Dec 02 10:08:22 np0005541913.localdomain podman[318554]: 2025-12-02 10:08:22.250475547 +0000 UTC m=+0.066795131 container kill 876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: libpod-876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697.scope: Deactivated successfully.
Dec 02 10:08:22 np0005541913.localdomain podman[318569]: 2025-12-02 10:08:22.304893377 +0000 UTC m=+0.041628901 container died 876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:22 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:22.308 2 INFO neutron.agent.securitygroups_rpc [None req-a959ed63-fb01-427b-9973-6a88ead4c1cf 11daa5bc8801433f99b71663879a8016 62771fbe049e4d57aae1b3554ed3a36c - - default default] Security group member updated ['e79580ca-0f44-4e36-92d0-a0d65fb01c6b']
Dec 02 10:08:22 np0005541913.localdomain podman[318569]: 2025-12-02 10:08:22.333072028 +0000 UTC m=+0.069807512 container cleanup 876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: libpod-conmon-876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697.scope: Deactivated successfully.
Dec 02 10:08:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:22.352 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:08:22 np0005541913.localdomain podman[318571]: 2025-12-02 10:08:22.395398018 +0000 UTC m=+0.124040726 container remove 876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:08:22 np0005541913.localdomain podman[318572]: 2025-12-02 10:08:22.377681347 +0000 UTC m=+0.109283654 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, version=9.6, managed_by=edpm_ansible, config_id=edpm, com.redhat.component=ubi9-minimal-container)
Dec 02 10:08:22 np0005541913.localdomain podman[318572]: 2025-12-02 10:08:22.46301074 +0000 UTC m=+0.194613057 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:08:22 np0005541913.localdomain podman[318615]: 2025-12-02 10:08:22.516399024 +0000 UTC m=+0.110153377 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:08:22 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e147 e147: 6 total, 6 up, 6 in
Dec 02 10:08:22 np0005541913.localdomain podman[318615]: 2025-12-02 10:08:22.55193999 +0000 UTC m=+0.145694323 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: tmp-crun.KcUnsg.mount: Deactivated successfully.
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7994a80ca3ae2362c636511214aa588ff04075f3d1ffa63ed03ba32d12f47496-merged.mount: Deactivated successfully.
Dec 02 10:08:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-876af8e678cf4d2a9e6f22ac20fc13cbbf440d5ca9116ca1735556ffd1696697-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:23 np0005541913.localdomain podman[318692]: 
Dec 02 10:08:23 np0005541913.localdomain sudo[318700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:08:23 np0005541913.localdomain sudo[318700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:08:23 np0005541913.localdomain sudo[318700]: pam_unix(sudo:session): session closed for user root
Dec 02 10:08:23 np0005541913.localdomain podman[318692]: 2025-12-02 10:08:23.14330923 +0000 UTC m=+0.087797191 container create c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:08:23 np0005541913.localdomain systemd[1]: Started libpod-conmon-c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9.scope.
Dec 02 10:08:23 np0005541913.localdomain podman[318692]: 2025-12-02 10:08:23.097194461 +0000 UTC m=+0.041682472 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:23 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:23 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dfb9edd785f2c80f522c9fd9120eb8211cfcba13f635ce18efdb23634020145c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:23 np0005541913.localdomain sudo[318723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:08:23 np0005541913.localdomain podman[318692]: 2025-12-02 10:08:23.212040162 +0000 UTC m=+0.156528123 container init c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:08:23 np0005541913.localdomain sudo[318723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:08:23 np0005541913.localdomain podman[318692]: 2025-12-02 10:08:23.222897541 +0000 UTC m=+0.167385492 container start c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:08:23 np0005541913.localdomain dnsmasq[318746]: started, version 2.85 cachesize 150
Dec 02 10:08:23 np0005541913.localdomain dnsmasq[318746]: DNS service limited to local subnets
Dec 02 10:08:23 np0005541913.localdomain dnsmasq[318746]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:23 np0005541913.localdomain dnsmasq[318746]: warning: no upstream servers configured
Dec 02 10:08:23 np0005541913.localdomain dnsmasq[318746]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:23 np0005541913.localdomain ceph-mon[298296]: osdmap e147: 6 total, 6 up, 6 in
Dec 02 10:08:23 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e148 e148: 6 total, 6 up, 6 in
Dec 02 10:08:23 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:23.591 263406 INFO neutron.agent.dhcp.agent [None req-0f16e2d2-ee30-44d4-b251-3be52e53bca7 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '49697408-c01c-4e89-b56b-aa2bd5d6b93d'} is completed
Dec 02 10:08:23 np0005541913.localdomain podman[318764]: 2025-12-02 10:08:23.623185858 +0000 UTC m=+0.083410424 container kill c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:08:23 np0005541913.localdomain dnsmasq[318746]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:23 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:23.766 263406 INFO neutron.agent.dhcp.agent [None req-16ccd695-fef9-4c33-b634-fbb1bbce8395 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:19Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089b8ca0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f99089b8850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089b8970>, <neutron.agent.linux.dhcp.DictModel object at 0x7f99089b8b20>], id=4309bb90-9cb8-4e1a-9a0b-f5834db52038, ip_allocation=immediate, mac_address=fa:16:3e:a2:96:1d, name=tempest-NetworksTestDHCPv6-491044706, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['268258f7-a345-48a6-ab25-cc16b1ab921c', 'e054bad1-d3f9-4896-960d-8ef0f5ded92b'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:18Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], 
standard_attr_id=1938, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:19Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:08:23 np0005541913.localdomain sudo[318723]: pam_unix(sudo:session): session closed for user root
Dec 02 10:08:23 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:23.915 263406 INFO neutron.agent.dhcp.agent [None req-416f354e-2179-41a6-818b-00d3a0e5bdc0 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '49697408-c01c-4e89-b56b-aa2bd5d6b93d'} is completed
Dec 02 10:08:23 np0005541913.localdomain dnsmasq[318746]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:08:23 np0005541913.localdomain podman[318831]: 2025-12-02 10:08:23.935126481 +0000 UTC m=+0.045634897 container kill c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:24 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:24.139 263406 INFO neutron.agent.dhcp.agent [None req-056135d1-cd40-4841-a70a-3a55c6f15857 - - - - - -] DHCP configuration for ports {'4309bb90-9cb8-4e1a-9a0b-f5834db52038'} is completed
Dec 02 10:08:24 np0005541913.localdomain sudo[318870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:08:24 np0005541913.localdomain sudo[318870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:08:24 np0005541913.localdomain sudo[318870]: pam_unix(sudo:session): session closed for user root
Dec 02 10:08:24 np0005541913.localdomain dnsmasq[318746]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:24 np0005541913.localdomain podman[318869]: 2025-12-02 10:08:24.266219244 +0000 UTC m=+0.060348359 container kill c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:08:24 np0005541913.localdomain ceph-mon[298296]: pgmap v307: 177 pgs: 177 active+clean; 146 MiB data, 769 MiB used, 41 GiB / 42 GiB avail; 162 KiB/s rd, 25 KiB/s wr, 228 op/s
Dec 02 10:08:24 np0005541913.localdomain ceph-mon[298296]: osdmap e148: 6 total, 6 up, 6 in
Dec 02 10:08:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:08:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:08:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:08:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:08:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e149 e149: 6 total, 6 up, 6 in
Dec 02 10:08:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:24.638 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 2001:db8::f816:3eff:fee6:1993'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2 2001:db8::f816:3eff:fee6:1993'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:24.640 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:08:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:24.643 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port c4648b8c-9385-4d50-be21-eac02960451b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:08:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:24.643 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:24.644 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[762e3d9a-e5ee-46ac-b0ce-bee3713560f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:24.770 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:24 np0005541913.localdomain dnsmasq[318746]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:24 np0005541913.localdomain podman[318924]: 2025-12-02 10:08:24.989748175 +0000 UTC m=+0.050215549 container kill c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:25.305 263406 INFO neutron.agent.dhcp.agent [None req-5bd2df99-404d-4dcc-818e-84ad504aa715 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '49697408-c01c-4e89-b56b-aa2bd5d6b93d'} is completed
Dec 02 10:08:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "aa7c0661-ed15-4711-8a85-f361d992598b", "format": "json"}]: dispatch
Dec 02 10:08:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "aa7c0661-ed15-4711-8a85-f361d992598b", "force": true, "format": "json"}]: dispatch
Dec 02 10:08:25 np0005541913.localdomain ceph-mon[298296]: osdmap e149: 6 total, 6 up, 6 in
Dec 02 10:08:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e150 e150: 6 total, 6 up, 6 in
Dec 02 10:08:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:25.889 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:25 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:25.972 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:26.009 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Triggering sync for uuid b254bb7f-2891-4b37-9c44-9700e301ce16 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 02 10:08:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:26.010 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:08:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:26.010 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:08:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:26.036 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:08:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:26.047 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:26Z|00308|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:08:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:26.231 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:26 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:26.352 2 INFO neutron.agent.securitygroups_rpc [None req-e2093ff2-f702-4cf4-8beb-c324b04696df b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:08:26 np0005541913.localdomain dnsmasq[318746]: exiting on receipt of SIGTERM
Dec 02 10:08:26 np0005541913.localdomain systemd[1]: tmp-crun.mRLVU7.mount: Deactivated successfully.
Dec 02 10:08:26 np0005541913.localdomain podman[318963]: 2025-12-02 10:08:26.35731189 +0000 UTC m=+0.068955229 container kill c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:08:26 np0005541913.localdomain systemd[1]: libpod-c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9.scope: Deactivated successfully.
Dec 02 10:08:26 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:26.395 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:08:26 np0005541913.localdomain podman[318976]: 2025-12-02 10:08:26.439396137 +0000 UTC m=+0.066340459 container died c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:08:26 np0005541913.localdomain podman[318976]: 2025-12-02 10:08:26.468476912 +0000 UTC m=+0.095421194 container cleanup c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:08:26 np0005541913.localdomain systemd[1]: libpod-conmon-c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9.scope: Deactivated successfully.
Dec 02 10:08:26 np0005541913.localdomain podman[318978]: 2025-12-02 10:08:26.523589631 +0000 UTC m=+0.143029503 container remove c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:08:26 np0005541913.localdomain ceph-mon[298296]: pgmap v310: 177 pgs: 177 active+clean; 146 MiB data, 769 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 13 KiB/s wr, 42 op/s
Dec 02 10:08:26 np0005541913.localdomain ceph-mon[298296]: osdmap e150: 6 total, 6 up, 6 in
Dec 02 10:08:27 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:27.076 2 INFO neutron.agent.securitygroups_rpc [None req-95108d82-ee5d-48ad-b799-8f24c524b687 378bbf1156ab482eae3359fa477651da 13c70d8f74354389b175376619620536 - - default default] Security group member updated ['20308e6b-d2a0-4e90-a058-a0e30da512e9']
Dec 02 10:08:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-dfb9edd785f2c80f522c9fd9120eb8211cfcba13f635ce18efdb23634020145c-merged.mount: Deactivated successfully.
Dec 02 10:08:27 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0e86867b5c789de145da941b9b246fef8ccb5f770b4dbd26db4f1b6d38be5e9-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:27.538 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2 2001:db8::f816:3eff:fee6:1993'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 2001:db8::f816:3eff:fee6:1993'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:27.540 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:08:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:27.542 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port c4648b8c-9385-4d50-be21-eac02960451b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:08:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:27.543 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:27.544 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[1cfecb4f-4508-498c-b10d-0adb4a3c93a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:08:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:08:28 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:28Z|00309|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:08:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:28.475 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9dcfd3db-4efe-40f2-a49d-5d58c6cc71e6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:08:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9dcfd3db-4efe-40f2-a49d-5d58c6cc71e6", "format": "json"}]: dispatch
Dec 02 10:08:28 np0005541913.localdomain ceph-mon[298296]: pgmap v312: 177 pgs: 177 active+clean; 146 MiB data, 769 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 12 KiB/s wr, 37 op/s
Dec 02 10:08:28 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3666129381' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:08:28 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3666129381' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:08:29 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:29.180 2 INFO neutron.agent.securitygroups_rpc [None req-0c3b85c4-8ee4-4ede-a5c5-9e006eeb1903 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:08:29 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:29.361 2 INFO neutron.agent.securitygroups_rpc [None req-29a3155c-2369-431f-9930-578d28142354 378bbf1156ab482eae3359fa477651da 13c70d8f74354389b175376619620536 - - default default] Security group member updated ['20308e6b-d2a0-4e90-a058-a0e30da512e9']
Dec 02 10:08:29 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:29.553 2 INFO neutron.agent.securitygroups_rpc [None req-4ad98beb-2033-4528-bdc9-387b15719003 b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:08:29 np0005541913.localdomain podman[319059]: 
Dec 02 10:08:29 np0005541913.localdomain podman[319059]: 2025-12-02 10:08:29.562263988 +0000 UTC m=+0.071276161 container create 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:08:29 np0005541913.localdomain systemd[1]: Started libpod-conmon-5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e.scope.
Dec 02 10:08:29 np0005541913.localdomain systemd[1]: tmp-crun.rYzOyq.mount: Deactivated successfully.
Dec 02 10:08:29 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:29 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1d0402c7888f1d0cd7e68faad9130068442e8068353c15c6ac51afafeb7addf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:29 np0005541913.localdomain podman[319059]: 2025-12-02 10:08:29.536028938 +0000 UTC m=+0.045041131 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:29 np0005541913.localdomain podman[319059]: 2025-12-02 10:08:29.642725651 +0000 UTC m=+0.151737844 container init 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:08:29 np0005541913.localdomain podman[319059]: 2025-12-02 10:08:29.652635176 +0000 UTC m=+0.161647359 container start 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:29 np0005541913.localdomain dnsmasq[319079]: started, version 2.85 cachesize 150
Dec 02 10:08:29 np0005541913.localdomain dnsmasq[319079]: DNS service limited to local subnets
Dec 02 10:08:29 np0005541913.localdomain dnsmasq[319079]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:29 np0005541913.localdomain dnsmasq[319079]: warning: no upstream servers configured
Dec 02 10:08:29 np0005541913.localdomain dnsmasq-dhcp[319079]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:08:29 np0005541913.localdomain dnsmasq-dhcp[319079]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:08:29 np0005541913.localdomain dnsmasq[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:29 np0005541913.localdomain dnsmasq-dhcp[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:29 np0005541913.localdomain dnsmasq-dhcp[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:29 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:29.729 263406 INFO neutron.agent.dhcp.agent [None req-7b5e6717-cea8-474d-9840-10b7d67b5359 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:28Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a24340>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908a24f40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a24160>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908a24e80>], id=5da4d313-4dc9-4016-98e1-1b58c370bb2e, ip_allocation=immediate, mac_address=fa:16:3e:98:19:1e, name=tempest-NetworksTestDHCPv6-1769082368, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['33e7892a-f1fb-4759-a941-d291759a7b26', '9c67749b-9d67-48ed-9334-d10de6566a63'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:25Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], 
standard_attr_id=1988, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:28Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:08:29 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:29.859 263406 INFO neutron.agent.dhcp.agent [None req-8a7b9314-95d0-4509-b132-7b11625f9741 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '49697408-c01c-4e89-b56b-aa2bd5d6b93d'} is completed
Dec 02 10:08:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "60995fd9-a7c9-4e80-ba2f-4e09200b332e", "format": "json"}]: dispatch
Dec 02 10:08:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "60995fd9-a7c9-4e80-ba2f-4e09200b332e", "force": true, "format": "json"}]: dispatch
Dec 02 10:08:29 np0005541913.localdomain ceph-mon[298296]: pgmap v313: 177 pgs: 177 active+clean; 146 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 17 KiB/s wr, 95 op/s
Dec 02 10:08:29 np0005541913.localdomain podman[319096]: 2025-12-02 10:08:29.95823756 +0000 UTC m=+0.052538001 container kill 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:08:29 np0005541913.localdomain dnsmasq[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:08:29 np0005541913.localdomain dnsmasq-dhcp[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:29 np0005541913.localdomain dnsmasq-dhcp[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:30 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:30.199 263406 INFO neutron.agent.dhcp.agent [None req-99847e37-8f50-4fc8-8084-86ecf3626a83 - - - - - -] DHCP configuration for ports {'5da4d313-4dc9-4016-98e1-1b58c370bb2e'} is completed
Dec 02 10:08:30 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:30.605 2 INFO neutron.agent.securitygroups_rpc [None req-84026af9-2eee-4701-994f-c9f2d1b31806 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:08:30 np0005541913.localdomain dnsmasq[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:30 np0005541913.localdomain dnsmasq-dhcp[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:30 np0005541913.localdomain dnsmasq-dhcp[319079]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:30 np0005541913.localdomain podman[319136]: 2025-12-02 10:08:30.91176704 +0000 UTC m=+0.064431358 container kill 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:08:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:08:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9dcfd3db-4efe-40f2-a49d-5d58c6cc71e6", "snap_name": "9664f206-7cb4-4a1a-b619-2a201c7ebe10", "format": "json"}]: dispatch
Dec 02 10:08:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:30.975 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:31 np0005541913.localdomain podman[319150]: 2025-12-02 10:08:31.024669769 +0000 UTC m=+0.086735722 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 10:08:31 np0005541913.localdomain podman[319150]: 2025-12-02 10:08:31.037914362 +0000 UTC m=+0.099980345 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 02 10:08:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:31.048 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:31 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e151 e151: 6 total, 6 up, 6 in
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.235333) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670111235414, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 470, "num_deletes": 252, "total_data_size": 408037, "memory_usage": 417464, "flush_reason": "Manual Compaction"}
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670111240461, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 267212, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25510, "largest_seqno": 25975, "table_properties": {"data_size": 264600, "index_size": 659, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7423, "raw_average_key_size": 21, "raw_value_size": 259020, "raw_average_value_size": 742, "num_data_blocks": 29, "num_entries": 349, "num_filter_entries": 349, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670101, "oldest_key_time": 1764670101, "file_creation_time": 1764670111, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 5194 microseconds, and 2091 cpu microseconds.
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.240529) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 267212 bytes OK
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.240558) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.243584) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.243639) EVENT_LOG_v1 {"time_micros": 1764670111243602, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.243664) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 405071, prev total WAL file size 405071, number of live WAL files 2.
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.244222) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303037' seq:72057594037927935, type:22 .. '6D6772737461740034323538' seq:0, type:0; will stop at (end)
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(260KB)], [42(16MB)]
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670111244339, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 17484636, "oldest_snapshot_seqno": -1}
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12455 keys, 15368615 bytes, temperature: kUnknown
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670111322786, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 15368615, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15300531, "index_size": 35855, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31173, "raw_key_size": 336025, "raw_average_key_size": 26, "raw_value_size": 15090997, "raw_average_value_size": 1211, "num_data_blocks": 1340, "num_entries": 12455, "num_filter_entries": 12455, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670111, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.323052) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 15368615 bytes
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.325033) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 222.7 rd, 195.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 16.4 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(122.9) write-amplify(57.5) OK, records in: 12978, records dropped: 523 output_compression: NoCompression
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.325052) EVENT_LOG_v1 {"time_micros": 1764670111325044, "job": 24, "event": "compaction_finished", "compaction_time_micros": 78514, "compaction_time_cpu_micros": 30001, "output_level": 6, "num_output_files": 1, "total_output_size": 15368615, "num_input_records": 12978, "num_output_records": 12455, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670111325275, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670111326942, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.244123) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.327051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.327060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.327063) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.327073) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:31 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:08:31.327077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:08:31 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:31.458 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:08:32 np0005541913.localdomain dnsmasq[319079]: exiting on receipt of SIGTERM
Dec 02 10:08:32 np0005541913.localdomain systemd[1]: tmp-crun.qbL2xN.mount: Deactivated successfully.
Dec 02 10:08:32 np0005541913.localdomain systemd[1]: libpod-5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e.scope: Deactivated successfully.
Dec 02 10:08:32 np0005541913.localdomain podman[319195]: 2025-12-02 10:08:32.184296432 +0000 UTC m=+0.041385854 container kill 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:08:32 np0005541913.localdomain ceph-mon[298296]: pgmap v314: 177 pgs: 177 active+clean; 146 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 14 KiB/s wr, 76 op/s
Dec 02 10:08:32 np0005541913.localdomain ceph-mon[298296]: osdmap e151: 6 total, 6 up, 6 in
Dec 02 10:08:32 np0005541913.localdomain podman[319210]: 2025-12-02 10:08:32.232593169 +0000 UTC m=+0.043462060 container died 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:08:32 np0005541913.localdomain podman[319210]: 2025-12-02 10:08:32.256475045 +0000 UTC m=+0.067343896 container cleanup 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:08:32 np0005541913.localdomain systemd[1]: libpod-conmon-5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e.scope: Deactivated successfully.
Dec 02 10:08:32 np0005541913.localdomain podman[319217]: 2025-12-02 10:08:32.304445134 +0000 UTC m=+0.105944584 container remove 5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:33 np0005541913.localdomain podman[319290]: 2025-12-02 10:08:33.030064451 +0000 UTC m=+0.088756946 container create 1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:08:33 np0005541913.localdomain systemd[1]: Started libpod-conmon-1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7.scope.
Dec 02 10:08:33 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:33 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35eb325a33a7c1b7be7aec223582707d0502075bf5da3d993e966b3ab11f7c3b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:33 np0005541913.localdomain podman[319290]: 2025-12-02 10:08:33.090219014 +0000 UTC m=+0.148911509 container init 1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:33 np0005541913.localdomain podman[319290]: 2025-12-02 10:08:32.993941799 +0000 UTC m=+0.052634354 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:33 np0005541913.localdomain podman[319290]: 2025-12-02 10:08:33.099564053 +0000 UTC m=+0.158256548 container start 1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:33 np0005541913.localdomain dnsmasq[319309]: started, version 2.85 cachesize 150
Dec 02 10:08:33 np0005541913.localdomain dnsmasq[319309]: DNS service limited to local subnets
Dec 02 10:08:33 np0005541913.localdomain dnsmasq[319309]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:33 np0005541913.localdomain dnsmasq[319309]: warning: no upstream servers configured
Dec 02 10:08:33 np0005541913.localdomain dnsmasq-dhcp[319309]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:08:33 np0005541913.localdomain dnsmasq[319309]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:33 np0005541913.localdomain dnsmasq-dhcp[319309]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:33 np0005541913.localdomain dnsmasq-dhcp[319309]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a1d0402c7888f1d0cd7e68faad9130068442e8068353c15c6ac51afafeb7addf-merged.mount: Deactivated successfully.
Dec 02 10:08:33 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ae1a41050dae7bb980aa0480d9ed16c89529ad5be181bafdf6e1d27c696130e-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:33 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e152 e152: 6 total, 6 up, 6 in
Dec 02 10:08:33 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:33.697 263406 INFO neutron.agent.dhcp.agent [None req-850fcb4a-fb66-4c3b-806c-e71b9bcd05e5 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '49697408-c01c-4e89-b56b-aa2bd5d6b93d'} is completed
Dec 02 10:08:33 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:33.974 2 INFO neutron.agent.securitygroups_rpc [None req-3fe80696-0018-4728-a361-06aaa88dce01 b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:08:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:34.043 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:08:34 np0005541913.localdomain systemd[1]: tmp-crun.FINuKz.mount: Deactivated successfully.
Dec 02 10:08:34 np0005541913.localdomain dnsmasq[319309]: exiting on receipt of SIGTERM
Dec 02 10:08:34 np0005541913.localdomain podman[319326]: 2025-12-02 10:08:34.047594798 +0000 UTC m=+0.067114760 container kill 1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:08:34 np0005541913.localdomain systemd[1]: libpod-1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7.scope: Deactivated successfully.
Dec 02 10:08:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:08:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:08:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:08:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:08:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:08:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:08:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:34.113 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:34.114 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:08:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:34.115 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:08:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:34.115 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:34 np0005541913.localdomain podman[319339]: 2025-12-02 10:08:34.131337559 +0000 UTC m=+0.062383874 container died 1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:34 np0005541913.localdomain podman[319339]: 2025-12-02 10:08:34.159259014 +0000 UTC m=+0.090305289 container cleanup 1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:08:34 np0005541913.localdomain systemd[1]: libpod-conmon-1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7.scope: Deactivated successfully.
Dec 02 10:08:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-35eb325a33a7c1b7be7aec223582707d0502075bf5da3d993e966b3ab11f7c3b-merged.mount: Deactivated successfully.
Dec 02 10:08:34 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:34 np0005541913.localdomain podman[319340]: 2025-12-02 10:08:34.199470885 +0000 UTC m=+0.129538323 container remove 1e64f9bdbc034317b0d5ea7907b398cbc0a4b016edcffa538253b356c83590e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:08:34 np0005541913.localdomain kernel: device tap49697408-c0 left promiscuous mode
Dec 02 10:08:34 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:34Z|00310|binding|INFO|Releasing lport 49697408-c01c-4e89-b56b-aa2bd5d6b93d from this chassis (sb_readonly=0)
Dec 02 10:08:34 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:34Z|00311|binding|INFO|Setting lport 49697408-c01c-4e89-b56b-aa2bd5d6b93d down in Southbound
Dec 02 10:08:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:34.212 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:34.222 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fee9:26fd/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=49697408-c01c-4e89-b56b-aa2bd5d6b93d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:34.224 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 49697408-c01c-4e89-b56b-aa2bd5d6b93d in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:34.225 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:34.226 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[9840e20c-dfb1-4962-9973-18a811e37a9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:34.237 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:34 np0005541913.localdomain ceph-mon[298296]: pgmap v316: 177 pgs: 177 active+clean; 146 MiB data, 774 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 22 KiB/s wr, 97 op/s
Dec 02 10:08:34 np0005541913.localdomain ceph-mon[298296]: osdmap e152: 6 total, 6 up, 6 in
Dec 02 10:08:34 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:08:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "9dcfd3db-4efe-40f2-a49d-5d58c6cc71e6", "snap_name": "9664f206-7cb4-4a1a-b619-2a201c7ebe10", "target_sub_name": "dd0a2a5c-154a-4d2f-9d8b-fce17b535945", "format": "json"}]: dispatch
Dec 02 10:08:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "dd0a2a5c-154a-4d2f-9d8b-fce17b535945", "format": "json"}]: dispatch
Dec 02 10:08:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:35.353 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2 2001:db8::f816:3eff:fee6:1993'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:35.356 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:08:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:35.359 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:35.361 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ad613f61-a4a8-465b-a750-7e6efb649115]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:35.978 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:08:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.049 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:08:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:08:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:08:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18769 "" "Go-http-client/1.1"
Dec 02 10:08:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:36 np0005541913.localdomain ceph-mon[298296]: pgmap v318: 177 pgs: 177 active+clean; 146 MiB data, 774 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 22 KiB/s wr, 97 op/s
Dec 02 10:08:36 np0005541913.localdomain ceph-mon[298296]: mgrmap e46: np0005541914.lljzmk(active, since 8m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:08:36 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/640511349' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:08:36 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/640511349' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:08:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:08:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:08:36 np0005541913.localdomain podman[319369]: 2025-12-02 10:08:36.43844121 +0000 UTC m=+0.078315448 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 10:08:36 np0005541913.localdomain podman[319368]: 2025-12-02 10:08:36.497322269 +0000 UTC m=+0.136497388 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:08:36 np0005541913.localdomain podman[319369]: 2025-12-02 10:08:36.501991814 +0000 UTC m=+0.141866072 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:36 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:08:36 np0005541913.localdomain podman[319368]: 2025-12-02 10:08:36.558299474 +0000 UTC m=+0.197474593 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:08:36 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:08:36 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:36.584 263406 INFO neutron.agent.linux.ip_lib [None req-00e52bdb-0d1f-451c-9cea-829624b56b9e - - - - - -] Device tapc64e3ca3-73 cannot be used as it has no MAC address
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.608 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain kernel: device tapc64e3ca3-73 entered promiscuous mode
Dec 02 10:08:36 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670116.6141] manager: (tapc64e3ca3-73): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Dec 02 10:08:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:36Z|00312|binding|INFO|Claiming lport c64e3ca3-73af-44f7-b152-4306718afd23 for this chassis.
Dec 02 10:08:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:36Z|00313|binding|INFO|c64e3ca3-73af-44f7-b152-4306718afd23: Claiming unknown
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.616 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain systemd-udevd[319425]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.622 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:36.622 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c64e3ca3-73af-44f7-b152-4306718afd23) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.623 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:36Z|00314|binding|INFO|Setting lport c64e3ca3-73af-44f7-b152-4306718afd23 ovn-installed in OVS
Dec 02 10:08:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:36Z|00315|binding|INFO|Setting lport c64e3ca3-73af-44f7-b152-4306718afd23 up in Southbound
Dec 02 10:08:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:36.623 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c64e3ca3-73af-44f7-b152-4306718afd23 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.624 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:36.625 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2a8b203c-85f8-48d9-97e1-4bfb27c648a8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:08:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:36.625 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:36.626 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ed42ff10-faad-4a23-89f9-9b200cd87797]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc64e3ca3-73: No such device
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.642 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc64e3ca3-73: No such device
Dec 02 10:08:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc64e3ca3-73: No such device
Dec 02 10:08:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc64e3ca3-73: No such device
Dec 02 10:08:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc64e3ca3-73: No such device
Dec 02 10:08:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc64e3ca3-73: No such device
Dec 02 10:08:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc64e3ca3-73: No such device
Dec 02 10:08:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc64e3ca3-73: No such device
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.675 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:36.701 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:37 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:08:37 np0005541913.localdomain podman[319496]: 2025-12-02 10:08:37.711317452 +0000 UTC m=+0.100372737 container create 70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:08:37 np0005541913.localdomain podman[319496]: 2025-12-02 10:08:37.642350493 +0000 UTC m=+0.031405818 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:37 np0005541913.localdomain systemd[1]: Started libpod-conmon-70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694.scope.
Dec 02 10:08:37 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:37 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebb0ea69bd56382c7b188d586d00a208525f507b947282f45f01187189ba7005/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:37 np0005541913.localdomain podman[319496]: 2025-12-02 10:08:37.790898622 +0000 UTC m=+0.179953907 container init 70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:37 np0005541913.localdomain podman[319496]: 2025-12-02 10:08:37.80095274 +0000 UTC m=+0.190008015 container start 70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:08:37 np0005541913.localdomain dnsmasq[319514]: started, version 2.85 cachesize 150
Dec 02 10:08:37 np0005541913.localdomain dnsmasq[319514]: DNS service limited to local subnets
Dec 02 10:08:37 np0005541913.localdomain dnsmasq[319514]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:37 np0005541913.localdomain dnsmasq[319514]: warning: no upstream servers configured
Dec 02 10:08:37 np0005541913.localdomain dnsmasq-dhcp[319514]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:08:37 np0005541913.localdomain dnsmasq[319514]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:37 np0005541913.localdomain dnsmasq-dhcp[319514]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:37 np0005541913.localdomain dnsmasq-dhcp[319514]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:37 np0005541913.localdomain kernel: device tapc64e3ca3-73 left promiscuous mode
Dec 02 10:08:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:37Z|00316|binding|INFO|Releasing lport c64e3ca3-73af-44f7-b152-4306718afd23 from this chassis (sb_readonly=0)
Dec 02 10:08:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:37.893 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:37Z|00317|binding|INFO|Setting lport c64e3ca3-73af-44f7-b152-4306718afd23 down in Southbound
Dec 02 10:08:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:37.915 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8796ecd6-5c3f-49b4-a11b-51a83206e216", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:08:38 np0005541913.localdomain ceph-mon[298296]: pgmap v319: 177 pgs: 177 active+clean; 146 MiB data, 774 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 8.7 KiB/s wr, 26 op/s
Dec 02 10:08:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8796ecd6-5c3f-49b4-a11b-51a83206e216", "format": "json"}]: dispatch
Dec 02 10:08:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:38.520 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe8c:26f8/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c64e3ca3-73af-44f7-b152-4306718afd23) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:38.522 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c64e3ca3-73af-44f7-b152-4306718afd23 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:38.527 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:38.528 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[9a86aa40-2b9a-4f49-a734-15c54a2b8f5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:38.601 263406 INFO neutron.agent.dhcp.agent [None req-d9651824-06dd-4759-9ad7-13c727cf3e17 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:08:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:38.651 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2 2001:db8::f816:3eff:fee6:1993'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:38.653 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:08:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:38.656 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:38.656 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[d72a6455-23fe-4c6b-82a8-1c794e1d2554]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:40.314 263406 INFO neutron.agent.linux.ip_lib [None req-5a362f57-b7ef-4779-b7b2-5aa374c72b75 - - - - - -] Device tap8fbb99e9-2a cannot be used as it has no MAC address
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.339 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain kernel: device tap8fbb99e9-2a entered promiscuous mode
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.345 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670120.3464] manager: (tap8fbb99e9-2a): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Dec 02 10:08:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:40Z|00318|binding|INFO|Claiming lport 8fbb99e9-2ad3-4260-a17b-f7524696dad5 for this chassis.
Dec 02 10:08:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:40Z|00319|binding|INFO|8fbb99e9-2ad3-4260-a17b-f7524696dad5: Claiming unknown
Dec 02 10:08:40 np0005541913.localdomain ceph-mon[298296]: pgmap v320: 177 pgs: 177 active+clean; 146 MiB data, 774 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 32 KiB/s wr, 53 op/s
Dec 02 10:08:40 np0005541913.localdomain systemd-udevd[319536]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.361 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-8a8e4389-c9b3-4713-b533-7861fccbcf32', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a8e4389-c9b3-4713-b533-7861fccbcf32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50b20ebe68c9494a933fabe997d62528', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc4549c7-0142-4249-a0f1-78307f272ad4, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=8fbb99e9-2ad3-4260-a17b-f7524696dad5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.364 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 8fbb99e9-2ad3-4260-a17b-f7524696dad5 in datapath 8a8e4389-c9b3-4713-b533-7861fccbcf32 bound to our chassis
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.366 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8a8e4389-c9b3-4713-b533-7861fccbcf32 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.371 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[0f16b312-bc22-4129-a9cb-7f3c490e7fe3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:40Z|00320|binding|INFO|Setting lport 8fbb99e9-2ad3-4260-a17b-f7524696dad5 ovn-installed in OVS
Dec 02 10:08:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:40Z|00321|binding|INFO|Setting lport 8fbb99e9-2ad3-4260-a17b-f7524696dad5 up in Southbound
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.374 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.376 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.390 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.432 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.462 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain dnsmasq[319514]: exiting on receipt of SIGTERM
Dec 02 10:08:40 np0005541913.localdomain podman[319554]: 2025-12-02 10:08:40.508246956 +0000 UTC m=+0.048331658 container kill 70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:08:40 np0005541913.localdomain systemd[1]: libpod-70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694.scope: Deactivated successfully.
Dec 02 10:08:40 np0005541913.localdomain podman[319570]: 2025-12-02 10:08:40.567244668 +0000 UTC m=+0.048173144 container died 70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:08:40 np0005541913.localdomain podman[319570]: 2025-12-02 10:08:40.602446747 +0000 UTC m=+0.083375203 container cleanup 70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:40 np0005541913.localdomain systemd[1]: libpod-conmon-70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694.scope: Deactivated successfully.
Dec 02 10:08:40 np0005541913.localdomain podman[319577]: 2025-12-02 10:08:40.629646151 +0000 UTC m=+0.100193070 container remove 70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:08:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:40.696 263406 INFO neutron.agent.linux.ip_lib [None req-e7e3c12b-d55a-492b-9f7f-2d08b4cb8d72 - - - - - -] Device tapc64e3ca3-73 cannot be used as it has no MAC address
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.714 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain kernel: device tapc64e3ca3-73 entered promiscuous mode
Dec 02 10:08:40 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670120.7185] manager: (tapc64e3ca3-73): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Dec 02 10:08:40 np0005541913.localdomain systemd-udevd[319538]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:08:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:40Z|00322|binding|INFO|Claiming lport c64e3ca3-73af-44f7-b152-4306718afd23 for this chassis.
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.719 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:40Z|00323|binding|INFO|c64e3ca3-73af-44f7-b152-4306718afd23: Claiming unknown
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.728 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe8c:26f8/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c64e3ca3-73af-44f7-b152-4306718afd23) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:40Z|00324|binding|INFO|Setting lport c64e3ca3-73af-44f7-b152-4306718afd23 ovn-installed in OVS
Dec 02 10:08:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:40Z|00325|binding|INFO|Setting lport c64e3ca3-73af-44f7-b152-4306718afd23 up in Southbound
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.730 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.731 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c64e3ca3-73af-44f7-b152-4306718afd23 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.735 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.734 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2a8b203c-85f8-48d9-97e1-4bfb27c648a8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.735 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:40.736 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ce688c7e-b69d-4269-865d-c4bb85f05954]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.764 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.801 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.824 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:40 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:40.866 2 INFO neutron.agent.securitygroups_rpc [None req-33ec59f6-70cd-4828-b040-1367d796c3cf 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:40.981 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:41.051 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-ebb0ea69bd56382c7b188d586d00a208525f507b947282f45f01187189ba7005-merged.mount: Deactivated successfully.
Dec 02 10:08:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70338df7c6654c96d78e0278a6a2ea84b179f75ddef4eb5f00f98c5bcbd88694-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:41 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:41.223 2 INFO neutron.agent.securitygroups_rpc [None req-67c167f7-d811-43ca-8236-d9881acaf013 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:08:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e153 e153: 6 total, 6 up, 6 in
Dec 02 10:08:41 np0005541913.localdomain podman[319679]: 
Dec 02 10:08:41 np0005541913.localdomain podman[319679]: 2025-12-02 10:08:41.420137608 +0000 UTC m=+0.107600619 container create de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a.scope.
Dec 02 10:08:41 np0005541913.localdomain podman[319679]: 2025-12-02 10:08:41.371899351 +0000 UTC m=+0.059362402 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/331fe4c85053fe605b3d7d394a4a65cbed0aeb4f2e3994530c0c6c0a05693b91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:41 np0005541913.localdomain podman[319679]: 2025-12-02 10:08:41.489119975 +0000 UTC m=+0.176582996 container init de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319699]: started, version 2.85 cachesize 150
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319699]: DNS service limited to local subnets
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319699]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319699]: warning: no upstream servers configured
Dec 02 10:08:41 np0005541913.localdomain dnsmasq-dhcp[319699]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/addn_hosts - 0 addresses
Dec 02 10:08:41 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/host
Dec 02 10:08:41 np0005541913.localdomain podman[319679]: 2025-12-02 10:08:41.509124128 +0000 UTC m=+0.196587149 container start de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:08:41 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/opts
Dec 02 10:08:41 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:41.528 2 INFO neutron.agent.securitygroups_rpc [None req-77ccc14a-3033-433b-916a-b05c2a4a2183 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:41.654 263406 INFO neutron.agent.dhcp.agent [None req-1f853948-0a75-41a1-afa8-3893d973fa67 - - - - - -] DHCP configuration for ports {'4934921f-c372-4513-ba83-48451840e960'} is completed
Dec 02 10:08:41 np0005541913.localdomain podman[319720]: 
Dec 02 10:08:41 np0005541913.localdomain podman[319720]: 2025-12-02 10:08:41.706227062 +0000 UTC m=+0.092100316 container create 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067.scope.
Dec 02 10:08:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e83cf781f8dcd5525026416e9b72f7ccd15f8eef76d821a9886753d0d7e285a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:41 np0005541913.localdomain podman[319720]: 2025-12-02 10:08:41.664585812 +0000 UTC m=+0.050459156 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:41 np0005541913.localdomain podman[319720]: 2025-12-02 10:08:41.767493854 +0000 UTC m=+0.153367098 container init 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:08:41 np0005541913.localdomain podman[319720]: 2025-12-02 10:08:41.779571566 +0000 UTC m=+0.165444820 container start 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319739]: started, version 2.85 cachesize 150
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319739]: DNS service limited to local subnets
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319739]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319739]: warning: no upstream servers configured
Dec 02 10:08:41 np0005541913.localdomain dnsmasq-dhcp[319739]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:08:41 np0005541913.localdomain dnsmasq[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:41 np0005541913.localdomain dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:41 np0005541913.localdomain dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:41.845 263406 INFO neutron.agent.dhcp.agent [None req-fb8bcf1a-2f41-4e5f-a9d1-80eab479a1e2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:40Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908841640>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908841b80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088415e0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908841f70>], id=7093b5e3-4af2-43eb-9bb8-fdb6491b81ff, ip_allocation=immediate, mac_address=fa:16:3e:b8:f6:0b, name=tempest-NetworksTestDHCPv6-610894895, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['0f1bf5f6-1ea6-475e-8b92-333c1acae145', '37da3fdc-3c05-4495-96b3-7d5c496a8839'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:35Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2045, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:40Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:08:42 np0005541913.localdomain dnsmasq[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:08:42 np0005541913.localdomain podman[319758]: 2025-12-02 10:08:42.070668984 +0000 UTC m=+0.047879877 container kill 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:08:42 np0005541913.localdomain dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:42 np0005541913.localdomain dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:42 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8796ecd6-5c3f-49b4-a11b-51a83206e216", "format": "json"}]: dispatch
Dec 02 10:08:42 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8796ecd6-5c3f-49b4-a11b-51a83206e216", "force": true, "format": "json"}]: dispatch
Dec 02 10:08:42 np0005541913.localdomain ceph-mon[298296]: pgmap v321: 177 pgs: 177 active+clean; 146 MiB data, 774 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 26 KiB/s wr, 43 op/s
Dec 02 10:08:42 np0005541913.localdomain ceph-mon[298296]: osdmap e153: 6 total, 6 up, 6 in
Dec 02 10:08:42 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:42.501 2 INFO neutron.agent.securitygroups_rpc [None req-fd3bc9cf-ed5a-495f-beed-1c7d898feb8a 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:08:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:42.595 263406 INFO neutron.agent.dhcp.agent [None req-7b0f0269-e6f2-4d89-bebb-aaf1e6def218 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', 'c64e3ca3-73af-44f7-b152-4306718afd23'} is completed
Dec 02 10:08:42 np0005541913.localdomain dnsmasq[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:42 np0005541913.localdomain dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:42 np0005541913.localdomain dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:42 np0005541913.localdomain podman[319796]: 2025-12-02 10:08:42.792772476 +0000 UTC m=+0.072344638 container kill 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:08:42 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:42.950 2 INFO neutron.agent.securitygroups_rpc [None req-98ebf0df-0324-4fc7-82f5-9efe0544203a 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:43.217 263406 INFO neutron.agent.dhcp.agent [None req-1a3d8fcc-b621-4cba-9f0b-cce73f1cef1a - - - - - -] DHCP configuration for ports {'7093b5e3-4af2-43eb-9bb8-fdb6491b81ff'} is completed
Dec 02 10:08:44 np0005541913.localdomain podman[319835]: 2025-12-02 10:08:44.07255265 +0000 UTC m=+0.051980505 container kill 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:08:44 np0005541913.localdomain dnsmasq[319739]: exiting on receipt of SIGTERM
Dec 02 10:08:44 np0005541913.localdomain systemd[1]: libpod-7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067.scope: Deactivated successfully.
Dec 02 10:08:44 np0005541913.localdomain podman[319850]: 2025-12-02 10:08:44.155362798 +0000 UTC m=+0.065010934 container died 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:08:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e83cf781f8dcd5525026416e9b72f7ccd15f8eef76d821a9886753d0d7e285a2-merged.mount: Deactivated successfully.
Dec 02 10:08:44 np0005541913.localdomain podman[319850]: 2025-12-02 10:08:44.194347277 +0000 UTC m=+0.103995353 container cleanup 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:44 np0005541913.localdomain systemd[1]: libpod-conmon-7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067.scope: Deactivated successfully.
Dec 02 10:08:44 np0005541913.localdomain podman[319851]: 2025-12-02 10:08:44.223651087 +0000 UTC m=+0.127996611 container remove 7de61421e34b4879fbe4da74285d5a8f3b2a3f5f38f1c0a73170c12bd8ad0067 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:08:44 np0005541913.localdomain ceph-mon[298296]: pgmap v323: 177 pgs: 177 active+clean; 146 MiB data, 774 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 26 KiB/s wr, 24 op/s
Dec 02 10:08:45 np0005541913.localdomain podman[319930]: 
Dec 02 10:08:45 np0005541913.localdomain podman[319930]: 2025-12-02 10:08:45.038900573 +0000 UTC m=+0.095383173 container create fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:08:45 np0005541913.localdomain systemd[1]: Started libpod-conmon-fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9.scope.
Dec 02 10:08:45 np0005541913.localdomain podman[319930]: 2025-12-02 10:08:44.989802865 +0000 UTC m=+0.046285525 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:45 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f40a1d2655a37158c8f6a41872170a0475aa15be61f8cbdc512f99b733d995a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:45 np0005541913.localdomain podman[319930]: 2025-12-02 10:08:45.111694553 +0000 UTC m=+0.168177153 container init fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:45 np0005541913.localdomain systemd[1]: tmp-crun.SF2vQN.mount: Deactivated successfully.
Dec 02 10:08:45 np0005541913.localdomain podman[319930]: 2025-12-02 10:08:45.123811977 +0000 UTC m=+0.180294577 container start fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:08:45 np0005541913.localdomain dnsmasq[319948]: started, version 2.85 cachesize 150
Dec 02 10:08:45 np0005541913.localdomain dnsmasq[319948]: DNS service limited to local subnets
Dec 02 10:08:45 np0005541913.localdomain dnsmasq[319948]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:45 np0005541913.localdomain dnsmasq[319948]: warning: no upstream servers configured
Dec 02 10:08:45 np0005541913.localdomain dnsmasq[319948]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:45.539 263406 INFO neutron.agent.dhcp.agent [None req-84e17728-afdf-4a85-9c31-60e0bf442626 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', 'c64e3ca3-73af-44f7-b152-4306718afd23'} is completed
Dec 02 10:08:45 np0005541913.localdomain dnsmasq[319948]: exiting on receipt of SIGTERM
Dec 02 10:08:45 np0005541913.localdomain podman[319967]: 2025-12-02 10:08:45.546038228 +0000 UTC m=+0.061366437 container kill fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:45 np0005541913.localdomain systemd[1]: libpod-fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9.scope: Deactivated successfully.
Dec 02 10:08:45 np0005541913.localdomain podman[319979]: 2025-12-02 10:08:45.629394539 +0000 UTC m=+0.068060644 container died fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 02 10:08:45 np0005541913.localdomain podman[319979]: 2025-12-02 10:08:45.662154793 +0000 UTC m=+0.100820848 container cleanup fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:45 np0005541913.localdomain systemd[1]: libpod-conmon-fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9.scope: Deactivated successfully.
Dec 02 10:08:45 np0005541913.localdomain podman[319981]: 2025-12-02 10:08:45.698410779 +0000 UTC m=+0.130288954 container remove fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:08:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:45Z|00326|binding|INFO|Releasing lport c64e3ca3-73af-44f7-b152-4306718afd23 from this chassis (sb_readonly=0)
Dec 02 10:08:45 np0005541913.localdomain kernel: device tapc64e3ca3-73 left promiscuous mode
Dec 02 10:08:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:45Z|00327|binding|INFO|Setting lport c64e3ca3-73af-44f7-b152-4306718afd23 down in Southbound
Dec 02 10:08:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:45.713 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:45.721 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe8c:26f8/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c64e3ca3-73af-44f7-b152-4306718afd23) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:45.723 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c64e3ca3-73af-44f7-b152-4306718afd23 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:45.726 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:45.727 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[0c3bb8c3-028b-4aaf-8c2a-2c009019f87d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:45.729 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:45 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:45.828 2 INFO neutron.agent.securitygroups_rpc [None req-a2676272-e4a7-4aab-af43-dd7cd656aeb3 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:45.984 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:46.053 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f40a1d2655a37158c8f6a41872170a0475aa15be61f8cbdc512f99b733d995a2-merged.mount: Deactivated successfully.
Dec 02 10:08:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fab9dc8abfb3d0531868a0a937b82bfaa261f85a204c1ee9edcb58bbbe1c4dc9-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:46 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:08:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:46 np0005541913.localdomain ceph-mon[298296]: pgmap v324: 177 pgs: 177 active+clean; 146 MiB data, 774 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 26 KiB/s wr, 23 op/s
Dec 02 10:08:47 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:08:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:47.353 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2 2001:db8::f816:3eff:fee6:1993'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:47.355 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:08:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:47.356 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:47.357 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1ea5ed-684c-4096-b927-720470c4675a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:47 np0005541913.localdomain podman[320010]: 2025-12-02 10:08:47.449789511 +0000 UTC m=+0.088874740 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 02 10:08:47 np0005541913.localdomain podman[320010]: 2025-12-02 10:08:47.458881843 +0000 UTC m=+0.097967092 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:08:47 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:08:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:48.032 263406 INFO neutron.agent.linux.ip_lib [None req-7b86167a-ba06-4d0d-8e25-50da2a5af72a - - - - - -] Device tap4b41fcb7-46 cannot be used as it has no MAC address
Dec 02 10:08:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:48.058 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:48 np0005541913.localdomain kernel: device tap4b41fcb7-46 entered promiscuous mode
Dec 02 10:08:48 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670128.0679] manager: (tap4b41fcb7-46): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Dec 02 10:08:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:48.068 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:48Z|00328|binding|INFO|Claiming lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 for this chassis.
Dec 02 10:08:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:48Z|00329|binding|INFO|4b41fcb7-4686-4ae0-bf20-f32d06645ac4: Claiming unknown
Dec 02 10:08:48 np0005541913.localdomain systemd-udevd[320039]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:08:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:48Z|00330|binding|INFO|Setting lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 ovn-installed in OVS
Dec 02 10:08:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:48.080 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:48 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap4b41fcb7-46: No such device
Dec 02 10:08:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:48.099 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:48 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap4b41fcb7-46: No such device
Dec 02 10:08:48 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap4b41fcb7-46: No such device
Dec 02 10:08:48 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap4b41fcb7-46: No such device
Dec 02 10:08:48 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap4b41fcb7-46: No such device
Dec 02 10:08:48 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap4b41fcb7-46: No such device
Dec 02 10:08:48 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap4b41fcb7-46: No such device
Dec 02 10:08:48 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap4b41fcb7-46: No such device
Dec 02 10:08:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:48.142 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:48.172 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:48 np0005541913.localdomain ceph-mon[298296]: pgmap v325: 177 pgs: 177 active+clean; 146 MiB data, 774 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 26 KiB/s wr, 23 op/s
Dec 02 10:08:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:48Z|00331|binding|INFO|Setting lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 up in Southbound
Dec 02 10:08:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:48.543 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4b41fcb7-4686-4ae0-bf20-f32d06645ac4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:48 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:48.543 2 INFO neutron.agent.securitygroups_rpc [None req-1e3a85d5-a4d4-4ac9-b4fb-7c32fb08bdf0 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:48.545 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:48.548 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 82df1984-655f-43b7-8e68-0cea428fb7f6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:08:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:48.549 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:48.550 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[986e5f78-c558-4b96-98c0-8389959c3678]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:49 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:49.207 263406 INFO neutron.agent.linux.ip_lib [None req-81205318-602f-4d4a-b3cc-7cfb247118e4 - - - - - -] Device tap71712210-e7 cannot be used as it has no MAC address
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.243 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:49 np0005541913.localdomain kernel: device tap71712210-e7 entered promiscuous mode
Dec 02 10:08:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:49Z|00332|binding|INFO|Claiming lport 71712210-e77f-42b0-bbd5-d3267949cb4f for this chassis.
Dec 02 10:08:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:49Z|00333|binding|INFO|71712210-e77f-42b0-bbd5-d3267949cb4f: Claiming unknown
Dec 02 10:08:49 np0005541913.localdomain systemd-udevd[320041]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:08:49 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670129.2511] manager: (tap71712210-e7): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.250 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:49.264 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-bfdb46d8-0ab9-4f91-af70-05b63804efe6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bfdb46d8-0ab9-4f91-af70-05b63804efe6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7b847f-2e87-42d4-a87d-a72dff8a08d3, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=71712210-e77f-42b0-bbd5-d3267949cb4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:49.266 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 71712210-e77f-42b0-bbd5-d3267949cb4f in datapath bfdb46d8-0ab9-4f91-af70-05b63804efe6 bound to our chassis
Dec 02 10:08:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:49.268 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network bfdb46d8-0ab9-4f91-af70-05b63804efe6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:08:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:49.269 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[7774b60e-6d90-4ced-a6e1-662c3689704b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:49Z|00334|binding|INFO|Setting lport 71712210-e77f-42b0-bbd5-d3267949cb4f ovn-installed in OVS
Dec 02 10:08:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:49Z|00335|binding|INFO|Setting lport 71712210-e77f-42b0-bbd5-d3267949cb4f up in Southbound
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.281 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.305 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.354 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.392 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:49 np0005541913.localdomain podman[320125]: 
Dec 02 10:08:49 np0005541913.localdomain podman[320125]: 2025-12-02 10:08:49.46187248 +0000 UTC m=+0.102804151 container create d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:08:49 np0005541913.localdomain systemd[1]: Started libpod-conmon-d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079.scope.
Dec 02 10:08:49 np0005541913.localdomain podman[320125]: 2025-12-02 10:08:49.416936192 +0000 UTC m=+0.057867903 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:49 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:49 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6bb415e42759f888aad2ad6426896c066270642db5d3de3206e231e1a2e1442/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:49 np0005541913.localdomain podman[320125]: 2025-12-02 10:08:49.555472575 +0000 UTC m=+0.196404246 container init d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:08:49 np0005541913.localdomain podman[320125]: 2025-12-02 10:08:49.566157749 +0000 UTC m=+0.207089420 container start d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:08:49 np0005541913.localdomain dnsmasq[320148]: started, version 2.85 cachesize 150
Dec 02 10:08:49 np0005541913.localdomain dnsmasq[320148]: DNS service limited to local subnets
Dec 02 10:08:49 np0005541913.localdomain dnsmasq[320148]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:49 np0005541913.localdomain dnsmasq[320148]: warning: no upstream servers configured
Dec 02 10:08:49 np0005541913.localdomain dnsmasq-dhcp[320148]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:08:49 np0005541913.localdomain dnsmasq[320148]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:49 np0005541913.localdomain dnsmasq-dhcp[320148]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:49 np0005541913.localdomain dnsmasq-dhcp[320148]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:49Z|00336|binding|INFO|Releasing lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 from this chassis (sb_readonly=0)
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.679 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:49Z|00337|binding|INFO|Setting lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 down in Southbound
Dec 02 10:08:49 np0005541913.localdomain kernel: device tap4b41fcb7-46 left promiscuous mode
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.707 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:49.866 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4b41fcb7-4686-4ae0-bf20-f32d06645ac4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:49.868 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:49.871 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:49.872 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[0b83dcd6-c644-45a6-b5f4-66aa6e602899]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:49.948 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:49 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:49.955 2 INFO neutron.agent.securitygroups_rpc [None req-59adf8f3-045e-4261-a367-eea8612462ef 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:50 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:50.074 263406 INFO neutron.agent.dhcp.agent [None req-88eeb533-b19d-4714-a306-9331fb97097e - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:08:50 np0005541913.localdomain ceph-mon[298296]: pgmap v326: 177 pgs: 177 active+clean; 146 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 9.4 KiB/s wr, 3 op/s
Dec 02 10:08:50 np0005541913.localdomain podman[320191]: 
Dec 02 10:08:50 np0005541913.localdomain podman[320191]: 2025-12-02 10:08:50.572943969 +0000 UTC m=+0.102482382 container create 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:08:50 np0005541913.localdomain systemd[1]: Started libpod-conmon-5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4.scope.
Dec 02 10:08:50 np0005541913.localdomain systemd[1]: tmp-crun.XFkaxx.mount: Deactivated successfully.
Dec 02 10:08:50 np0005541913.localdomain podman[320191]: 2025-12-02 10:08:50.529265255 +0000 UTC m=+0.058803688 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:50 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:50 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/155e384317babd96cf2cd1a89ffbaad899eea093c053e3a7d2f3cc36440b4ff7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:50 np0005541913.localdomain podman[320191]: 2025-12-02 10:08:50.644004472 +0000 UTC m=+0.173542855 container init 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:50 np0005541913.localdomain podman[320191]: 2025-12-02 10:08:50.655233852 +0000 UTC m=+0.184772235 container start 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 10:08:50 np0005541913.localdomain dnsmasq[320209]: started, version 2.85 cachesize 150
Dec 02 10:08:50 np0005541913.localdomain dnsmasq[320209]: DNS service limited to local subnets
Dec 02 10:08:50 np0005541913.localdomain dnsmasq[320209]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:50 np0005541913.localdomain dnsmasq[320209]: warning: no upstream servers configured
Dec 02 10:08:50 np0005541913.localdomain dnsmasq-dhcp[320209]: DHCP, static leases only on 10.101.0.0, lease time 1d
Dec 02 10:08:50 np0005541913.localdomain dnsmasq[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/addn_hosts - 0 addresses
Dec 02 10:08:50 np0005541913.localdomain dnsmasq-dhcp[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/host
Dec 02 10:08:50 np0005541913.localdomain dnsmasq-dhcp[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/opts
Dec 02 10:08:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:50.987 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:51.028 263406 INFO neutron.agent.dhcp.agent [None req-b2a2a912-7625-45d2-abe6-129620fcf6c5 - - - - - -] DHCP configuration for ports {'d5170ff3-2008-409b-ab16-990861d5c150'} is completed
Dec 02 10:08:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:51.055 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:08:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:51.120 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:50Z, description=, device_id=aa61d6a1-1090-4f06-abf3-fa0ed7c99a0f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b58880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a907c0>], id=42dd30a6-6487-4d6c-bb07-70b967b53b83, ip_allocation=immediate, mac_address=fa:16:3e:95:e1:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:08:36Z, description=, dns_domain=, id=8a8e4389-c9b3-4713-b533-7861fccbcf32, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-306705527, port_security_enabled=True, project_id=50b20ebe68c9494a933fabe997d62528, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29831, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2021, status=ACTIVE, subnets=['bbb45bcf-ad23-4ed6-8ed8-1a191d2f154d'], tags=[], tenant_id=50b20ebe68c9494a933fabe997d62528, updated_at=2025-12-02T10:08:38Z, vlan_transparent=None, network_id=8a8e4389-c9b3-4713-b533-7861fccbcf32, port_security_enabled=False, project_id=50b20ebe68c9494a933fabe997d62528, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2077, status=DOWN, tags=[], tenant_id=50b20ebe68c9494a933fabe997d62528, updated_at=2025-12-02T10:08:50Z on network 8a8e4389-c9b3-4713-b533-7861fccbcf32
Dec 02 10:08:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:51 np0005541913.localdomain podman[320210]: 2025-12-02 10:08:51.202057133 +0000 UTC m=+0.093792520 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 10:08:51 np0005541913.localdomain podman[320210]: 2025-12-02 10:08:51.207111138 +0000 UTC m=+0.098846565 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 02 10:08:51 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:08:51 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "dd0a2a5c-154a-4d2f-9d8b-fce17b535945", "format": "json"}]: dispatch
Dec 02 10:08:51 np0005541913.localdomain dnsmasq[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/addn_hosts - 1 addresses
Dec 02 10:08:51 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/host
Dec 02 10:08:51 np0005541913.localdomain podman[320245]: 2025-12-02 10:08:51.38351852 +0000 UTC m=+0.072509214 container kill de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:08:51 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/opts
Dec 02 10:08:51 np0005541913.localdomain systemd[1]: tmp-crun.GbVTzC.mount: Deactivated successfully.
Dec 02 10:08:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:51.457 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2 2001:db8::f816:3eff:fee6:1993'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:51.460 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:08:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:51.463 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:51.464 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a38bd617-d7ba-4c26-a12c-e9ed0c35a33a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:51.699 263406 INFO neutron.agent.dhcp.agent [None req-6e436cec-7b51-462b-849b-c6ad4521531e - - - - - -] DHCP configuration for ports {'42dd30a6-6487-4d6c-bb07-70b967b53b83'} is completed
Dec 02 10:08:51 np0005541913.localdomain dnsmasq[320148]: exiting on receipt of SIGTERM
Dec 02 10:08:51 np0005541913.localdomain podman[320282]: 2025-12-02 10:08:51.80907007 +0000 UTC m=+0.059953318 container kill d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:08:51 np0005541913.localdomain systemd[1]: libpod-d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079.scope: Deactivated successfully.
Dec 02 10:08:51 np0005541913.localdomain podman[320295]: 2025-12-02 10:08:51.884302485 +0000 UTC m=+0.056530857 container died d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:08:51 np0005541913.localdomain podman[320295]: 2025-12-02 10:08:51.922287707 +0000 UTC m=+0.094516069 container cleanup d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:08:51 np0005541913.localdomain systemd[1]: libpod-conmon-d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079.scope: Deactivated successfully.
Dec 02 10:08:51 np0005541913.localdomain podman[320296]: 2025-12-02 10:08:51.966073634 +0000 UTC m=+0.133810547 container remove d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:08:52 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:52.022 263406 INFO neutron.agent.linux.ip_lib [None req-bcc5338e-a4f4-4633-a0fe-4889008fdbcf - - - - - -] Device tap4b41fcb7-46 cannot be used as it has no MAC address
Dec 02 10:08:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:52.050 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:52 np0005541913.localdomain kernel: device tap4b41fcb7-46 entered promiscuous mode
Dec 02 10:08:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:52Z|00338|binding|INFO|Claiming lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 for this chassis.
Dec 02 10:08:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:52Z|00339|binding|INFO|4b41fcb7-4686-4ae0-bf20-f32d06645ac4: Claiming unknown
Dec 02 10:08:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:52.060 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:52 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670132.0627] manager: (tap4b41fcb7-46): new Generic device (/org/freedesktop/NetworkManager/Devices/55)
Dec 02 10:08:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:52Z|00340|binding|INFO|Setting lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 ovn-installed in OVS
Dec 02 10:08:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:52.070 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:52.075 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:52.100 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b6bb415e42759f888aad2ad6426896c066270642db5d3de3206e231e1a2e1442-merged.mount: Deactivated successfully.
Dec 02 10:08:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7a8a58c79152eb2cabe055c8c789290ff0dc544ecd130fbffe1094542360079-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:52.139 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:52.170 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:52 np0005541913.localdomain ceph-mon[298296]: pgmap v327: 177 pgs: 177 active+clean; 146 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 9.4 KiB/s wr, 3 op/s
Dec 02 10:08:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:08:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:52Z|00341|binding|INFO|Setting lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 up in Southbound
Dec 02 10:08:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:52.465 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe45:c69d/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4b41fcb7-4686-4ae0-bf20-f32d06645ac4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:52.468 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:08:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:52.470 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 82df1984-655f-43b7-8e68-0cea428fb7f6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:08:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:52.471 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:52.472 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[2e76a852-4412-49a8-87cb-56dadfa2d5a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:08:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:08:52 np0005541913.localdomain systemd[1]: tmp-crun.ienCYF.mount: Deactivated successfully.
Dec 02 10:08:53 np0005541913.localdomain podman[320377]: 2025-12-02 10:08:53.003085929 +0000 UTC m=+0.141034079 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, 
io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:08:53 np0005541913.localdomain podman[320378]: 2025-12-02 10:08:53.025829755 +0000 UTC m=+0.156781219 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:08:53 np0005541913.localdomain podman[320378]: 2025-12-02 10:08:53.038061122 +0000 UTC m=+0.169012606 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:08:53 np0005541913.localdomain podman[320377]: 2025-12-02 10:08:53.052039735 +0000 UTC m=+0.189987845 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, release=1755695350, distribution-scope=public, config_id=edpm, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Dec 02 10:08:53 np0005541913.localdomain podman[320407]: 
Dec 02 10:08:53 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:08:53 np0005541913.localdomain podman[320407]: 2025-12-02 10:08:53.067241689 +0000 UTC m=+0.111326678 container create 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:08:53 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:08:53 np0005541913.localdomain systemd[1]: Started libpod-conmon-6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e.scope.
Dec 02 10:08:53 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:53 np0005541913.localdomain systemd[1]: tmp-crun.prSsMM.mount: Deactivated successfully.
Dec 02 10:08:53 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd3e505d8e880e23e61b81d3c4b057d6f4c504d64d83c246c85ab5ae5ccbaf0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:53 np0005541913.localdomain podman[320407]: 2025-12-02 10:08:53.021387228 +0000 UTC m=+0.065472257 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:53 np0005541913.localdomain podman[320407]: 2025-12-02 10:08:53.127055753 +0000 UTC m=+0.171140772 container init 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:08:53 np0005541913.localdomain podman[320407]: 2025-12-02 10:08:53.13591845 +0000 UTC m=+0.180003449 container start 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:53 np0005541913.localdomain dnsmasq[320446]: started, version 2.85 cachesize 150
Dec 02 10:08:53 np0005541913.localdomain dnsmasq[320446]: DNS service limited to local subnets
Dec 02 10:08:53 np0005541913.localdomain dnsmasq[320446]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:53 np0005541913.localdomain dnsmasq[320446]: warning: no upstream servers configured
Dec 02 10:08:53 np0005541913.localdomain dnsmasq-dhcp[320446]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:08:53 np0005541913.localdomain dnsmasq-dhcp[320446]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:08:53 np0005541913.localdomain dnsmasq[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:53 np0005541913.localdomain dnsmasq-dhcp[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:53 np0005541913.localdomain dnsmasq-dhcp[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "dd0a2a5c-154a-4d2f-9d8b-fce17b535945", "format": "json"}]: dispatch
Dec 02 10:08:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:53.582 263406 INFO neutron.agent.dhcp.agent [None req-fa38b537-7ace-4b28-a4c8-008521b6d3e6 - - - - - -] DHCP configuration for ports {'4b41fcb7-4686-4ae0-bf20-f32d06645ac4', 'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:08:53 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:53.585 2 INFO neutron.agent.securitygroups_rpc [None req-e08d9635-b9e8-48c9-978b-72b4270a2462 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:08:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:54.023 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:53Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088503d0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908850c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908850160>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908850d30>], id=9684ac42-ac06-402a-9d43-f3e1def9fb6d, ip_allocation=immediate, mac_address=fa:16:3e:e1:25:26, name=tempest-NetworksTestDHCPv6-840276124, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=35, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['bf84a78a-769c-4c8f-85c7-0080eaede2d7', 'eb9c3d44-1811-4f82-96ed-c16d4fac6b84'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:48Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2079, status=DOWN, tags=[], 
tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:08:53Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:08:54 np0005541913.localdomain dnsmasq[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:08:54 np0005541913.localdomain dnsmasq-dhcp[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:54 np0005541913.localdomain podman[320463]: 2025-12-02 10:08:54.285050643 +0000 UTC m=+0.067633083 container kill 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:08:54 np0005541913.localdomain dnsmasq-dhcp[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:54 np0005541913.localdomain ceph-mon[298296]: pgmap v328: 177 pgs: 177 active+clean; 146 MiB data, 779 MiB used, 41 GiB / 42 GiB avail; 517 B/s rd, 8.7 KiB/s wr, 3 op/s
Dec 02 10:08:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:54.544 263406 INFO neutron.agent.dhcp.agent [None req-e3bb298c-0b95-4437-9ac9-de7cea9e1e2c - - - - - -] DHCP configuration for ports {'9684ac42-ac06-402a-9d43-f3e1def9fb6d'} is completed
Dec 02 10:08:54 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:54.611 2 INFO neutron.agent.securitygroups_rpc [None req-3c6b2ae8-8852-4b91-8b57-db72052e455d 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:54.745 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:50Z, description=, device_id=aa61d6a1-1090-4f06-abf3-fa0ed7c99a0f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a3bfd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908976ca0>], id=42dd30a6-6487-4d6c-bb07-70b967b53b83, ip_allocation=immediate, mac_address=fa:16:3e:95:e1:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:08:36Z, description=, dns_domain=, id=8a8e4389-c9b3-4713-b533-7861fccbcf32, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-306705527, port_security_enabled=True, project_id=50b20ebe68c9494a933fabe997d62528, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29831, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2021, status=ACTIVE, subnets=['bbb45bcf-ad23-4ed6-8ed8-1a191d2f154d'], tags=[], tenant_id=50b20ebe68c9494a933fabe997d62528, updated_at=2025-12-02T10:08:38Z, vlan_transparent=None, network_id=8a8e4389-c9b3-4713-b533-7861fccbcf32, port_security_enabled=False, project_id=50b20ebe68c9494a933fabe997d62528, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2077, status=DOWN, tags=[], tenant_id=50b20ebe68c9494a933fabe997d62528, updated_at=2025-12-02T10:08:50Z on network 8a8e4389-c9b3-4713-b533-7861fccbcf32
Dec 02 10:08:54 np0005541913.localdomain dnsmasq[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/addn_hosts - 1 addresses
Dec 02 10:08:54 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/host
Dec 02 10:08:54 np0005541913.localdomain podman[320503]: 2025-12-02 10:08:54.968929068 +0000 UTC m=+0.068472955 container kill de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 02 10:08:54 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/opts
Dec 02 10:08:55 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:55.004 2 INFO neutron.agent.securitygroups_rpc [None req-c6b91e56-d8d3-49df-8fb0-b7b5f6e00308 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:08:55 np0005541913.localdomain dnsmasq[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:55 np0005541913.localdomain dnsmasq-dhcp[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:55 np0005541913.localdomain dnsmasq-dhcp[320446]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:55 np0005541913.localdomain podman[320542]: 2025-12-02 10:08:55.256530222 +0000 UTC m=+0.061792187 container kill 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:55.266 263406 INFO neutron.agent.dhcp.agent [None req-03e14478-4c91-4d9f-8fe0-ab95191d78ec - - - - - -] DHCP configuration for ports {'42dd30a6-6487-4d6c-bb07-70b967b53b83'} is completed
Dec 02 10:08:55 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:55.871 2 INFO neutron.agent.securitygroups_rpc [None req-62ccdf95-04fd-49bc-8e08-6a4afcc10f44 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:55.991 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:56.057 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:08:56 np0005541913.localdomain ceph-mon[298296]: pgmap v329: 177 pgs: 177 active+clean; 146 MiB data, 779 MiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 2.7 KiB/s wr, 1 op/s
Dec 02 10:08:56 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:56.645 2 INFO neutron.agent.securitygroups_rpc [None req-2779f3cb-e43d-46a6-b42c-1a159a69c67f b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:08:56 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:56.728 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:55Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088336d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908833eb0>], id=dba2acb3-305f-411c-a151-68276b1c53d9, ip_allocation=immediate, mac_address=fa:16:3e:0a:30:cd, name=tempest-FloatingIPTestJSON-1300587785, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:08:36Z, description=, dns_domain=, id=8a8e4389-c9b3-4713-b533-7861fccbcf32, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-306705527, port_security_enabled=True, project_id=50b20ebe68c9494a933fabe997d62528, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29831, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2021, status=ACTIVE, subnets=['bbb45bcf-ad23-4ed6-8ed8-1a191d2f154d'], tags=[], tenant_id=50b20ebe68c9494a933fabe997d62528, updated_at=2025-12-02T10:08:38Z, vlan_transparent=None, network_id=8a8e4389-c9b3-4713-b533-7861fccbcf32, port_security_enabled=True, project_id=50b20ebe68c9494a933fabe997d62528, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['0990385a-b99f-41bd-8d17-8e7fb5ec4794'], standard_attr_id=2084, status=DOWN, tags=[], tenant_id=50b20ebe68c9494a933fabe997d62528, updated_at=2025-12-02T10:08:56Z on network 8a8e4389-c9b3-4713-b533-7861fccbcf32
Dec 02 10:08:56 np0005541913.localdomain dnsmasq[320446]: exiting on receipt of SIGTERM
Dec 02 10:08:56 np0005541913.localdomain podman[320582]: 2025-12-02 10:08:56.765973007 +0000 UTC m=+0.061638534 container kill 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:08:56 np0005541913.localdomain systemd[1]: libpod-6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e.scope: Deactivated successfully.
Dec 02 10:08:56 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:56.780 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:56Z, description=, device_id=484ade60-328f-43eb-b990-42dac6f1b75b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908833430>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908833250>], id=38c2fef2-e6bd-4f3b-a3ea-c9d9c60fb17c, ip_allocation=immediate, mac_address=fa:16:3e:72:43:2a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:08:40Z, description=, dns_domain=, id=bfdb46d8-0ab9-4f91-af70-05b63804efe6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1196347497, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54074, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2042, status=ACTIVE, subnets=['d57b0d74-058f-4387-bcee-307aa4948e69'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:08:47Z, vlan_transparent=None, network_id=bfdb46d8-0ab9-4f91-af70-05b63804efe6, port_security_enabled=False, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2085, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:08:56Z on network bfdb46d8-0ab9-4f91-af70-05b63804efe6
Dec 02 10:08:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:56.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:56.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:08:56 np0005541913.localdomain podman[320596]: 2025-12-02 10:08:56.845455985 +0000 UTC m=+0.060492533 container died 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:56 np0005541913.localdomain systemd[1]: tmp-crun.4t63t4.mount: Deactivated successfully.
Dec 02 10:08:56 np0005541913.localdomain podman[320596]: 2025-12-02 10:08:56.889056547 +0000 UTC m=+0.104093055 container cleanup 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 02 10:08:56 np0005541913.localdomain systemd[1]: libpod-conmon-6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e.scope: Deactivated successfully.
Dec 02 10:08:56 np0005541913.localdomain podman[320598]: 2025-12-02 10:08:56.986799012 +0000 UTC m=+0.193806926 container remove 6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:08:57 np0005541913.localdomain dnsmasq[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/addn_hosts - 2 addresses
Dec 02 10:08:57 np0005541913.localdomain podman[320667]: 2025-12-02 10:08:57.029374816 +0000 UTC m=+0.058603932 container kill de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:08:57 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/host
Dec 02 10:08:57 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/opts
Dec 02 10:08:57 np0005541913.localdomain dnsmasq[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/addn_hosts - 1 addresses
Dec 02 10:08:57 np0005541913.localdomain podman[320655]: 2025-12-02 10:08:57.083205781 +0000 UTC m=+0.154929950 container kill 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:08:57 np0005541913.localdomain dnsmasq-dhcp[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/host
Dec 02 10:08:57 np0005541913.localdomain dnsmasq-dhcp[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/opts
Dec 02 10:08:57 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:08:57.460 2 INFO neutron.agent.securitygroups_rpc [None req-675bc92b-ef71-4946-a4ea-2f67c0d27bea 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:08:57 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:57.652 263406 INFO neutron.agent.dhcp.agent [None req-ce6e6a38-608b-4fc3-b43c-4926734febd5 - - - - - -] DHCP configuration for ports {'38c2fef2-e6bd-4f3b-a3ea-c9d9c60fb17c', 'dba2acb3-305f-411c-a151-68276b1c53d9'} is completed
Dec 02 10:08:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-cd3e505d8e880e23e61b81d3c4b057d6f4c504d64d83c246c85ab5ae5ccbaf0f-merged.mount: Deactivated successfully.
Dec 02 10:08:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6504725b85f817469802e2a87a2f5729880545cf3bfb62f37f8f51021ef2f40e-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:57.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:08:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:57.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:08:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:57.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:08:57 np0005541913.localdomain podman[320748]: 2025-12-02 10:08:57.977959275 +0000 UTC m=+0.094510490 container create 2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:08:58 np0005541913.localdomain systemd[1]: Started libpod-conmon-2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7.scope.
Dec 02 10:08:58 np0005541913.localdomain podman[320748]: 2025-12-02 10:08:57.933839669 +0000 UTC m=+0.050390934 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:08:58 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:08:58 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997b7cb411eca1e18b4e08673cc037792aaf71615365dfe2ab0daee54678e70c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:08:58 np0005541913.localdomain podman[320748]: 2025-12-02 10:08:58.050396996 +0000 UTC m=+0.166948211 container init 2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:08:58 np0005541913.localdomain podman[320748]: 2025-12-02 10:08:58.059015375 +0000 UTC m=+0.175566580 container start 2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:08:58 np0005541913.localdomain dnsmasq[320766]: started, version 2.85 cachesize 150
Dec 02 10:08:58 np0005541913.localdomain dnsmasq[320766]: DNS service limited to local subnets
Dec 02 10:08:58 np0005541913.localdomain dnsmasq[320766]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:08:58 np0005541913.localdomain dnsmasq[320766]: warning: no upstream servers configured
Dec 02 10:08:58 np0005541913.localdomain dnsmasq-dhcp[320766]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:08:58 np0005541913.localdomain dnsmasq[320766]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:08:58 np0005541913.localdomain dnsmasq-dhcp[320766]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:08:58 np0005541913.localdomain dnsmasq-dhcp[320766]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:08:58 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:58.301 263406 INFO neutron.agent.dhcp.agent [None req-9099030d-387a-4288-bfa2-f16a30907ab3 - - - - - -] DHCP configuration for ports {'4b41fcb7-4686-4ae0-bf20-f32d06645ac4', 'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:08:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:58.316 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:08:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:58.316 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:08:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:58.317 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:08:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:58.317 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:08:58 np0005541913.localdomain dnsmasq[320766]: exiting on receipt of SIGTERM
Dec 02 10:08:58 np0005541913.localdomain podman[320784]: 2025-12-02 10:08:58.413492262 +0000 UTC m=+0.057745810 container kill 2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:08:58 np0005541913.localdomain systemd[1]: libpod-2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7.scope: Deactivated successfully.
Dec 02 10:08:58 np0005541913.localdomain ceph-mon[298296]: pgmap v330: 177 pgs: 177 active+clean; 146 MiB data, 779 MiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 2.7 KiB/s wr, 1 op/s
Dec 02 10:08:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "dd0a2a5c-154a-4d2f-9d8b-fce17b535945", "format": "json"}]: dispatch
Dec 02 10:08:58 np0005541913.localdomain podman[320797]: 2025-12-02 10:08:58.488569452 +0000 UTC m=+0.060137783 container died 2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:08:58 np0005541913.localdomain podman[320797]: 2025-12-02 10:08:58.512731667 +0000 UTC m=+0.084299978 container cleanup 2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:08:58 np0005541913.localdomain systemd[1]: libpod-conmon-2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7.scope: Deactivated successfully.
Dec 02 10:08:58 np0005541913.localdomain podman[320798]: 2025-12-02 10:08:58.579691551 +0000 UTC m=+0.146727451 container remove 2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:08:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:58.596 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:58Z|00342|binding|INFO|Releasing lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 from this chassis (sb_readonly=0)
Dec 02 10:08:58 np0005541913.localdomain kernel: device tap4b41fcb7-46 left promiscuous mode
Dec 02 10:08:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:08:58Z|00343|binding|INFO|Setting lport 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 down in Southbound
Dec 02 10:08:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:08:58.619 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:08:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-997b7cb411eca1e18b4e08673cc037792aaf71615365dfe2ab0daee54678e70c-merged.mount: Deactivated successfully.
Dec 02 10:08:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2856229d50cdf17a43cf6e248d49c30517c3b01edef9d1acc6add5935fc7bbb7-userdata-shm.mount: Deactivated successfully.
Dec 02 10:08:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:58.856 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe45:c69d/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=4b41fcb7-4686-4ae0-bf20-f32d06645ac4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:08:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:58.858 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 4b41fcb7-4686-4ae0-bf20-f32d06645ac4 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:08:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:58.860 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:08:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:08:58.861 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[41e176c6-5b0d-4037-9ca1-8557c6f1e527]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:08:59 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:08:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:08:59.066 263406 INFO neutron.agent.dhcp.agent [None req-e1e01852-a348-4194-911d-9b46de5787ba - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:08:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "dd0a2a5c-154a-4d2f-9d8b-fce17b535945", "force": true, "format": "json"}]: dispatch
Dec 02 10:09:00 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:00.263 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:08:56Z, description=, device_id=484ade60-328f-43eb-b990-42dac6f1b75b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990884a850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990884a3d0>], id=38c2fef2-e6bd-4f3b-a3ea-c9d9c60fb17c, ip_allocation=immediate, mac_address=fa:16:3e:72:43:2a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:08:40Z, description=, dns_domain=, id=bfdb46d8-0ab9-4f91-af70-05b63804efe6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1196347497, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54074, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2042, status=ACTIVE, subnets=['d57b0d74-058f-4387-bcee-307aa4948e69'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:08:47Z, vlan_transparent=None, network_id=bfdb46d8-0ab9-4f91-af70-05b63804efe6, port_security_enabled=False, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2085, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:08:56Z on network bfdb46d8-0ab9-4f91-af70-05b63804efe6
Dec 02 10:09:00 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:00.327 2 INFO neutron.agent.securitygroups_rpc [None req-87a4be36-2a80-4d5d-bf3f-f65722b03fc3 b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:09:00 np0005541913.localdomain dnsmasq[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/addn_hosts - 1 addresses
Dec 02 10:09:00 np0005541913.localdomain podman[320845]: 2025-12-02 10:09:00.491751294 +0000 UTC m=+0.068461205 container kill 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:00 np0005541913.localdomain dnsmasq-dhcp[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/host
Dec 02 10:09:00 np0005541913.localdomain dnsmasq-dhcp[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/opts
Dec 02 10:09:00 np0005541913.localdomain ceph-mon[298296]: pgmap v331: 177 pgs: 177 active+clean; 146 MiB data, 779 MiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 4.6 KiB/s wr, 1 op/s
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.592 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:09:00 np0005541913.localdomain podman[320874]: 2025-12-02 10:09:00.640338974 +0000 UTC m=+0.062287981 container kill de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:09:00 np0005541913.localdomain dnsmasq[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/addn_hosts - 1 addresses
Dec 02 10:09:00 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/host
Dec 02 10:09:00 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/opts
Dec 02 10:09:00 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:00.770 2 INFO neutron.agent.securitygroups_rpc [None req-a8153026-65f9-484e-923d-c2362124502e 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.781 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.782 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.783 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.784 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.808 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.808 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.809 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.809 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.810 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:09:00 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:00.825 263406 INFO neutron.agent.dhcp.agent [None req-53507337-aade-4a1a-b9fe-2f12810efa76 - - - - - -] DHCP configuration for ports {'38c2fef2-e6bd-4f3b-a3ea-c9d9c60fb17c'} is completed
Dec 02 10:09:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:00.994 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.059 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:09:01 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2476910833' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.337 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:09:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.418 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.418 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:09:01 np0005541913.localdomain systemd[1]: tmp-crun.HsriqY.mount: Deactivated successfully.
Dec 02 10:09:01 np0005541913.localdomain podman[320923]: 2025-12-02 10:09:01.440777835 +0000 UTC m=+0.078496013 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec 02 10:09:01 np0005541913.localdomain podman[320923]: 2025-12-02 10:09:01.457230073 +0000 UTC m=+0.094948241 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 10:09:01 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.654 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.658 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11236MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.658 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.659 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.735 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.736 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.736 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:09:01 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:01.840 2 INFO neutron.agent.securitygroups_rpc [None req-a5be6bf0-ce48-4e65-99a4-3416f730b3a2 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.909 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.939 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.939 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.955 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 10:09:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:01.979 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_ABM,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 10:09:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:02.036 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:09:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9dcfd3db-4efe-40f2-a49d-5d58c6cc71e6", "snap_name": "9664f206-7cb4-4a1a-b619-2a201c7ebe10_4a876e35-722e-46c7-9bcc-640ed22bd047", "force": true, "format": "json"}]: dispatch
Dec 02 10:09:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9dcfd3db-4efe-40f2-a49d-5d58c6cc71e6", "snap_name": "9664f206-7cb4-4a1a-b619-2a201c7ebe10", "force": true, "format": "json"}]: dispatch
Dec 02 10:09:02 np0005541913.localdomain ceph-mon[298296]: pgmap v332: 177 pgs: 177 active+clean; 146 MiB data, 779 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 2.6 KiB/s wr, 0 op/s
Dec 02 10:09:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2476910833' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:02 np0005541913.localdomain dnsmasq[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/addn_hosts - 0 addresses
Dec 02 10:09:02 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/host
Dec 02 10:09:02 np0005541913.localdomain dnsmasq-dhcp[319699]: read /var/lib/neutron/dhcp/8a8e4389-c9b3-4713-b533-7861fccbcf32/opts
Dec 02 10:09:02 np0005541913.localdomain podman[320970]: 2025-12-02 10:09:02.274078002 +0000 UTC m=+0.067592022 container kill de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:09:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:09:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2652316114' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:02.618 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.582s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:09:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:02.627 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:09:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:02.647 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:09:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:02.650 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:09:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:02.651 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:09:02 np0005541913.localdomain dnsmasq[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/addn_hosts - 0 addresses
Dec 02 10:09:02 np0005541913.localdomain dnsmasq-dhcp[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/host
Dec 02 10:09:02 np0005541913.localdomain podman[321020]: 2025-12-02 10:09:02.760393761 +0000 UTC m=+0.048942884 container kill 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 02 10:09:02 np0005541913.localdomain dnsmasq-dhcp[320209]: read /var/lib/neutron/dhcp/bfdb46d8-0ab9-4f91-af70-05b63804efe6/opts
Dec 02 10:09:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:02Z|00344|binding|INFO|Releasing lport 8fbb99e9-2ad3-4260-a17b-f7524696dad5 from this chassis (sb_readonly=0)
Dec 02 10:09:02 np0005541913.localdomain kernel: device tap8fbb99e9-2a left promiscuous mode
Dec 02 10:09:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:02Z|00345|binding|INFO|Setting lport 8fbb99e9-2ad3-4260-a17b-f7524696dad5 down in Southbound
Dec 02 10:09:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:02.951 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:02.964 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-8a8e4389-c9b3-4713-b533-7861fccbcf32', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a8e4389-c9b3-4713-b533-7861fccbcf32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50b20ebe68c9494a933fabe997d62528', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc4549c7-0142-4249-a0f1-78307f272ad4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=8fbb99e9-2ad3-4260-a17b-f7524696dad5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:02.965 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 8fbb99e9-2ad3-4260-a17b-f7524696dad5 in datapath 8a8e4389-c9b3-4713-b533-7861fccbcf32 unbound from our chassis
Dec 02 10:09:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:02.968 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8a8e4389-c9b3-4713-b533-7861fccbcf32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:02.969 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[01641f36-ca34-4786-b532-ceaedd228637]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:02.979 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.052 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.053 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.054 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:09:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:03Z|00346|binding|INFO|Releasing lport 71712210-e77f-42b0-bbd5-d3267949cb4f from this chassis (sb_readonly=0)
Dec 02 10:09:03 np0005541913.localdomain kernel: device tap71712210-e7 left promiscuous mode
Dec 02 10:09:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:03Z|00347|binding|INFO|Setting lport 71712210-e77f-42b0-bbd5-d3267949cb4f down in Southbound
Dec 02 10:09:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:03.254 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.262 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-bfdb46d8-0ab9-4f91-af70-05b63804efe6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bfdb46d8-0ab9-4f91-af70-05b63804efe6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7b847f-2e87-42d4-a87d-a72dff8a08d3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=71712210-e77f-42b0-bbd5-d3267949cb4f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.265 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 71712210-e77f-42b0-bbd5-d3267949cb4f in datapath bfdb46d8-0ab9-4f91-af70-05b63804efe6 unbound from our chassis
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.268 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bfdb46d8-0ab9-4f91-af70-05b63804efe6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:03.270 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.269 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[389b220f-2e5f-4732-9dfa-a281254f5bd0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2652316114' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:03 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:03.315 2 INFO neutron.agent.securitygroups_rpc [None req-11b93b80-ea7f-4df4-8312-05f04742e794 74c5eb8a019a4e62a5eaf3b3d37efc2b 013c3f934ab54b1a83f18d3dcf154dd0 - - default default] Security group member updated ['b78815c8-0800-4df2-8d06-dc1b5176ba24']
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.411 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 2001:db8::f816:3eff:fee6:1993'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 10.100.0.2 2001:db8::f816:3eff:fee6:1993'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.414 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.417 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:03.418 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[22f0c3b1-ece1-4a72-a75d-832d0b0d7c87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:09:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:09:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:09:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:09:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:09:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:09:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:09:04 np0005541913.localdomain ceph-mon[298296]: pgmap v333: 177 pgs: 177 active+clean; 146 MiB data, 779 MiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 7.9 KiB/s wr, 2 op/s
Dec 02 10:09:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:04.796 263406 INFO neutron.agent.linux.ip_lib [None req-f9d77031-8ceb-4333-912e-047cf12142b3 - - - - - -] Device tap0ada2217-2a cannot be used as it has no MAC address
Dec 02 10:09:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:04.868 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:04 np0005541913.localdomain kernel: device tap0ada2217-2a entered promiscuous mode
Dec 02 10:09:04 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670144.8758] manager: (tap0ada2217-2a): new Generic device (/org/freedesktop/NetworkManager/Devices/56)
Dec 02 10:09:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:04Z|00348|binding|INFO|Claiming lport 0ada2217-2a5d-48ec-b3e1-f4be95cae804 for this chassis.
Dec 02 10:09:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:04Z|00349|binding|INFO|0ada2217-2a5d-48ec-b3e1-f4be95cae804: Claiming unknown
Dec 02 10:09:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:04.876 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:04 np0005541913.localdomain systemd-udevd[321054]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:04.885 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe8a:ca01/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0ada2217-2a5d-48ec-b3e1-f4be95cae804) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:04.887 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0ada2217-2a5d-48ec-b3e1-f4be95cae804 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:04Z|00350|binding|INFO|Setting lport 0ada2217-2a5d-48ec-b3e1-f4be95cae804 ovn-installed in OVS
Dec 02 10:09:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:04Z|00351|binding|INFO|Setting lport 0ada2217-2a5d-48ec-b3e1-f4be95cae804 up in Southbound
Dec 02 10:09:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:04.890 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:04.889 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7de86c17-dca9-4795-a188-896ecb54fd0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:04.889 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:04 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:04.890 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[3374b84e-9e6c-4155-b6eb-2b066324d524]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:04.893 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:04.914 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:04.951 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:04.975 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9dcfd3db-4efe-40f2-a49d-5d58c6cc71e6", "format": "json"}]: dispatch
Dec 02 10:09:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9dcfd3db-4efe-40f2-a49d-5d58c6cc71e6", "force": true, "format": "json"}]: dispatch
Dec 02 10:09:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/251576263' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:09:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/251576263' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:09:05 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:05.573 2 INFO neutron.agent.securitygroups_rpc [None req-d150629a-bcce-4a38-b00c-70964b564cd8 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:05 np0005541913.localdomain podman[321109]: 
Dec 02 10:09:05 np0005541913.localdomain podman[321109]: 2025-12-02 10:09:05.75590582 +0000 UTC m=+0.093010410 container create 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:09:05 np0005541913.localdomain systemd[1]: Started libpod-conmon-4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1.scope.
Dec 02 10:09:05 np0005541913.localdomain podman[321109]: 2025-12-02 10:09:05.710432858 +0000 UTC m=+0.047537498 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:05 np0005541913.localdomain systemd[1]: tmp-crun.ldCNSH.mount: Deactivated successfully.
Dec 02 10:09:05 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:05 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33de825e2a2dd72ee39ef97e7e351a6db15d33a2072c70233b921fc2059cfb73/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:05 np0005541913.localdomain podman[321109]: 2025-12-02 10:09:05.839284221 +0000 UTC m=+0.176388811 container init 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:09:05 np0005541913.localdomain podman[321109]: 2025-12-02 10:09:05.849405291 +0000 UTC m=+0.186509881 container start 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:09:05 np0005541913.localdomain dnsmasq[321128]: started, version 2.85 cachesize 150
Dec 02 10:09:05 np0005541913.localdomain dnsmasq[321128]: DNS service limited to local subnets
Dec 02 10:09:05 np0005541913.localdomain dnsmasq[321128]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:05 np0005541913.localdomain dnsmasq[321128]: warning: no upstream servers configured
Dec 02 10:09:05 np0005541913.localdomain dnsmasq[321128]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:05 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:05.917 263406 INFO neutron.agent.dhcp.agent [None req-f9d77031-8ceb-4333-912e-047cf12142b3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:05Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088d4280>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088d4820>], id=61687ac2-e5f9-4379-b916-cbc20f7dcee8, ip_allocation=immediate, mac_address=fa:16:3e:03:4b:72, name=tempest-NetworksTestDHCPv6-1045327991, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=38, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['0277418c-2cf9-4a26-87d2-695322913a68'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:01Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2116, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:05Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:09:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:05.997 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:06.021 263406 INFO neutron.agent.dhcp.agent [None req-bccb8f61-58ed-46ff-9d03-86ac2dc2c06c - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:09:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:09:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:09:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:06.060 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:09:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159650 "" "Go-http-client/1.1"
Dec 02 10:09:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:09:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20179 "" "Go-http-client/1.1"
Dec 02 10:09:06 np0005541913.localdomain dnsmasq[321128]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:09:06 np0005541913.localdomain podman[321148]: 2025-12-02 10:09:06.155875358 +0000 UTC m=+0.119681739 container kill 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:09:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:06.375 263406 INFO neutron.agent.dhcp.agent [None req-78a43ff1-e309-4f20-af31-b4cb911dcd32 - - - - - -] DHCP configuration for ports {'61687ac2-e5f9-4379-b916-cbc20f7dcee8'} is completed
Dec 02 10:09:06 np0005541913.localdomain ceph-mon[298296]: pgmap v334: 177 pgs: 177 active+clean; 146 MiB data, 779 MiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 7.2 KiB/s wr, 1 op/s
Dec 02 10:09:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/4130488056' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:09:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:09:06 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:06.685 2 INFO neutron.agent.securitygroups_rpc [None req-eff2d2a9-9509-4ec3-933e-196163edb064 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:06 np0005541913.localdomain podman[321168]: 2025-12-02 10:09:06.691430801 +0000 UTC m=+0.082530331 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:09:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:06.696 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:06.697 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:06.697 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:06.697 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:06 np0005541913.localdomain podman[321168]: 2025-12-02 10:09:06.701327324 +0000 UTC m=+0.092426904 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:09:06 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:09:06 np0005541913.localdomain systemd[1]: tmp-crun.Q645Sp.mount: Deactivated successfully.
Dec 02 10:09:06 np0005541913.localdomain podman[321169]: 2025-12-02 10:09:06.806810466 +0000 UTC m=+0.194889176 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 10:09:06 np0005541913.localdomain podman[321169]: 2025-12-02 10:09:06.852236166 +0000 UTC m=+0.240314936 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:09:06 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain dnsmasq[321128]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:07 np0005541913.localdomain podman[321247]: 2025-12-02 10:09:07.017712146 +0000 UTC m=+0.072269778 container kill 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:07 np0005541913.localdomain dnsmasq[319699]: exiting on receipt of SIGTERM
Dec 02 10:09:07 np0005541913.localdomain podman[321259]: 2025-12-02 10:09:07.062101209 +0000 UTC m=+0.067563282 container kill de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: libpod-de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a.scope: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain podman[321275]: 2025-12-02 10:09:07.143741164 +0000 UTC m=+0.064784147 container died de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:09:07 np0005541913.localdomain podman[321275]: 2025-12-02 10:09:07.175083819 +0000 UTC m=+0.096126752 container cleanup de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: libpod-conmon-de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a.scope: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain podman[321278]: 2025-12-02 10:09:07.22387264 +0000 UTC m=+0.133844948 container remove de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a8e4389-c9b3-4713-b533-7861fccbcf32, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:09:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e154 e154: 6 total, 6 up, 6 in
Dec 02 10:09:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1335571893' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/499415689' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:07.608 263406 INFO neutron.agent.dhcp.agent [None req-06f5a695-d526-4455-b095-d150da54f892 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:07.610 263406 INFO neutron.agent.dhcp.agent [None req-06f5a695-d526-4455-b095-d150da54f892 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: tmp-crun.fNHQi2.mount: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-331fe4c85053fe605b3d7d394a4a65cbed0aeb4f2e3994530c0c6c0a05693b91-merged.mount: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de55849673dfacd1af528438a53ae08fd1579c7901e15ab043688d7244446c4a-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d8a8e4389\x2dc9b3\x2d4713\x2db533\x2d7861fccbcf32.mount: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: tmp-crun.uEbLrP.mount: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain dnsmasq[321128]: exiting on receipt of SIGTERM
Dec 02 10:09:07 np0005541913.localdomain podman[321339]: 2025-12-02 10:09:07.942037918 +0000 UTC m=+0.081052501 container kill 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: libpod-4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1.scope: Deactivated successfully.
Dec 02 10:09:07 np0005541913.localdomain dnsmasq[320209]: exiting on receipt of SIGTERM
Dec 02 10:09:07 np0005541913.localdomain podman[321355]: 2025-12-02 10:09:07.988226348 +0000 UTC m=+0.080881935 container kill 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:07 np0005541913.localdomain systemd[1]: libpod-5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4.scope: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain podman[321366]: 2025-12-02 10:09:08.003066094 +0000 UTC m=+0.052037418 container died 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:08 np0005541913.localdomain podman[321366]: 2025-12-02 10:09:08.02880309 +0000 UTC m=+0.077774374 container cleanup 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: libpod-conmon-4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1.scope: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain podman[321394]: 2025-12-02 10:09:08.047764665 +0000 UTC m=+0.046580653 container died 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:09:08 np0005541913.localdomain podman[321394]: 2025-12-02 10:09:08.078825123 +0000 UTC m=+0.077641060 container cleanup 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: libpod-conmon-5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4.scope: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain podman[321395]: 2025-12-02 10:09:08.118041848 +0000 UTC m=+0.106947591 container remove 5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bfdb46d8-0ab9-4f91-af70-05b63804efe6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:08.143 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:08 np0005541913.localdomain podman[321373]: 2025-12-02 10:09:08.168828732 +0000 UTC m=+0.208974171 container remove 4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:09:08 np0005541913.localdomain kernel: device tap0ada2217-2a left promiscuous mode
Dec 02 10:09:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:08Z|00352|binding|INFO|Releasing lport 0ada2217-2a5d-48ec-b3e1-f4be95cae804 from this chassis (sb_readonly=0)
Dec 02 10:09:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:08Z|00353|binding|INFO|Setting lport 0ada2217-2a5d-48ec-b3e1-f4be95cae804 down in Southbound
Dec 02 10:09:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:08.179 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:08.194 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:08.205 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe8a:ca01/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0ada2217-2a5d-48ec-b3e1-f4be95cae804) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:08.208 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0ada2217-2a5d-48ec-b3e1-f4be95cae804 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:09:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:08.210 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:08.211 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7fe0da-4da0-4540-be49-dda083b82171]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:08.532 263406 INFO neutron.agent.dhcp.agent [None req-df363115-4f09-4027-892e-51432bc04e70 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:08 np0005541913.localdomain ceph-mon[298296]: pgmap v335: 177 pgs: 177 active+clean; 146 MiB data, 779 MiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 7.2 KiB/s wr, 1 op/s
Dec 02 10:09:08 np0005541913.localdomain ceph-mon[298296]: osdmap e154: 6 total, 6 up, 6 in
Dec 02 10:09:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3559245647' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:08.611 263406 INFO neutron.agent.dhcp.agent [None req-4ddd1669-849c-45dd-8c52-4ff6be9ca7d8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: tmp-crun.eTW9y4.mount: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-33de825e2a2dd72ee39ef97e7e351a6db15d33a2072c70233b921fc2059cfb73-merged.mount: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4e7fd5096d7458dce095a73f3287888eeeb8d0c8b269f446beec866463c3edc1-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-155e384317babd96cf2cd1a89ffbaad899eea093c053e3a7d2f3cc36440b4ff7-merged.mount: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5632184965fd7091ff7705ef40f369861a07fa387c60aba99123c60c1d5aa8e4-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dbfdb46d8\x2d0ab9\x2d4f91\x2daf70\x2d05b63804efe6.mount: Deactivated successfully.
Dec 02 10:09:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:08Z|00354|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:09:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:08.962 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:10 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:10.017 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:10 np0005541913.localdomain ceph-mon[298296]: pgmap v337: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 14 KiB/s wr, 4 op/s
Dec 02 10:09:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:10.999 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:11.062 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:11 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:11.262 2 INFO neutron.agent.securitygroups_rpc [None req-d69c9fb8-aada-452c-807d-ffbf23ad4dde b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:09:11 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:11.292 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:11 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:11.330 2 INFO neutron.agent.securitygroups_rpc [None req-158ba6e6-ae47-4633-afb7-8fe1fff090db 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:11 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:11.528 263406 INFO neutron.agent.linux.ip_lib [None req-9b4a9558-8b26-4cd9-bc17-e585f6ba3834 - - - - - -] Device tapb4439fe1-b6 cannot be used as it has no MAC address
Dec 02 10:09:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:11.559 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:11 np0005541913.localdomain kernel: device tapb4439fe1-b6 entered promiscuous mode
Dec 02 10:09:11 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670151.5687] manager: (tapb4439fe1-b6): new Generic device (/org/freedesktop/NetworkManager/Devices/57)
Dec 02 10:09:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:11Z|00355|binding|INFO|Claiming lport b4439fe1-b6e1-4982-a031-265c40bf42ca for this chassis.
Dec 02 10:09:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:11.569 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:11Z|00356|binding|INFO|b4439fe1-b6e1-4982-a031-265c40bf42ca: Claiming unknown
Dec 02 10:09:11 np0005541913.localdomain systemd-udevd[321439]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:11.583 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb8:8fc3/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=b4439fe1-b6e1-4982-a031-265c40bf42ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:11.586 160221 INFO neutron.agent.ovn.metadata.agent [-] Port b4439fe1-b6e1-4982-a031-265c40bf42ca in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:11.589 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port c0c7af0d-4fcb-4556-861c-4af2f37ea5e4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:11.590 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:11.592 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[82949cf7-1133-4bfd-b12c-4572a073dd90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4439fe1-b6: No such device
Dec 02 10:09:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:11.603 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4439fe1-b6: No such device
Dec 02 10:09:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:11Z|00357|binding|INFO|Setting lport b4439fe1-b6e1-4982-a031-265c40bf42ca ovn-installed in OVS
Dec 02 10:09:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:11Z|00358|binding|INFO|Setting lport b4439fe1-b6e1-4982-a031-265c40bf42ca up in Southbound
Dec 02 10:09:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:11.611 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4439fe1-b6: No such device
Dec 02 10:09:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4439fe1-b6: No such device
Dec 02 10:09:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4439fe1-b6: No such device
Dec 02 10:09:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4439fe1-b6: No such device
Dec 02 10:09:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4439fe1-b6: No such device
Dec 02 10:09:11 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb4439fe1-b6: No such device
Dec 02 10:09:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:11.642 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:11.667 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:12 np0005541913.localdomain ceph-mon[298296]: pgmap v338: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 14 KiB/s wr, 4 op/s
Dec 02 10:09:12 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:12.331 2 INFO neutron.agent.securitygroups_rpc [None req-9fd8a609-568a-4b57-8025-f518255ff815 b9c801fe16fd46b78d8c4d5c23cd99c7 50b20ebe68c9494a933fabe997d62528 - - default default] Security group member updated ['0990385a-b99f-41bd-8d17-8e7fb5ec4794']
Dec 02 10:09:12 np0005541913.localdomain podman[321510]: 2025-12-02 10:09:12.43238391 +0000 UTC m=+0.087847482 container create 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:09:12 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:12.452 2 INFO neutron.agent.securitygroups_rpc [None req-4d3ff4f1-7788-4535-9205-e4647a2c3ad1 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:12.457 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:12 np0005541913.localdomain systemd[1]: Started libpod-conmon-39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63.scope.
Dec 02 10:09:12 np0005541913.localdomain podman[321510]: 2025-12-02 10:09:12.389733844 +0000 UTC m=+0.045197466 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:12 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/566de893ebe6e92e509fb823e244732192be440f9e7f7b105f2bd6bfda91f749/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:12 np0005541913.localdomain podman[321510]: 2025-12-02 10:09:12.508790726 +0000 UTC m=+0.164254268 container init 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:12 np0005541913.localdomain podman[321510]: 2025-12-02 10:09:12.520297133 +0000 UTC m=+0.175760695 container start 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:09:12 np0005541913.localdomain dnsmasq[321528]: started, version 2.85 cachesize 150
Dec 02 10:09:12 np0005541913.localdomain dnsmasq[321528]: DNS service limited to local subnets
Dec 02 10:09:12 np0005541913.localdomain dnsmasq[321528]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:12 np0005541913.localdomain dnsmasq[321528]: warning: no upstream servers configured
Dec 02 10:09:12 np0005541913.localdomain dnsmasq-dhcp[321528]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:09:12 np0005541913.localdomain dnsmasq[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:12 np0005541913.localdomain dnsmasq-dhcp[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:12 np0005541913.localdomain dnsmasq-dhcp[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:12.587 263406 INFO neutron.agent.dhcp.agent [None req-9b4a9558-8b26-4cd9-bc17-e585f6ba3834 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:10Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908858850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908925460>], id=76e0cc3c-a1f7-44b3-8218-4e157a8dda23, ip_allocation=immediate, mac_address=fa:16:3e:3c:a2:56, name=tempest-NetworksTestDHCPv6-314118676, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['3bc4be47-66b2-4149-ab7b-9ea605321c8c'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:08Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2134, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:11Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:09:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:12.657 263406 INFO neutron.agent.dhcp.agent [None req-37f788d9-fbc4-4b54-9317-d5e82251a062 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:09:12 np0005541913.localdomain dnsmasq[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:09:12 np0005541913.localdomain dnsmasq-dhcp[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:12 np0005541913.localdomain dnsmasq-dhcp[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:12 np0005541913.localdomain podman[321545]: 2025-12-02 10:09:12.785444209 +0000 UTC m=+0.064437238 container kill 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:09:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:12.977 263406 INFO neutron.agent.dhcp.agent [None req-2b7b6b4e-b5a9-4583-9e28-2e4db20559bd - - - - - -] DHCP configuration for ports {'76e0cc3c-a1f7-44b3-8218-4e157a8dda23'} is completed
Dec 02 10:09:13 np0005541913.localdomain dnsmasq[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:13 np0005541913.localdomain dnsmasq-dhcp[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:13 np0005541913.localdomain dnsmasq-dhcp[321528]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:13 np0005541913.localdomain podman[321584]: 2025-12-02 10:09:13.109279048 +0000 UTC m=+0.053219719 container kill 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:14.134 263406 INFO neutron.agent.linux.ip_lib [None req-d77cc7dd-36e8-4708-a6e9-086c5a8a2bde - - - - - -] Device tapcbda893e-6a cannot be used as it has no MAC address
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.164 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain kernel: device tapcbda893e-6a entered promiscuous mode
Dec 02 10:09:14 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670154.1707] manager: (tapcbda893e-6a): new Generic device (/org/freedesktop/NetworkManager/Devices/58)
Dec 02 10:09:14 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:14Z|00359|binding|INFO|Claiming lport cbda893e-6a95-4f21-b53a-4734c24663e0 for this chassis.
Dec 02 10:09:14 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:14Z|00360|binding|INFO|cbda893e-6a95-4f21-b53a-4734c24663e0: Claiming unknown
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.171 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:14.182 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-e13ed0b0-82be-499b-b8af-a15d85a02df9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e13ed0b0-82be-499b-b8af-a15d85a02df9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e53e7092-6e4c-49fc-9858-fc71f27a93fb, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=cbda893e-6a95-4f21-b53a-4734c24663e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:14.184 160221 INFO neutron.agent.ovn.metadata.agent [-] Port cbda893e-6a95-4f21-b53a-4734c24663e0 in datapath e13ed0b0-82be-499b-b8af-a15d85a02df9 bound to our chassis
Dec 02 10:09:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:14.185 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e13ed0b0-82be-499b-b8af-a15d85a02df9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.190 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:14.191 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[088c9c1c-3bd8-4158-bd45-e37ab15bfee2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:14 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:14Z|00361|binding|INFO|Setting lport cbda893e-6a95-4f21-b53a-4734c24663e0 ovn-installed in OVS
Dec 02 10:09:14 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:14Z|00362|binding|INFO|Setting lport cbda893e-6a95-4f21-b53a-4734c24663e0 up in Southbound
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.194 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.208 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain systemd[1]: tmp-crun.zHZ4qY.mount: Deactivated successfully.
Dec 02 10:09:14 np0005541913.localdomain podman[321628]: 2025-12-02 10:09:14.224341605 +0000 UTC m=+0.080469246 container kill 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:09:14 np0005541913.localdomain dnsmasq[321528]: exiting on receipt of SIGTERM
Dec 02 10:09:14 np0005541913.localdomain systemd[1]: libpod-39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63.scope: Deactivated successfully.
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.249 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.272 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain ceph-mon[298296]: pgmap v339: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 8.9 KiB/s wr, 5 op/s
Dec 02 10:09:14 np0005541913.localdomain podman[321647]: 2025-12-02 10:09:14.28311604 +0000 UTC m=+0.049213032 container died 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:09:14 np0005541913.localdomain podman[321647]: 2025-12-02 10:09:14.307212923 +0000 UTC m=+0.073309895 container cleanup 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:09:14 np0005541913.localdomain systemd[1]: libpod-conmon-39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63.scope: Deactivated successfully.
Dec 02 10:09:14 np0005541913.localdomain podman[321655]: 2025-12-02 10:09:14.382028807 +0000 UTC m=+0.128062924 container remove 39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:09:14 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:14Z|00363|binding|INFO|Releasing lport b4439fe1-b6e1-4982-a031-265c40bf42ca from this chassis (sb_readonly=0)
Dec 02 10:09:14 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:14Z|00364|binding|INFO|Setting lport b4439fe1-b6e1-4982-a031-265c40bf42ca down in Southbound
Dec 02 10:09:14 np0005541913.localdomain kernel: device tapb4439fe1-b6 left promiscuous mode
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.396 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:14.416 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb8:8fc3/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=b4439fe1-b6e1-4982-a031-265c40bf42ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:14.418 160221 INFO neutron.agent.ovn.metadata.agent [-] Port b4439fe1-b6e1-4982-a031-265c40bf42ca in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:09:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:14.420 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:14.421 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:14 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:14.422 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb3318b-366a-4096-bffe-caa4cdb772d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-566de893ebe6e92e509fb823e244732192be440f9e7f7b105f2bd6bfda91f749-merged.mount: Deactivated successfully.
Dec 02 10:09:14 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39cf601d7c30ccaa89aa40f12098b88e5a3cf6674de0305e23e1be3bfba25c63-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:14 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:09:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:14.865 263406 INFO neutron.agent.dhcp.agent [None req-ef039e85-1a3a-4527-a735-198fd421120b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:15 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:15.186 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:15 np0005541913.localdomain podman[321726]: 
Dec 02 10:09:15 np0005541913.localdomain podman[321726]: 2025-12-02 10:09:15.290086675 +0000 UTC m=+0.101170217 container create c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:15 np0005541913.localdomain systemd[1]: Started libpod-conmon-c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2.scope.
Dec 02 10:09:15 np0005541913.localdomain podman[321726]: 2025-12-02 10:09:15.237291468 +0000 UTC m=+0.048375020 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:15 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:15 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/061e35e8e2271744f22f5b8b10d46d8b4670aa5cc18d5b13debaf7e3b4be708c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:15 np0005541913.localdomain podman[321726]: 2025-12-02 10:09:15.364173619 +0000 UTC m=+0.175257161 container init c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:15 np0005541913.localdomain podman[321726]: 2025-12-02 10:09:15.373494008 +0000 UTC m=+0.184577550 container start c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:15 np0005541913.localdomain dnsmasq[321744]: started, version 2.85 cachesize 150
Dec 02 10:09:15 np0005541913.localdomain dnsmasq[321744]: DNS service limited to local subnets
Dec 02 10:09:15 np0005541913.localdomain dnsmasq[321744]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:15 np0005541913.localdomain dnsmasq[321744]: warning: no upstream servers configured
Dec 02 10:09:15 np0005541913.localdomain dnsmasq-dhcp[321744]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:09:15 np0005541913.localdomain dnsmasq[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/addn_hosts - 0 addresses
Dec 02 10:09:15 np0005541913.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/host
Dec 02 10:09:15 np0005541913.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/opts
Dec 02 10:09:15 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:15.568 263406 INFO neutron.agent.dhcp.agent [None req-c0a58fc7-ea7b-4d18-97de-d664fbcf8601 - - - - - -] DHCP configuration for ports {'5ef446fe-4b56-442f-96a4-11d2e9927b0a'} is completed
Dec 02 10:09:15 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:15.918 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:16.002 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:16.064 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 e155: 6 total, 6 up, 6 in
Dec 02 10:09:16 np0005541913.localdomain ceph-mon[298296]: pgmap v340: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 8.9 KiB/s wr, 5 op/s
Dec 02 10:09:16 np0005541913.localdomain ceph-mon[298296]: osdmap e155: 6 total, 6 up, 6 in
Dec 02 10:09:18 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:18.014 2 INFO neutron.agent.securitygroups_rpc [None req-1b7dd085-a5c1-4a81-bd02-4cabc7845a6f f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:18 np0005541913.localdomain ceph-mon[298296]: pgmap v342: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 1.5 KiB/s rd, 9.4 KiB/s wr, 5 op/s
Dec 02 10:09:18 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:09:18 np0005541913.localdomain podman[321745]: 2025-12-02 10:09:18.452697017 +0000 UTC m=+0.089940968 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 02 10:09:18 np0005541913.localdomain podman[321745]: 2025-12-02 10:09:18.494395108 +0000 UTC m=+0.131639069 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:09:18 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:09:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:09:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2213891855' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:09:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:09:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2213891855' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:09:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4204336087' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:09:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4204336087' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:09:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2213891855' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:09:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2213891855' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:09:19 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:19.424 2 INFO neutron.agent.securitygroups_rpc [None req-278e88f3-562a-4f7b-8f10-c3e2bfd4ee2e 8b49e5c866794aad866d55bb5f154d67 7dffef2e74844a7ebb6ee68826fb7e57 - - default default] Security group member updated ['32471057-4d02-424a-9e3e-19629ab1677d']
Dec 02 10:09:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:19.478 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:19Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908813c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088139d0>], id=ff9379c5-c4de-4f4c-9009-a3c2753f59eb, ip_allocation=immediate, mac_address=fa:16:3e:85:d4:94, name=tempest-RoutersTest-290673381, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:09:11Z, description=, dns_domain=, id=e13ed0b0-82be-499b-b8af-a15d85a02df9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-286923336, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11794, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2135, status=ACTIVE, subnets=['b8a39615-c67f-42f5-884d-1d6d18d7847a'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:12Z, vlan_transparent=None, network_id=e13ed0b0-82be-499b-b8af-a15d85a02df9, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['32471057-4d02-424a-9e3e-19629ab1677d'], standard_attr_id=2179, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:19Z on network e13ed0b0-82be-499b-b8af-a15d85a02df9
Dec 02 10:09:19 np0005541913.localdomain dnsmasq[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/addn_hosts - 1 addresses
Dec 02 10:09:19 np0005541913.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/host
Dec 02 10:09:19 np0005541913.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/opts
Dec 02 10:09:19 np0005541913.localdomain podman[321782]: 2025-12-02 10:09:19.720712209 +0000 UTC m=+0.055678325 container kill c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:19 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:19.740 2 INFO neutron.agent.securitygroups_rpc [None req-e9fc3440-8683-40fd-946b-446e84f960a4 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:19.770 263406 INFO neutron.agent.linux.ip_lib [None req-cf694e92-ab83-4ed8-bd35-b32a57da5eb8 - - - - - -] Device tap324c0357-f6 cannot be used as it has no MAC address
Dec 02 10:09:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:19.804 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:19 np0005541913.localdomain kernel: device tap324c0357-f6 entered promiscuous mode
Dec 02 10:09:19 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670159.8119] manager: (tap324c0357-f6): new Generic device (/org/freedesktop/NetworkManager/Devices/59)
Dec 02 10:09:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:19Z|00365|binding|INFO|Claiming lport 324c0357-f680-4288-a358-1a1dfc9002b3 for this chassis.
Dec 02 10:09:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:19Z|00366|binding|INFO|324c0357-f680-4288-a358-1a1dfc9002b3: Claiming unknown
Dec 02 10:09:19 np0005541913.localdomain systemd-udevd[321807]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:19.812 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:19Z|00367|binding|INFO|Setting lport 324c0357-f680-4288-a358-1a1dfc9002b3 up in Southbound
Dec 02 10:09:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:19Z|00368|binding|INFO|Setting lport 324c0357-f680-4288-a358-1a1dfc9002b3 ovn-installed in OVS
Dec 02 10:09:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:19.822 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:19.825 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:19.821 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe70:68f5/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=324c0357-f680-4288-a358-1a1dfc9002b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:19.823 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 324c0357-f680-4288-a358-1a1dfc9002b3 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:19.826 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port ae9d2580-403f-4ce2-a075-d2b7d708275b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:19.826 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:19.827 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[617969b8-b118-4ec5-b423-0ef5a6f9dcb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:19 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap324c0357-f6: No such device
Dec 02 10:09:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:19.848 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:19 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap324c0357-f6: No such device
Dec 02 10:09:19 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap324c0357-f6: No such device
Dec 02 10:09:19 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap324c0357-f6: No such device
Dec 02 10:09:19 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap324c0357-f6: No such device
Dec 02 10:09:19 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap324c0357-f6: No such device
Dec 02 10:09:19 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap324c0357-f6: No such device
Dec 02 10:09:19 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap324c0357-f6: No such device
Dec 02 10:09:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:19.908 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:19.936 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:20 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:20.044 263406 INFO neutron.agent.dhcp.agent [None req-e85393e9-c731-4b60-8094-ea395ce72bb0 - - - - - -] DHCP configuration for ports {'ff9379c5-c4de-4f4c-9009-a3c2753f59eb'} is completed
Dec 02 10:09:20 np0005541913.localdomain ceph-mon[298296]: pgmap v343: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 2.6 KiB/s wr, 37 op/s
Dec 02 10:09:20 np0005541913.localdomain podman[321883]: 
Dec 02 10:09:20 np0005541913.localdomain podman[321883]: 2025-12-02 10:09:20.76656963 +0000 UTC m=+0.081732929 container create 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:09:20 np0005541913.localdomain systemd[1]: Started libpod-conmon-367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d.scope.
Dec 02 10:09:20 np0005541913.localdomain podman[321883]: 2025-12-02 10:09:20.724692744 +0000 UTC m=+0.039856083 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:20 np0005541913.localdomain systemd[1]: tmp-crun.e2I5B1.mount: Deactivated successfully.
Dec 02 10:09:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:09:20 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1785879792' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:09:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:09:20 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1785879792' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:09:20 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:20 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/283af53ad12a8290739b2dcc7eba4c6bcf76cb8c5e9c11277a0dcadc0c44c56f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:20 np0005541913.localdomain podman[321883]: 2025-12-02 10:09:20.865118246 +0000 UTC m=+0.180281525 container init 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:09:20 np0005541913.localdomain podman[321883]: 2025-12-02 10:09:20.871727882 +0000 UTC m=+0.186891161 container start 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:20 np0005541913.localdomain dnsmasq[321902]: started, version 2.85 cachesize 150
Dec 02 10:09:20 np0005541913.localdomain dnsmasq[321902]: DNS service limited to local subnets
Dec 02 10:09:20 np0005541913.localdomain dnsmasq[321902]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:20 np0005541913.localdomain dnsmasq[321902]: warning: no upstream servers configured
Dec 02 10:09:20 np0005541913.localdomain dnsmasq[321902]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:20 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:20.938 263406 INFO neutron.agent.dhcp.agent [None req-cf694e92-ab83-4ed8-bd35-b32a57da5eb8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:19Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089d0250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089d0430>], id=0e2b3ab0-808b-4d9a-adbe-b5e67180b17d, ip_allocation=immediate, mac_address=fa:16:3e:68:9f:3f, name=tempest-NetworksTestDHCPv6-645243242, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['19bc6d19-aeb3-4f9d-8660-0603a2e336bc'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:14Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2180, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:19Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:09:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:21.003 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:21.010 263406 INFO neutron.agent.dhcp.agent [None req-9d90c008-f4df-47a7-9d2e-c8ec48cc7e38 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:09:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:21.066 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:21 np0005541913.localdomain dnsmasq[321902]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:09:21 np0005541913.localdomain podman[321921]: 2025-12-02 10:09:21.140405952 +0000 UTC m=+0.058806258 container kill 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:09:21 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:21.178 2 INFO neutron.agent.securitygroups_rpc [None req-8371118d-5c83-45c5-bfa7-f542b4f1df3f 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:21.329 263406 INFO neutron.agent.dhcp.agent [None req-3f718340-16a8-4d1c-a551-b7c320347f95 - - - - - -] DHCP configuration for ports {'0e2b3ab0-808b-4d9a-adbe-b5e67180b17d'} is completed
Dec 02 10:09:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:09:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1785879792' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:09:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1785879792' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:09:21 np0005541913.localdomain dnsmasq[321902]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:21 np0005541913.localdomain podman[321968]: 2025-12-02 10:09:21.468835155 +0000 UTC m=+0.062640101 container kill 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:09:21 np0005541913.localdomain podman[321956]: 2025-12-02 10:09:21.456452375 +0000 UTC m=+0.086375303 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:21 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:21.511 2 INFO neutron.agent.securitygroups_rpc [None req-10f867de-2584-4ed7-a0e8-fb9276ac33a8 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:21 np0005541913.localdomain podman[321956]: 2025-12-02 10:09:21.540284089 +0000 UTC m=+0.170207047 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:09:21 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:09:21 np0005541913.localdomain dnsmasq[321902]: exiting on receipt of SIGTERM
Dec 02 10:09:21 np0005541913.localdomain podman[322014]: 2025-12-02 10:09:21.900902469 +0000 UTC m=+0.071774664 container kill 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:21 np0005541913.localdomain systemd[1]: tmp-crun.rmCV2Y.mount: Deactivated successfully.
Dec 02 10:09:21 np0005541913.localdomain systemd[1]: libpod-367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d.scope: Deactivated successfully.
Dec 02 10:09:21 np0005541913.localdomain podman[322028]: 2025-12-02 10:09:21.989825089 +0000 UTC m=+0.075085712 container died 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:22 np0005541913.localdomain podman[322028]: 2025-12-02 10:09:22.026484796 +0000 UTC m=+0.111745399 container cleanup 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 10:09:22 np0005541913.localdomain systemd[1]: libpod-conmon-367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d.scope: Deactivated successfully.
Dec 02 10:09:22 np0005541913.localdomain podman[322035]: 2025-12-02 10:09:22.070638862 +0000 UTC m=+0.136849517 container remove 367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:22 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:22Z|00369|binding|INFO|Releasing lport 324c0357-f680-4288-a358-1a1dfc9002b3 from this chassis (sb_readonly=0)
Dec 02 10:09:22 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:22Z|00370|binding|INFO|Setting lport 324c0357-f680-4288-a358-1a1dfc9002b3 down in Southbound
Dec 02 10:09:22 np0005541913.localdomain kernel: device tap324c0357-f6 left promiscuous mode
Dec 02 10:09:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:22.084 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:22.095 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe70:68f5/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=324c0357-f680-4288-a358-1a1dfc9002b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:22.097 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 324c0357-f680-4288-a358-1a1dfc9002b3 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:09:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:22.099 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:22.100 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[84026192-a26d-457f-a86e-767350efa6b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:22.104 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:22.358 263406 INFO neutron.agent.dhcp.agent [None req-96c827b3-f68e-4cf8-b99a-86434dedc63c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:22 np0005541913.localdomain ceph-mon[298296]: pgmap v344: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 2.6 KiB/s wr, 37 op/s
Dec 02 10:09:22 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/103747809' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:09:22 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/103747809' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:09:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-283af53ad12a8290739b2dcc7eba4c6bcf76cb8c5e9c11277a0dcadc0c44c56f-merged.mount: Deactivated successfully.
Dec 02 10:09:22 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-367e3bd9c0d8614352308ed6c9b0cbb4f6e0d65f91c4d6f155eee6b363e8052d-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:22 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:09:23 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:23.060 2 INFO neutron.agent.securitygroups_rpc [None req-7bc34f63-f96a-4396-b70c-07601d07dee2 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:09:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:09:23 np0005541913.localdomain podman[322058]: 2025-12-02 10:09:23.44997624 +0000 UTC m=+0.088971683 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:09:23 np0005541913.localdomain podman[322058]: 2025-12-02 10:09:23.456858913 +0000 UTC m=+0.095854366 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:09:23 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:09:23 np0005541913.localdomain podman[322057]: 2025-12-02 10:09:23.498273827 +0000 UTC m=+0.136748476 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 10:09:23 np0005541913.localdomain podman[322057]: 2025-12-02 10:09:23.511870549 +0000 UTC m=+0.150345248 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, release=1755695350, managed_by=edpm_ansible)
Dec 02 10:09:23 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:09:23 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:23.752 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:23 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:23.773 2 INFO neutron.agent.securitygroups_rpc [None req-ef3a9568-c379-4b2a-a06d-b347ad68d0c7 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:23 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:23.834 2 INFO neutron.agent.securitygroups_rpc [None req-f87c0fb9-70c4-4316-8fe5-2d1d482ef952 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:23 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:23.896 263406 INFO neutron.agent.linux.ip_lib [None req-3c460768-e00a-4238-817c-5ffda8c4d9c8 - - - - - -] Device tap317acccc-f5 cannot be used as it has no MAC address
Dec 02 10:09:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:23.956 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:23 np0005541913.localdomain kernel: device tap317acccc-f5 entered promiscuous mode
Dec 02 10:09:23 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670163.9624] manager: (tap317acccc-f5): new Generic device (/org/freedesktop/NetworkManager/Devices/60)
Dec 02 10:09:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:23.962 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:23 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:23Z|00371|binding|INFO|Claiming lport 317acccc-f5e4-452d-a107-edeeec553e45 for this chassis.
Dec 02 10:09:23 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:23Z|00372|binding|INFO|317acccc-f5e4-452d-a107-edeeec553e45: Claiming unknown
Dec 02 10:09:23 np0005541913.localdomain systemd-udevd[322108]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:23 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:23Z|00373|binding|INFO|Setting lport 317acccc-f5e4-452d-a107-edeeec553e45 ovn-installed in OVS
Dec 02 10:09:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:23.976 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:23 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:23Z|00374|binding|INFO|Setting lport 317acccc-f5e4-452d-a107-edeeec553e45 up in Southbound
Dec 02 10:09:23 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:23.986 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2d:2b7a/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=317acccc-f5e4-452d-a107-edeeec553e45) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap317acccc-f5: No such device
Dec 02 10:09:23 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:23.989 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 317acccc-f5e4-452d-a107-edeeec553e45 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:23 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:23.991 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 036480a6-e951-4d35-a3ac-a50bf3818be6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:23 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:23.991 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap317acccc-f5: No such device
Dec 02 10:09:23 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:23.993 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[1a422661-540f-4968-85ca-653ae219f99e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap317acccc-f5: No such device
Dec 02 10:09:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:23.998 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap317acccc-f5: No such device
Dec 02 10:09:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap317acccc-f5: No such device
Dec 02 10:09:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap317acccc-f5: No such device
Dec 02 10:09:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap317acccc-f5: No such device
Dec 02 10:09:24 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap317acccc-f5: No such device
Dec 02 10:09:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:24.042 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:24.070 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:24 np0005541913.localdomain ceph-mon[298296]: pgmap v345: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 2.0 KiB/s wr, 66 op/s
Dec 02 10:09:24 np0005541913.localdomain sudo[322151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:09:24 np0005541913.localdomain sudo[322151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:09:24 np0005541913.localdomain sudo[322151]: pam_unix(sudo:session): session closed for user root
Dec 02 10:09:24 np0005541913.localdomain sudo[322175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:09:24 np0005541913.localdomain sudo[322175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:09:24 np0005541913.localdomain podman[322215]: 
Dec 02 10:09:24 np0005541913.localdomain podman[322215]: 2025-12-02 10:09:24.882234177 +0000 UTC m=+0.099338617 container create a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:09:24 np0005541913.localdomain systemd[1]: Started libpod-conmon-a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4.scope.
Dec 02 10:09:24 np0005541913.localdomain podman[322215]: 2025-12-02 10:09:24.834589218 +0000 UTC m=+0.051693698 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:24 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:24 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0a3660f35d8be91ceba0dc28d4805873619e4df442f680ea55f00f5be81d6df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:24 np0005541913.localdomain podman[322215]: 2025-12-02 10:09:24.946965412 +0000 UTC m=+0.164069822 container init a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:09:24 np0005541913.localdomain podman[322215]: 2025-12-02 10:09:24.954218356 +0000 UTC m=+0.171322796 container start a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:24 np0005541913.localdomain dnsmasq[322247]: started, version 2.85 cachesize 150
Dec 02 10:09:24 np0005541913.localdomain dnsmasq[322247]: DNS service limited to local subnets
Dec 02 10:09:24 np0005541913.localdomain dnsmasq[322247]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:24 np0005541913.localdomain dnsmasq[322247]: warning: no upstream servers configured
Dec 02 10:09:24 np0005541913.localdomain dnsmasq-dhcp[322247]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:09:24 np0005541913.localdomain dnsmasq[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:24 np0005541913.localdomain dnsmasq-dhcp[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:24 np0005541913.localdomain dnsmasq-dhcp[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:24 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:24.984 2 INFO neutron.agent.securitygroups_rpc [None req-b3aa5b43-46a5-4652-aa07-2f62355aecf1 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:25.013 263406 INFO neutron.agent.dhcp.agent [None req-3c460768-e00a-4238-817c-5ffda8c4d9c8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:23Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a13d60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b58880>], id=f6ac3ef5-c653-4eda-b061-848a27409fb0, ip_allocation=immediate, mac_address=fa:16:3e:b8:be:3d, name=tempest-NetworksTestDHCPv6-1258162334, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=44, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['bec535fc-c757-4086-bb74-960704a071e4'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:21Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2201, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:23Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:09:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:25.096 263406 INFO neutron.agent.dhcp.agent [None req-8e1a68e1-ee7f-4c8e-9636-da2fc072c6b2 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:09:25 np0005541913.localdomain dnsmasq[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:09:25 np0005541913.localdomain dnsmasq-dhcp[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:25 np0005541913.localdomain dnsmasq-dhcp[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:25 np0005541913.localdomain podman[322272]: 2025-12-02 10:09:25.204728631 +0000 UTC m=+0.067025007 container kill a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:25 np0005541913.localdomain sudo[322175]: pam_unix(sudo:session): session closed for user root
Dec 02 10:09:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:25.389 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:19Z, description=, device_id=1e3f609d-8d11-4c97-acad-11c6a919197a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990881fbb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990881fdc0>], id=ff9379c5-c4de-4f4c-9009-a3c2753f59eb, ip_allocation=immediate, mac_address=fa:16:3e:85:d4:94, name=tempest-RoutersTest-290673381, network_id=e13ed0b0-82be-499b-b8af-a15d85a02df9, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['32471057-4d02-424a-9e3e-19629ab1677d'], standard_attr_id=2179, status=ACTIVE, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:22Z on network e13ed0b0-82be-499b-b8af-a15d85a02df9
Dec 02 10:09:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:09:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:09:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:09:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:09:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:25.432 263406 INFO neutron.agent.dhcp.agent [None req-7309156d-d815-42c1-9594-612866a2ca84 - - - - - -] DHCP configuration for ports {'f6ac3ef5-c653-4eda-b061-848a27409fb0'} is completed
Dec 02 10:09:25 np0005541913.localdomain sudo[322321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:09:25 np0005541913.localdomain sudo[322321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:09:25 np0005541913.localdomain sudo[322321]: pam_unix(sudo:session): session closed for user root
Dec 02 10:09:25 np0005541913.localdomain dnsmasq[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:25 np0005541913.localdomain dnsmasq-dhcp[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:25 np0005541913.localdomain dnsmasq-dhcp[322247]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:25 np0005541913.localdomain podman[322331]: 2025-12-02 10:09:25.556496666 +0000 UTC m=+0.090821312 container kill a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:09:25 np0005541913.localdomain dnsmasq[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/addn_hosts - 1 addresses
Dec 02 10:09:25 np0005541913.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/host
Dec 02 10:09:25 np0005541913.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/opts
Dec 02 10:09:25 np0005541913.localdomain podman[322375]: 2025-12-02 10:09:25.6962397 +0000 UTC m=+0.063234486 container kill c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:25 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:25.979 263406 INFO neutron.agent.dhcp.agent [None req-b34c7367-ce76-40d6-9111-bfa3b717f280 - - - - - -] DHCP configuration for ports {'ff9379c5-c4de-4f4c-9009-a3c2753f59eb'} is completed
Dec 02 10:09:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:26.007 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:26.068 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:26 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:26.131 2 INFO neutron.agent.securitygroups_rpc [None req-28c3366c-1c91-49f5-b694-3c934cb049e5 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:26 np0005541913.localdomain dnsmasq[322247]: exiting on receipt of SIGTERM
Dec 02 10:09:26 np0005541913.localdomain podman[322419]: 2025-12-02 10:09:26.147967098 +0000 UTC m=+0.068603700 container kill a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:09:26 np0005541913.localdomain systemd[1]: libpod-a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4.scope: Deactivated successfully.
Dec 02 10:09:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:26 np0005541913.localdomain podman[322433]: 2025-12-02 10:09:26.216426872 +0000 UTC m=+0.056498667 container died a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:09:26 np0005541913.localdomain podman[322433]: 2025-12-02 10:09:26.251871427 +0000 UTC m=+0.091943192 container cleanup a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:09:26 np0005541913.localdomain systemd[1]: libpod-conmon-a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4.scope: Deactivated successfully.
Dec 02 10:09:26 np0005541913.localdomain podman[322435]: 2025-12-02 10:09:26.314000812 +0000 UTC m=+0.144593864 container remove a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:26Z|00375|binding|INFO|Releasing lport 317acccc-f5e4-452d-a107-edeeec553e45 from this chassis (sb_readonly=0)
Dec 02 10:09:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:26.326 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:26Z|00376|binding|INFO|Setting lport 317acccc-f5e4-452d-a107-edeeec553e45 down in Southbound
Dec 02 10:09:26 np0005541913.localdomain kernel: device tap317acccc-f5 left promiscuous mode
Dec 02 10:09:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:26.335 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2d:2b7a/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=317acccc-f5e4-452d-a107-edeeec553e45) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:26.338 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 317acccc-f5e4-452d-a107-edeeec553e45 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:09:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:26.340 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:26.341 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[23217173-90ea-4a58-b254-e08eeec010f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:26.356 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:26 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:26.379 2 INFO neutron.agent.securitygroups_rpc [None req-ec4a6adc-d68b-4418-9c41-c326e9a3fc34 49e91c7702d54b1ab47e5f6dec5e0208 204a1137a20e40c995bb9cd512e75a5c - - default default] Security group member updated ['53fe5435-6101-4ff1-81ad-b53da833172b']
Dec 02 10:09:26 np0005541913.localdomain ceph-mon[298296]: pgmap v346: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 2.0 KiB/s wr, 66 op/s
Dec 02 10:09:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:09:26 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:26.574 263406 INFO neutron.agent.dhcp.agent [None req-e6581cb2-aacd-4cba-bbb4-23fbe9cc03c5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e0a3660f35d8be91ceba0dc28d4805873619e4df442f680ea55f00f5be81d6df-merged.mount: Deactivated successfully.
Dec 02 10:09:26 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a352e5b2dc38eefd424d28c0f2c8258cdb09415dc850c6ca1c5fd5d018e425e4-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:26 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:09:27 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:27.368 2 INFO neutron.agent.securitygroups_rpc [None req-a7f54859-efcd-4ecf-b40a-33f0bd3f4545 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:09:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "format": "json"}]: dispatch
Dec 02 10:09:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:09:27 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:27.612 2 INFO neutron.agent.securitygroups_rpc [None req-d5002157-4534-49a0-a135-4c64a8485ed7 8b49e5c866794aad866d55bb5f154d67 7dffef2e74844a7ebb6ee68826fb7e57 - - default default] Security group member updated ['32471057-4d02-424a-9e3e-19629ab1677d']
Dec 02 10:09:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:27.930 263406 INFO neutron.agent.linux.ip_lib [None req-72c8040f-566b-4a27-b66c-9270e4271be9 - - - - - -] Device tap0f726015-c9 cannot be used as it has no MAC address
Dec 02 10:09:27 np0005541913.localdomain dnsmasq[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/addn_hosts - 0 addresses
Dec 02 10:09:27 np0005541913.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/host
Dec 02 10:09:27 np0005541913.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/e13ed0b0-82be-499b-b8af-a15d85a02df9/opts
Dec 02 10:09:27 np0005541913.localdomain podman[322484]: 2025-12-02 10:09:27.936469689 +0000 UTC m=+0.059896967 container kill c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:09:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:27.961 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:27 np0005541913.localdomain kernel: device tap0f726015-c9 entered promiscuous mode
Dec 02 10:09:27 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670167.9710] manager: (tap0f726015-c9): new Generic device (/org/freedesktop/NetworkManager/Devices/61)
Dec 02 10:09:27 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:27Z|00377|binding|INFO|Claiming lport 0f726015-c9a1-4dea-9271-1b6ceac095a9 for this chassis.
Dec 02 10:09:27 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:27Z|00378|binding|INFO|0f726015-c9a1-4dea-9271-1b6ceac095a9: Claiming unknown
Dec 02 10:09:27 np0005541913.localdomain systemd-udevd[322503]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:27.979 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:27 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:27Z|00379|binding|INFO|Setting lport 0f726015-c9a1-4dea-9271-1b6ceac095a9 up in Southbound
Dec 02 10:09:27 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:27Z|00380|binding|INFO|Setting lport 0f726015-c9a1-4dea-9271-1b6ceac095a9 ovn-installed in OVS
Dec 02 10:09:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:27.985 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:27.987 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:27.985 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0f726015-c9a1-4dea-9271-1b6ceac095a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:27.988 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0f726015-c9a1-4dea-9271-1b6ceac095a9 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:27.990 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port e0ba1cdf-a582-40ed-9ec2-5ecac58c1001 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:27.991 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:27.992 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:27.992 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[47c41184-ee3c-4f1d-9e4f-9725a8a21018]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:28.027 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:28.080 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:28.109 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:28 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:28Z|00381|binding|INFO|Releasing lport cbda893e-6a95-4f21-b53a-4734c24663e0 from this chassis (sb_readonly=0)
Dec 02 10:09:28 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:28Z|00382|binding|INFO|Setting lport cbda893e-6a95-4f21-b53a-4734c24663e0 down in Southbound
Dec 02 10:09:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:28.160 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:28 np0005541913.localdomain kernel: device tapcbda893e-6a left promiscuous mode
Dec 02 10:09:28 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:28.172 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-e13ed0b0-82be-499b-b8af-a15d85a02df9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e13ed0b0-82be-499b-b8af-a15d85a02df9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e53e7092-6e4c-49fc-9858-fc71f27a93fb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=cbda893e-6a95-4f21-b53a-4734c24663e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:28 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:28.174 160221 INFO neutron.agent.ovn.metadata.agent [-] Port cbda893e-6a95-4f21-b53a-4734c24663e0 in datapath e13ed0b0-82be-499b-b8af-a15d85a02df9 unbound from our chassis
Dec 02 10:09:28 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:28.176 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e13ed0b0-82be-499b-b8af-a15d85a02df9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:28 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:28.177 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a6859549-6394-4a64-9441-4331f5fcec1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:28 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:28.195 2 INFO neutron.agent.securitygroups_rpc [None req-31d4af97-7fb6-4706-a5b2-299b30ee98fa 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:28.202 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:28 np0005541913.localdomain podman[322565]: 
Dec 02 10:09:28 np0005541913.localdomain podman[322565]: 2025-12-02 10:09:28.935813731 +0000 UTC m=+0.080313251 container create f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:09:28 np0005541913.localdomain systemd[1]: Started libpod-conmon-f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1.scope.
Dec 02 10:09:28 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:28 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0082650ca85757f463f17513ece17c6de93625b11703d74e781dbc5f1a1c4255/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:28 np0005541913.localdomain podman[322565]: 2025-12-02 10:09:28.998948604 +0000 UTC m=+0.143448104 container init f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:09:28 np0005541913.localdomain podman[322565]: 2025-12-02 10:09:28.899865663 +0000 UTC m=+0.044365183 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:29 np0005541913.localdomain podman[322565]: 2025-12-02 10:09:29.011669453 +0000 UTC m=+0.156168953 container start f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:29 np0005541913.localdomain dnsmasq[322584]: started, version 2.85 cachesize 150
Dec 02 10:09:29 np0005541913.localdomain dnsmasq[322584]: DNS service limited to local subnets
Dec 02 10:09:29 np0005541913.localdomain dnsmasq[322584]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:29 np0005541913.localdomain dnsmasq[322584]: warning: no upstream servers configured
Dec 02 10:09:29 np0005541913.localdomain dnsmasq-dhcp[322584]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:09:29 np0005541913.localdomain dnsmasq[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:29 np0005541913.localdomain dnsmasq-dhcp[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:29 np0005541913.localdomain dnsmasq-dhcp[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:29 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:29.077 263406 INFO neutron.agent.dhcp.agent [None req-72c8040f-566b-4a27-b66c-9270e4271be9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:27Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087e6d60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087e6dc0>], id=6bc3a2cd-c417-4b5d-9595-682f73217846, ip_allocation=immediate, mac_address=fa:16:3e:3e:7c:03, name=tempest-NetworksTestDHCPv6-1742388920, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=46, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['3b8e86d5-811c-4d15-ac08-ca299a1dcf8d'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:26Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2218, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:27Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:09:29 np0005541913.localdomain ceph-mon[298296]: pgmap v347: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 1.8 KiB/s wr, 61 op/s
Dec 02 10:09:29 np0005541913.localdomain dnsmasq[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 1 addresses
Dec 02 10:09:29 np0005541913.localdomain dnsmasq-dhcp[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:29 np0005541913.localdomain podman[322602]: 2025-12-02 10:09:29.277814055 +0000 UTC m=+0.054619546 container kill f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:29 np0005541913.localdomain dnsmasq-dhcp[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:29 np0005541913.localdomain systemd[1]: tmp-crun.wQmwH8.mount: Deactivated successfully.
Dec 02 10:09:30 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:30.224 263406 INFO neutron.agent.dhcp.agent [None req-a2769c21-c1c1-49b9-ac11-cb0ef3ad6959 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:09:30 np0005541913.localdomain ceph-mon[298296]: pgmap v348: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 4.7 KiB/s wr, 56 op/s
Dec 02 10:09:30 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:30.417 263406 INFO neutron.agent.dhcp.agent [None req-6740c1e5-40a3-467e-8ae4-367d22a6f53f - - - - - -] DHCP configuration for ports {'6bc3a2cd-c417-4b5d-9595-682f73217846'} is completed
Dec 02 10:09:30 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:30.731 2 INFO neutron.agent.securitygroups_rpc [None req-b3ef0962-bb50-4849-b9de-83492a397177 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:31.009 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:31.069 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:31 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:31.262 2 INFO neutron.agent.securitygroups_rpc [None req-b62fea3d-778e-4171-9633-628f1b789028 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:09:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:09:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:09:31 np0005541913.localdomain dnsmasq[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:31 np0005541913.localdomain dnsmasq-dhcp[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:31 np0005541913.localdomain podman[322638]: 2025-12-02 10:09:31.514801139 +0000 UTC m=+0.064611283 container kill f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:09:31 np0005541913.localdomain dnsmasq-dhcp[322584]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:31 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:09:31 np0005541913.localdomain podman[322653]: 2025-12-02 10:09:31.646880808 +0000 UTC m=+0.103891459 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:31 np0005541913.localdomain podman[322653]: 2025-12-02 10:09:31.662763691 +0000 UTC m=+0.119774322 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 02 10:09:31 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:09:31 np0005541913.localdomain dnsmasq[321744]: exiting on receipt of SIGTERM
Dec 02 10:09:31 np0005541913.localdomain podman[322697]: 2025-12-02 10:09:31.944715285 +0000 UTC m=+0.062160127 container kill c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:09:31 np0005541913.localdomain systemd[1]: libpod-c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2.scope: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain podman[322711]: 2025-12-02 10:09:32.026935276 +0000 UTC m=+0.059409264 container died c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:32 np0005541913.localdomain podman[322711]: 2025-12-02 10:09:32.082901677 +0000 UTC m=+0.115375585 container remove c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e13ed0b0-82be-499b-b8af-a15d85a02df9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: libpod-conmon-c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2.scope: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:32.122 263406 INFO neutron.agent.dhcp.agent [None req-5f17fb96-f330-4019-aa5e-95d7d9fba19e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:32 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:32.404 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:32 np0005541913.localdomain dnsmasq[322584]: exiting on receipt of SIGTERM
Dec 02 10:09:32 np0005541913.localdomain podman[322752]: 2025-12-02 10:09:32.414175466 +0000 UTC m=+0.056154017 container kill f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: libpod-f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1.scope: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:32.470 2 INFO neutron.agent.securitygroups_rpc [None req-1d81950f-2cd1-4171-b1b7-8ccf81612998 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:32 np0005541913.localdomain podman[322765]: 2025-12-02 10:09:32.490408427 +0000 UTC m=+0.058672214 container died f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: tmp-crun.IH6egM.mount: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-061e35e8e2271744f22f5b8b10d46d8b4670aa5cc18d5b13debaf7e3b4be708c-merged.mount: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c91cb890c763f2b817a9d03e47be27b8f7450bbd14e8105681e8ee0af033f1b2-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2de13ed0b0\x2d82be\x2d499b\x2db8af\x2da15d85a02df9.mount: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0082650ca85757f463f17513ece17c6de93625b11703d74e781dbc5f1a1c4255-merged.mount: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain podman[322765]: 2025-12-02 10:09:32.527513676 +0000 UTC m=+0.095777393 container cleanup f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:09:32 np0005541913.localdomain systemd[1]: libpod-conmon-f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1.scope: Deactivated successfully.
Dec 02 10:09:32 np0005541913.localdomain podman[322766]: 2025-12-02 10:09:32.558510122 +0000 UTC m=+0.123345338 container remove f05328c70b6a8371db284d537df6a69435e16c191b198679fd9cab936d4007b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:09:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:32Z|00383|binding|INFO|Releasing lport 0f726015-c9a1-4dea-9271-1b6ceac095a9 from this chassis (sb_readonly=0)
Dec 02 10:09:32 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:32Z|00384|binding|INFO|Setting lport 0f726015-c9a1-4dea-9271-1b6ceac095a9 down in Southbound
Dec 02 10:09:32 np0005541913.localdomain kernel: device tap0f726015-c9 left promiscuous mode
Dec 02 10:09:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:32.609 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:32 np0005541913.localdomain ceph-mon[298296]: pgmap v349: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 3.6 KiB/s wr, 27 op/s
Dec 02 10:09:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:32.617 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0f726015-c9a1-4dea-9271-1b6ceac095a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:32.620 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0f726015-c9a1-4dea-9271-1b6ceac095a9 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:09:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:32.623 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:32.624 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[71d74e38-4c16-4f9b-9973-ed8640a1ace5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:32.630 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:33 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:33Z|00385|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:09:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:33.093 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:33 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:33.223 263406 INFO neutron.agent.dhcp.agent [None req-faa8b226-e2ab-4254-84c3-644b5b41fc72 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:33 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:09:33 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:33.507 2 INFO neutron.agent.securitygroups_rpc [None req-2ab86868-457f-4852-a90e-5fcf962a86b2 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:33 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:33.705 2 INFO neutron.agent.securitygroups_rpc [None req-4574e29d-6803-42ec-b043-afe3e9e41c81 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:09:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:09:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:09:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:09:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:09:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:09:34 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:34.167 2 INFO neutron.agent.securitygroups_rpc [None req-808059d3-8bd0-4321-909f-628d45d51793 49e91c7702d54b1ab47e5f6dec5e0208 204a1137a20e40c995bb9cd512e75a5c - - default default] Security group member updated ['53fe5435-6101-4ff1-81ad-b53da833172b']
Dec 02 10:09:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:34.221 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:34.223 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:34.225 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:09:34 np0005541913.localdomain ceph-mon[298296]: pgmap v350: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 8.7 KiB/s wr, 29 op/s
Dec 02 10:09:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:09:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:09:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:09:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 02 10:09:35 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:35.546 2 INFO neutron.agent.securitygroups_rpc [None req-3d05d8f5-1d82-449d-b4e5-f5f672622e53 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:09:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:09:35 np0005541913.localdomain ceph-mon[298296]: pgmap v351: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 8.2 KiB/s wr, 2 op/s
Dec 02 10:09:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:35.670 263406 INFO neutron.agent.linux.ip_lib [None req-a7211d26-a01f-4c7f-9f57-27cc0c04f11f - - - - - -] Device tap0845b737-de cannot be used as it has no MAC address
Dec 02 10:09:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:35.697 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:35 np0005541913.localdomain kernel: device tap0845b737-de entered promiscuous mode
Dec 02 10:09:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:35Z|00386|binding|INFO|Claiming lport 0845b737-de43-4aef-bed0-c9dd0310ccc7 for this chassis.
Dec 02 10:09:35 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670175.7036] manager: (tap0845b737-de): new Generic device (/org/freedesktop/NetworkManager/Devices/62)
Dec 02 10:09:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:35.704 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:35Z|00387|binding|INFO|0845b737-de43-4aef-bed0-c9dd0310ccc7: Claiming unknown
Dec 02 10:09:35 np0005541913.localdomain systemd-udevd[322803]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:35.718 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef2:2261/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0845b737-de43-4aef-bed0-c9dd0310ccc7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:35.721 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0845b737-de43-4aef-bed0-c9dd0310ccc7 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:35.723 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 956d2c69-d1c9-44fb-9d8b-fcedfd67220a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:35.723 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:35.724 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[530815cf-9616-49ad-be0d-5b29488422bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0845b737-de: No such device
Dec 02 10:09:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:35.732 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0845b737-de: No such device
Dec 02 10:09:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:35Z|00388|binding|INFO|Setting lport 0845b737-de43-4aef-bed0-c9dd0310ccc7 ovn-installed in OVS
Dec 02 10:09:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:35Z|00389|binding|INFO|Setting lport 0845b737-de43-4aef-bed0-c9dd0310ccc7 up in Southbound
Dec 02 10:09:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:35.733 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0845b737-de: No such device
Dec 02 10:09:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0845b737-de: No such device
Dec 02 10:09:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0845b737-de: No such device
Dec 02 10:09:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0845b737-de: No such device
Dec 02 10:09:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0845b737-de: No such device
Dec 02 10:09:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0845b737-de: No such device
Dec 02 10:09:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:35.769 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:35.791 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:36.011 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:09:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:09:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:09:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:09:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:36.070 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:09:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18779 "" "Go-http-client/1.1"
Dec 02 10:09:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:36 np0005541913.localdomain podman[322874]: 2025-12-02 10:09:36.603658171 +0000 UTC m=+0.089939398 container create 62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 02 10:09:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:36.621 2 INFO neutron.agent.securitygroups_rpc [None req-11576e49-abbc-421e-9ae1-ea6ee8281fd6 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:36 np0005541913.localdomain systemd[1]: Started libpod-conmon-62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be.scope.
Dec 02 10:09:36 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:36 np0005541913.localdomain podman[322874]: 2025-12-02 10:09:36.565681469 +0000 UTC m=+0.051962736 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:36 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e045d2b0104541aa45b9552e205f851555854865b20457010235722c9179ccb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:36 np0005541913.localdomain podman[322874]: 2025-12-02 10:09:36.678493015 +0000 UTC m=+0.164774252 container init 62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:09:36 np0005541913.localdomain podman[322874]: 2025-12-02 10:09:36.688688058 +0000 UTC m=+0.174969295 container start 62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:36 np0005541913.localdomain dnsmasq[322893]: started, version 2.85 cachesize 150
Dec 02 10:09:36 np0005541913.localdomain dnsmasq[322893]: DNS service limited to local subnets
Dec 02 10:09:36 np0005541913.localdomain dnsmasq[322893]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:36 np0005541913.localdomain dnsmasq[322893]: warning: no upstream servers configured
Dec 02 10:09:36 np0005541913.localdomain dnsmasq[322893]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:36 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:36.784 263406 INFO neutron.agent.linux.ip_lib [None req-ecfa2bea-906c-4dc7-b115-80f4dd5bed17 - - - - - -] Device tapcbaad62d-24 cannot be used as it has no MAC address
Dec 02 10:09:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:36.809 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:36 np0005541913.localdomain kernel: device tapcbaad62d-24 entered promiscuous mode
Dec 02 10:09:36 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670176.8155] manager: (tapcbaad62d-24): new Generic device (/org/freedesktop/NetworkManager/Devices/63)
Dec 02 10:09:36 np0005541913.localdomain systemd-udevd[322805]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:36Z|00390|binding|INFO|Claiming lport cbaad62d-24be-4326-997a-688e88770b3c for this chassis.
Dec 02 10:09:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:36.815 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:36Z|00391|binding|INFO|cbaad62d-24be-4326-997a-688e88770b3c: Claiming unknown
Dec 02 10:09:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:36.829 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-05109c7b-d482-4449-af19-f4a4bb49c893', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05109c7b-d482-4449-af19-f4a4bb49c893', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=610be6b7-10e7-4876-ab18-6d4030872d9d, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=cbaad62d-24be-4326-997a-688e88770b3c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:36.832 160221 INFO neutron.agent.ovn.metadata.agent [-] Port cbaad62d-24be-4326-997a-688e88770b3c in datapath 05109c7b-d482-4449-af19-f4a4bb49c893 bound to our chassis
Dec 02 10:09:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:36.836 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port dbd18595-d1e5-46d1-a5da-d0034be89313 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:36.836 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05109c7b-d482-4449-af19-f4a4bb49c893, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:36.837 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[57659bd6-01a8-4489-b2fd-a466450f9425]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:09:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapcbaad62d-24: No such device
Dec 02 10:09:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:36Z|00392|binding|INFO|Setting lport cbaad62d-24be-4326-997a-688e88770b3c ovn-installed in OVS
Dec 02 10:09:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:36Z|00393|binding|INFO|Setting lport cbaad62d-24be-4326-997a-688e88770b3c up in Southbound
Dec 02 10:09:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapcbaad62d-24: No such device
Dec 02 10:09:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:36.852 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:36 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:36.853 263406 INFO neutron.agent.dhcp.agent [None req-541c081a-bfd8-411c-90b8-cdd29ccd627f - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:09:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapcbaad62d-24: No such device
Dec 02 10:09:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapcbaad62d-24: No such device
Dec 02 10:09:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapcbaad62d-24: No such device
Dec 02 10:09:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapcbaad62d-24: No such device
Dec 02 10:09:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapcbaad62d-24: No such device
Dec 02 10:09:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapcbaad62d-24: No such device
Dec 02 10:09:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:36.893 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:36.916 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:36 np0005541913.localdomain podman[322906]: 2025-12-02 10:09:36.948710986 +0000 UTC m=+0.091508759 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:09:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:09:36 np0005541913.localdomain podman[322906]: 2025-12-02 10:09:36.984009497 +0000 UTC m=+0.126807210 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:09:36 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:09:37 np0005541913.localdomain podman[322963]: 2025-12-02 10:09:37.06292134 +0000 UTC m=+0.076240763 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:37 np0005541913.localdomain dnsmasq[322893]: exiting on receipt of SIGTERM
Dec 02 10:09:37 np0005541913.localdomain podman[322982]: 2025-12-02 10:09:37.082409889 +0000 UTC m=+0.049898961 container kill 62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:09:37 np0005541913.localdomain systemd[1]: libpod-62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be.scope: Deactivated successfully.
Dec 02 10:09:37 np0005541913.localdomain podman[322963]: 2025-12-02 10:09:37.098972101 +0000 UTC m=+0.112291534 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Dec 02 10:09:37 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:09:37 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:37.116 2 INFO neutron.agent.securitygroups_rpc [None req-841b4da2-cab1-42f7-ac13-ca29294f546a 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:37 np0005541913.localdomain podman[323006]: 2025-12-02 10:09:37.154390068 +0000 UTC m=+0.060444132 container died 62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:09:37 np0005541913.localdomain podman[323006]: 2025-12-02 10:09:37.182920058 +0000 UTC m=+0.088974042 container cleanup 62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:09:37 np0005541913.localdomain systemd[1]: libpod-conmon-62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be.scope: Deactivated successfully.
Dec 02 10:09:37 np0005541913.localdomain podman[323011]: 2025-12-02 10:09:37.231880362 +0000 UTC m=+0.126007039 container remove 62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:09:37 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:37.478 2 INFO neutron.agent.securitygroups_rpc [None req-841b4da2-cab1-42f7-ac13-ca29294f546a 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e045d2b0104541aa45b9552e205f851555854865b20457010235722c9179ccb-merged.mount: Deactivated successfully.
Dec 02 10:09:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62afbb3cf5a784195d384305e65cc4cbdf5cb827266ffaa429b3ac48efb8d8be-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:37 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:37.649 2 INFO neutron.agent.securitygroups_rpc [None req-fb787287-e6b7-452a-9552-33fb0c49fb57 f91ea2f3e6064338bfd751b12b56ae7b 873db74a4a7a4aad823d1b7e8b2d6c26 - - default default] Security group member updated ['faece1fb-3d42-4fda-a7a4-ce9b1aa942b6']
Dec 02 10:09:37 np0005541913.localdomain podman[323078]: 
Dec 02 10:09:37 np0005541913.localdomain podman[323078]: 2025-12-02 10:09:37.818502846 +0000 UTC m=+0.085451299 container create 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:37 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:37.828 2 INFO neutron.agent.securitygroups_rpc [None req-f57a8374-1238-48d5-81d1-d11d5ba885ce 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:37 np0005541913.localdomain systemd[1]: Started libpod-conmon-7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a.scope.
Dec 02 10:09:37 np0005541913.localdomain systemd[1]: tmp-crun.U5rcLP.mount: Deactivated successfully.
Dec 02 10:09:37 np0005541913.localdomain podman[323078]: 2025-12-02 10:09:37.779883096 +0000 UTC m=+0.046831619 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:37 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:37 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a87ff39b2ecb04dea17c5dd28c2d134e115adfe624f9984f492cbd7de6d99878/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:37 np0005541913.localdomain podman[323078]: 2025-12-02 10:09:37.912063309 +0000 UTC m=+0.179011762 container init 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:37 np0005541913.localdomain podman[323078]: 2025-12-02 10:09:37.918710586 +0000 UTC m=+0.185659039 container start 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 10:09:37 np0005541913.localdomain dnsmasq[323096]: started, version 2.85 cachesize 150
Dec 02 10:09:37 np0005541913.localdomain dnsmasq[323096]: DNS service limited to local subnets
Dec 02 10:09:37 np0005541913.localdomain dnsmasq[323096]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:37 np0005541913.localdomain dnsmasq[323096]: warning: no upstream servers configured
Dec 02 10:09:37 np0005541913.localdomain dnsmasq-dhcp[323096]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:09:37 np0005541913.localdomain dnsmasq[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/addn_hosts - 0 addresses
Dec 02 10:09:37 np0005541913.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/host
Dec 02 10:09:37 np0005541913.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/opts
Dec 02 10:09:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:38.101 263406 INFO neutron.agent.dhcp.agent [None req-b675ed77-47ef-446a-ada0-2884f7eec885 - - - - - -] DHCP configuration for ports {'7c84f980-101d-4cd5-af99-219bdb6dca01'} is completed
Dec 02 10:09:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:38.152 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:37Z, description=, device_id=71e9a386-3dc5-4933-920f-317312e12047, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908858ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908996310>], id=0803d891-92bd-4433-91d3-38fa14b1f114, ip_allocation=immediate, mac_address=fa:16:3e:b4:9d:e8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:09:33Z, description=, dns_domain=, id=05109c7b-d482-4449-af19-f4a4bb49c893, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1849952638, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=26733, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2238, status=ACTIVE, subnets=['6a712875-7f47-461d-b090-4a856246df1e'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:34Z, vlan_transparent=None, network_id=05109c7b-d482-4449-af19-f4a4bb49c893, port_security_enabled=False, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2254, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:38Z on network 05109c7b-d482-4449-af19-f4a4bb49c893
Dec 02 10:09:38 np0005541913.localdomain ceph-mon[298296]: pgmap v352: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 8.2 KiB/s wr, 2 op/s
Dec 02 10:09:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:09:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:09:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:09:38 np0005541913.localdomain dnsmasq[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/addn_hosts - 1 addresses
Dec 02 10:09:38 np0005541913.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/host
Dec 02 10:09:38 np0005541913.localdomain podman[323133]: 2025-12-02 10:09:38.348854999 +0000 UTC m=+0.051175955 container kill 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:38 np0005541913.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/opts
Dec 02 10:09:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:38.765 263406 INFO neutron.agent.dhcp.agent [None req-b80f4a0b-081e-44ed-8d38-5d7b00ae9377 - - - - - -] DHCP configuration for ports {'0803d891-92bd-4433-91d3-38fa14b1f114'} is completed
Dec 02 10:09:38 np0005541913.localdomain podman[323181]: 
Dec 02 10:09:38 np0005541913.localdomain podman[323181]: 2025-12-02 10:09:38.816104041 +0000 UTC m=+0.083362743 container create 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 10:09:38 np0005541913.localdomain systemd[1]: Started libpod-conmon-26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0.scope.
Dec 02 10:09:38 np0005541913.localdomain podman[323181]: 2025-12-02 10:09:38.777571784 +0000 UTC m=+0.044830516 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:38 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:38 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8a360b6b870487f4f87e73a8aa1f4f9953a1bffbc3a531d45090d4e01f5f96a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:38 np0005541913.localdomain podman[323181]: 2025-12-02 10:09:38.89900512 +0000 UTC m=+0.166263832 container init 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:09:38 np0005541913.localdomain podman[323181]: 2025-12-02 10:09:38.909642863 +0000 UTC m=+0.176901565 container start 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:09:38 np0005541913.localdomain dnsmasq[323200]: started, version 2.85 cachesize 150
Dec 02 10:09:38 np0005541913.localdomain dnsmasq[323200]: DNS service limited to local subnets
Dec 02 10:09:38 np0005541913.localdomain dnsmasq[323200]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:38 np0005541913.localdomain dnsmasq[323200]: warning: no upstream servers configured
Dec 02 10:09:38 np0005541913.localdomain dnsmasq-dhcp[323200]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 02 10:09:38 np0005541913.localdomain dnsmasq[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:38 np0005541913.localdomain dnsmasq-dhcp[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:38 np0005541913.localdomain dnsmasq-dhcp[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:38.972 263406 INFO neutron.agent.dhcp.agent [None req-00734e8a-b518-42d7-adc6-79dc4b989d6a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:36Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b585b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908b588b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990887db50>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908b588e0>], id=20c5b76e-0c80-44cf-aacd-3f2f3640633d, ip_allocation=immediate, mac_address=fa:16:3e:28:87:15, name=tempest-NetworksTestDHCPv6-176597900, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=49, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['be3eb0ce-daea-40cb-853a-f3837f9a82f6', 'e0b78287-f484-4227-959d-ba30d71df44a'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:35Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2251, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:37Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:09:39 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:39.002 2 INFO neutron.agent.securitygroups_rpc [None req-1ad7ca5c-e344-40e1-8595-888c801ea96b 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:39 np0005541913.localdomain dnsmasq[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:09:39 np0005541913.localdomain dnsmasq-dhcp[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:39 np0005541913.localdomain dnsmasq-dhcp[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:39 np0005541913.localdomain podman[323219]: 2025-12-02 10:09:39.16354924 +0000 UTC m=+0.055230893 container kill 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:09:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:39.194 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:37Z, description=, device_id=71e9a386-3dc5-4933-920f-317312e12047, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908841c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908841070>], id=0803d891-92bd-4433-91d3-38fa14b1f114, ip_allocation=immediate, mac_address=fa:16:3e:b4:9d:e8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:09:33Z, description=, dns_domain=, id=05109c7b-d482-4449-af19-f4a4bb49c893, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1849952638, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=26733, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2238, status=ACTIVE, subnets=['6a712875-7f47-461d-b090-4a856246df1e'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:34Z, vlan_transparent=None, network_id=05109c7b-d482-4449-af19-f4a4bb49c893, port_security_enabled=False, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2254, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:38Z on network 05109c7b-d482-4449-af19-f4a4bb49c893
Dec 02 10:09:39 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:39.226 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:09:39 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:39.262 2 INFO neutron.agent.securitygroups_rpc [None req-cc3286c0-8479-41a4-833f-f53341ebdf18 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:39.278 263406 INFO neutron.agent.dhcp.agent [None req-7a0f97ce-92b6-48f6-aac2-ab14b5b42e71 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '0845b737-de43-4aef-bed0-c9dd0310ccc7'} is completed
Dec 02 10:09:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:39.403 263406 INFO neutron.agent.dhcp.agent [None req-0ef5e002-d21d-41ee-9265-970c9927a5b5 - - - - - -] DHCP configuration for ports {'20c5b76e-0c80-44cf-aacd-3f2f3640633d'} is completed
Dec 02 10:09:39 np0005541913.localdomain dnsmasq[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/addn_hosts - 1 addresses
Dec 02 10:09:39 np0005541913.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/host
Dec 02 10:09:39 np0005541913.localdomain podman[323272]: 2025-12-02 10:09:39.453462695 +0000 UTC m=+0.051703678 container kill 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:09:39 np0005541913.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/opts
Dec 02 10:09:39 np0005541913.localdomain dnsmasq[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:39 np0005541913.localdomain dnsmasq-dhcp[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:39 np0005541913.localdomain dnsmasq-dhcp[323200]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:39 np0005541913.localdomain podman[323285]: 2025-12-02 10:09:39.507216548 +0000 UTC m=+0.067488449 container kill 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:09:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:39.741 263406 INFO neutron.agent.dhcp.agent [None req-098607c9-980d-4bbb-9369-6adb34c05e75 - - - - - -] DHCP configuration for ports {'0803d891-92bd-4433-91d3-38fa14b1f114'} is completed
Dec 02 10:09:40 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:40.226 2 INFO neutron.agent.securitygroups_rpc [None req-d726f52f-c5d0-4b2e-935e-07d00a13737f 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:40 np0005541913.localdomain ceph-mon[298296]: pgmap v353: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 18 KiB/s wr, 13 op/s
Dec 02 10:09:40 np0005541913.localdomain ceph-mon[298296]: mgrmap e47: np0005541914.lljzmk(active, since 9m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:09:40 np0005541913.localdomain systemd[1]: tmp-crun.Lxqmdo.mount: Deactivated successfully.
Dec 02 10:09:40 np0005541913.localdomain dnsmasq[323200]: exiting on receipt of SIGTERM
Dec 02 10:09:40 np0005541913.localdomain systemd[1]: libpod-26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0.scope: Deactivated successfully.
Dec 02 10:09:40 np0005541913.localdomain podman[323333]: 2025-12-02 10:09:40.655653213 +0000 UTC m=+0.069139124 container kill 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:09:40 np0005541913.localdomain podman[323345]: 2025-12-02 10:09:40.711054789 +0000 UTC m=+0.045680378 container died 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:09:40 np0005541913.localdomain podman[323345]: 2025-12-02 10:09:40.758513933 +0000 UTC m=+0.093139452 container cleanup 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:09:40 np0005541913.localdomain systemd[1]: libpod-conmon-26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0.scope: Deactivated successfully.
Dec 02 10:09:40 np0005541913.localdomain podman[323354]: 2025-12-02 10:09:40.841548417 +0000 UTC m=+0.155288479 container remove 26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:41.014 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:41.072 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:41 np0005541913.localdomain dnsmasq[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/addn_hosts - 0 addresses
Dec 02 10:09:41 np0005541913.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/host
Dec 02 10:09:41 np0005541913.localdomain podman[323406]: 2025-12-02 10:09:41.155982756 +0000 UTC m=+0.069584706 container kill 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:09:41 np0005541913.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/05109c7b-d482-4449-af19-f4a4bb49c893/opts
Dec 02 10:09:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:41 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:09:41 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:09:41 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:09:41 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 02 10:09:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:41.446 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:41 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:41Z|00394|binding|INFO|Releasing lport cbaad62d-24be-4326-997a-688e88770b3c from this chassis (sb_readonly=0)
Dec 02 10:09:41 np0005541913.localdomain kernel: device tapcbaad62d-24 left promiscuous mode
Dec 02 10:09:41 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:41Z|00395|binding|INFO|Setting lport cbaad62d-24be-4326-997a-688e88770b3c down in Southbound
Dec 02 10:09:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:41.472 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:41.625 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-05109c7b-d482-4449-af19-f4a4bb49c893', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05109c7b-d482-4449-af19-f4a4bb49c893', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=610be6b7-10e7-4876-ab18-6d4030872d9d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=cbaad62d-24be-4326-997a-688e88770b3c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:41.628 160221 INFO neutron.agent.ovn.metadata.agent [-] Port cbaad62d-24be-4326-997a-688e88770b3c in datapath 05109c7b-d482-4449-af19-f4a4bb49c893 unbound from our chassis
Dec 02 10:09:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:41.630 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05109c7b-d482-4449-af19-f4a4bb49c893, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:41.631 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a6931e-45da-42a0-a51c-79c9b74eeaad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-f8a360b6b870487f4f87e73a8aa1f4f9953a1bffbc3a531d45090d4e01f5f96a-merged.mount: Deactivated successfully.
Dec 02 10:09:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-26cbe6a9c45b23ed1f48daaadc1b5d59d7c5d5159e0dbc7b04bb373d602e27a0-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:41 np0005541913.localdomain podman[323466]: 
Dec 02 10:09:41 np0005541913.localdomain podman[323466]: 2025-12-02 10:09:41.840692773 +0000 UTC m=+0.093257127 container create f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 10:09:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720.scope.
Dec 02 10:09:41 np0005541913.localdomain podman[323466]: 2025-12-02 10:09:41.794868781 +0000 UTC m=+0.047433175 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06ba67381f0b9b7877d863f3ab5fbc0fd8196e0bb57eb66befc8af43a84552e6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:41 np0005541913.localdomain podman[323466]: 2025-12-02 10:09:41.909937908 +0000 UTC m=+0.162502262 container init f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:09:41 np0005541913.localdomain podman[323466]: 2025-12-02 10:09:41.920396567 +0000 UTC m=+0.172960931 container start f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:09:41 np0005541913.localdomain dnsmasq[323485]: started, version 2.85 cachesize 150
Dec 02 10:09:41 np0005541913.localdomain dnsmasq[323485]: DNS service limited to local subnets
Dec 02 10:09:41 np0005541913.localdomain dnsmasq[323485]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:41 np0005541913.localdomain dnsmasq[323485]: warning: no upstream servers configured
Dec 02 10:09:41 np0005541913.localdomain dnsmasq[323485]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:42.204 263406 INFO neutron.agent.dhcp.agent [None req-50462b5c-5931-4a56-ba65-733f2870f6f1 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '0845b737-de43-4aef-bed0-c9dd0310ccc7'} is completed
Dec 02 10:09:42 np0005541913.localdomain dnsmasq[323485]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:42 np0005541913.localdomain podman[323503]: 2025-12-02 10:09:42.302974993 +0000 UTC m=+0.048371481 container kill f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:42 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:09:42 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:09:42 np0005541913.localdomain ceph-mon[298296]: pgmap v354: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 14 KiB/s wr, 12 op/s
Dec 02 10:09:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:42.708 263406 INFO neutron.agent.dhcp.agent [None req-905300cb-4c0c-448f-89c3-f221575adcef - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '0845b737-de43-4aef-bed0-c9dd0310ccc7'} is completed
Dec 02 10:09:42 np0005541913.localdomain podman[323541]: 2025-12-02 10:09:42.841295928 +0000 UTC m=+0.069574995 container kill f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:09:42 np0005541913.localdomain dnsmasq[323485]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:43 np0005541913.localdomain dnsmasq[323096]: exiting on receipt of SIGTERM
Dec 02 10:09:43 np0005541913.localdomain systemd[1]: tmp-crun.0fXi5Z.mount: Deactivated successfully.
Dec 02 10:09:43 np0005541913.localdomain podman[323577]: 2025-12-02 10:09:43.073546018 +0000 UTC m=+0.051811833 container kill 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:43 np0005541913.localdomain systemd[1]: libpod-7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a.scope: Deactivated successfully.
Dec 02 10:09:43 np0005541913.localdomain podman[323591]: 2025-12-02 10:09:43.122321437 +0000 UTC m=+0.038323943 container died 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:09:43 np0005541913.localdomain podman[323591]: 2025-12-02 10:09:43.151553707 +0000 UTC m=+0.067556213 container cleanup 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:09:43 np0005541913.localdomain systemd[1]: libpod-conmon-7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a.scope: Deactivated successfully.
Dec 02 10:09:43 np0005541913.localdomain podman[323593]: 2025-12-02 10:09:43.209177632 +0000 UTC m=+0.118122809 container remove 7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05109c7b-d482-4449-af19-f4a4bb49c893, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:09:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:43.292 263406 INFO neutron.agent.dhcp.agent [None req-ef93004c-3c39-406f-bf32-a724b707d8de - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '0845b737-de43-4aef-bed0-c9dd0310ccc7'} is completed
Dec 02 10:09:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:43.299 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:43 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2055797992' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:09:43 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2055797992' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:09:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a87ff39b2ecb04dea17c5dd28c2d134e115adfe624f9984f492cbd7de6d99878-merged.mount: Deactivated successfully.
Dec 02 10:09:43 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c35db7fe2764775d51d30f075b938e893c95ba1b691a66fc72c0d39e338b55a-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:43 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d05109c7b\x2dd482\x2d4449\x2daf19\x2df4a4bb49c893.mount: Deactivated successfully.
Dec 02 10:09:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:43.679 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:43Z|00396|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:09:43 np0005541913.localdomain dnsmasq[323485]: exiting on receipt of SIGTERM
Dec 02 10:09:43 np0005541913.localdomain systemd[1]: libpod-f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720.scope: Deactivated successfully.
Dec 02 10:09:43 np0005541913.localdomain podman[323638]: 2025-12-02 10:09:43.988231563 +0000 UTC m=+0.066330168 container kill f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:09:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:44.029 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:44 np0005541913.localdomain podman[323653]: 2025-12-02 10:09:44.109865315 +0000 UTC m=+0.061022838 container died f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:09:44 np0005541913.localdomain systemd[1]: tmp-crun.RJnPdc.mount: Deactivated successfully.
Dec 02 10:09:44 np0005541913.localdomain podman[323653]: 2025-12-02 10:09:44.160518224 +0000 UTC m=+0.111675727 container remove f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:44 np0005541913.localdomain systemd[1]: libpod-conmon-f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720.scope: Deactivated successfully.
Dec 02 10:09:44 np0005541913.localdomain ceph-mon[298296]: pgmap v355: 177 pgs: 177 active+clean; 146 MiB data, 870 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 63 op/s
Dec 02 10:09:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:09:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:44 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:09:44 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:44.616 2 INFO neutron.agent.securitygroups_rpc [None req-fbbd4af2-250f-4ff2-b3ab-e75b109a47fa 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-06ba67381f0b9b7877d863f3ab5fbc0fd8196e0bb57eb66befc8af43a84552e6-merged.mount: Deactivated successfully.
Dec 02 10:09:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f7d73300a9323e51821c20b4cb037d2740df83ae267c170b5d076fff552cc720-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:44 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:44.680 2 INFO neutron.agent.securitygroups_rpc [None req-26916261-820c-405a-8570-4b6047e10a3c 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:45 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:09:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:45.721 263406 INFO neutron.agent.linux.ip_lib [None req-770421f6-3f2e-4c35-98e2-e6c251798a4e - - - - - -] Device tap71626d7c-b9 cannot be used as it has no MAC address
Dec 02 10:09:45 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:45.751 2 INFO neutron.agent.securitygroups_rpc [None req-5632dc43-e5b5-45de-a516-10b988e48fe8 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:45.750 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:45 np0005541913.localdomain kernel: device tap71626d7c-b9 entered promiscuous mode
Dec 02 10:09:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:45.756 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:45 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670185.7579] manager: (tap71626d7c-b9): new Generic device (/org/freedesktop/NetworkManager/Devices/64)
Dec 02 10:09:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:45Z|00397|binding|INFO|Claiming lport 71626d7c-b9fe-49b7-966b-658b52a6b534 for this chassis.
Dec 02 10:09:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:45Z|00398|binding|INFO|71626d7c-b9fe-49b7-966b-658b52a6b534: Claiming unknown
Dec 02 10:09:45 np0005541913.localdomain systemd-udevd[323716]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:45.774 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-1ff380aa-d975-48ea-aada-9148640d9136', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ff380aa-d975-48ea-aada-9148640d9136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aae5e2dae10d49c38d5d63835c7677e3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f283dbb1-0190-4442-8c26-ed3f8c0bba35, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=71626d7c-b9fe-49b7-966b-658b52a6b534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:45.776 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 71626d7c-b9fe-49b7-966b-658b52a6b534 in datapath 1ff380aa-d975-48ea-aada-9148640d9136 bound to our chassis
Dec 02 10:09:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:45.778 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1ff380aa-d975-48ea-aada-9148640d9136 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:09:45 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:45.778 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[22e7a3d1-2a7f-4b26-b843-853a8a6a21af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:45 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap71626d7c-b9: No such device
Dec 02 10:09:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:45.794 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:45 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap71626d7c-b9: No such device
Dec 02 10:09:45 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:45.798 2 INFO neutron.agent.securitygroups_rpc [None req-50568852-e227-40d9-a94b-d9d972f0134a 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:45 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap71626d7c-b9: No such device
Dec 02 10:09:45 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap71626d7c-b9: No such device
Dec 02 10:09:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:45.807 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:45Z|00399|binding|INFO|Setting lport 71626d7c-b9fe-49b7-966b-658b52a6b534 ovn-installed in OVS
Dec 02 10:09:45 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:45Z|00400|binding|INFO|Setting lport 71626d7c-b9fe-49b7-966b-658b52a6b534 up in Southbound
Dec 02 10:09:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:45.808 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:45 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap71626d7c-b9: No such device
Dec 02 10:09:45 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap71626d7c-b9: No such device
Dec 02 10:09:45 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap71626d7c-b9: No such device
Dec 02 10:09:45 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap71626d7c-b9: No such device
Dec 02 10:09:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:45.845 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:45.876 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:46.020 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:46 np0005541913.localdomain podman[323766]: 
Dec 02 10:09:46 np0005541913.localdomain podman[323766]: 2025-12-02 10:09:46.049208985 +0000 UTC m=+0.104036433 container create a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:46.074 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:46 np0005541913.localdomain podman[323766]: 2025-12-02 10:09:45.994419095 +0000 UTC m=+0.049246583 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:46 np0005541913.localdomain systemd[1]: Started libpod-conmon-a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515.scope.
Dec 02 10:09:46 np0005541913.localdomain systemd[1]: tmp-crun.Rnv80Q.mount: Deactivated successfully.
Dec 02 10:09:46 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:46 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a71f2964b61897d8f2914a8e03c9b6a8f005672548d86055f884c131b4bf1b6c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:46 np0005541913.localdomain podman[323766]: 2025-12-02 10:09:46.164997081 +0000 UTC m=+0.219824519 container init a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:09:46 np0005541913.localdomain podman[323766]: 2025-12-02 10:09:46.175909022 +0000 UTC m=+0.230736470 container start a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:09:46 np0005541913.localdomain dnsmasq[323791]: started, version 2.85 cachesize 150
Dec 02 10:09:46 np0005541913.localdomain dnsmasq[323791]: DNS service limited to local subnets
Dec 02 10:09:46 np0005541913.localdomain dnsmasq[323791]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:46 np0005541913.localdomain dnsmasq[323791]: warning: no upstream servers configured
Dec 02 10:09:46 np0005541913.localdomain dnsmasq-dhcp[323791]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:09:46 np0005541913.localdomain dnsmasq-dhcp[323791]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 02 10:09:46 np0005541913.localdomain dnsmasq[323791]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:46 np0005541913.localdomain dnsmasq-dhcp[323791]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:46 np0005541913.localdomain dnsmasq-dhcp[323791]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:46.250 263406 INFO neutron.agent.dhcp.agent [None req-061472a1-7ad7-445e-a4cc-30ebd0a6ca91 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:43Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908813ca0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908813b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990884a4f0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f990884a5e0>], id=01a2d6dd-8ce8-48bd-93c0-ba2e2c26cdf6, ip_allocation=immediate, mac_address=fa:16:3e:dc:c3:b7, name=tempest-NetworksTestDHCPv6-1006860435, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['1234e32b-b572-4333-ae69-9076e6f1997e', '1fb1b810-2d2d-4940-8cad-41ff695eec18'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:43Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2274, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:43Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:09:46 np0005541913.localdomain ceph-mon[298296]: pgmap v356: 177 pgs: 177 active+clean; 146 MiB data, 870 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 62 op/s
Dec 02 10:09:46 np0005541913.localdomain dnsmasq[323791]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:09:46 np0005541913.localdomain dnsmasq-dhcp[323791]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:46 np0005541913.localdomain dnsmasq-dhcp[323791]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:46 np0005541913.localdomain podman[323816]: 2025-12-02 10:09:46.441329995 +0000 UTC m=+0.055295695 container kill a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:09:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:46.455 263406 INFO neutron.agent.dhcp.agent [None req-f015b6be-42f5-411a-b629-df358d78ea55 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '0845b737-de43-4aef-bed0-c9dd0310ccc7'} is completed
Dec 02 10:09:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:46.679 263406 INFO neutron.agent.dhcp.agent [None req-4e612800-7418-428f-a75f-5888b299f387 - - - - - -] DHCP configuration for ports {'01a2d6dd-8ce8-48bd-93c0-ba2e2c26cdf6'} is completed
Dec 02 10:09:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:46.703 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 35e0944d-0cc9-46d5-b463-8e827905e9f6 with type ""
Dec 02 10:09:46 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:46Z|00401|binding|INFO|Removing iface tap71626d7c-b9 ovn-installed in OVS
Dec 02 10:09:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:46.705 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-1ff380aa-d975-48ea-aada-9148640d9136', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ff380aa-d975-48ea-aada-9148640d9136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aae5e2dae10d49c38d5d63835c7677e3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f283dbb1-0190-4442-8c26-ed3f8c0bba35, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=71626d7c-b9fe-49b7-966b-658b52a6b534) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:46 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:46Z|00402|binding|INFO|Removing lport 71626d7c-b9fe-49b7-966b-658b52a6b534 ovn-installed in OVS
Dec 02 10:09:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:46.707 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 71626d7c-b9fe-49b7-966b-658b52a6b534 in datapath 1ff380aa-d975-48ea-aada-9148640d9136 unbound from our chassis
Dec 02 10:09:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:46.710 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1ff380aa-d975-48ea-aada-9148640d9136 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:09:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:46.711 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c6194c-8fd9-44ce-ad9a-07d39f439a33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:46.758 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:46 np0005541913.localdomain dnsmasq[323791]: exiting on receipt of SIGTERM
Dec 02 10:09:46 np0005541913.localdomain podman[323887]: 2025-12-02 10:09:46.916410865 +0000 UTC m=+0.064164280 container kill a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:09:46 np0005541913.localdomain systemd[1]: libpod-a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515.scope: Deactivated successfully.
Dec 02 10:09:46 np0005541913.localdomain podman[323877]: 
Dec 02 10:09:46 np0005541913.localdomain podman[323877]: 2025-12-02 10:09:46.928930399 +0000 UTC m=+0.105824931 container create 22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ff380aa-d975-48ea-aada-9148640d9136, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:09:46 np0005541913.localdomain systemd[1]: Started libpod-conmon-22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044.scope.
Dec 02 10:09:46 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:46 np0005541913.localdomain podman[323877]: 2025-12-02 10:09:46.878727611 +0000 UTC m=+0.055622193 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:46 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d91a0a58d34bd39dcc08a9cbec3014076ed043bc85e9c4b743578420c1d1c1a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:46 np0005541913.localdomain podman[323877]: 2025-12-02 10:09:46.998678088 +0000 UTC m=+0.175572630 container init 22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ff380aa-d975-48ea-aada-9148640d9136, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 02 10:09:47 np0005541913.localdomain podman[323877]: 2025-12-02 10:09:47.00815195 +0000 UTC m=+0.185046462 container start 22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ff380aa-d975-48ea-aada-9148640d9136, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:47 np0005541913.localdomain dnsmasq[323935]: started, version 2.85 cachesize 150
Dec 02 10:09:47 np0005541913.localdomain dnsmasq[323935]: DNS service limited to local subnets
Dec 02 10:09:47 np0005541913.localdomain dnsmasq[323935]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:47 np0005541913.localdomain dnsmasq[323935]: warning: no upstream servers configured
Dec 02 10:09:47 np0005541913.localdomain dnsmasq-dhcp[323935]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:09:47 np0005541913.localdomain dnsmasq[323935]: read /var/lib/neutron/dhcp/1ff380aa-d975-48ea-aada-9148640d9136/addn_hosts - 0 addresses
Dec 02 10:09:47 np0005541913.localdomain dnsmasq-dhcp[323935]: read /var/lib/neutron/dhcp/1ff380aa-d975-48ea-aada-9148640d9136/host
Dec 02 10:09:47 np0005541913.localdomain dnsmasq-dhcp[323935]: read /var/lib/neutron/dhcp/1ff380aa-d975-48ea-aada-9148640d9136/opts
Dec 02 10:09:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:47Z|00403|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:09:47 np0005541913.localdomain podman[323904]: 2025-12-02 10:09:47.0531721 +0000 UTC m=+0.117305748 container died a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:09:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:47.059 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:47 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:47 np0005541913.localdomain podman[323904]: 2025-12-02 10:09:47.08505909 +0000 UTC m=+0.149192708 container cleanup a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:09:47 np0005541913.localdomain systemd[1]: libpod-conmon-a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515.scope: Deactivated successfully.
Dec 02 10:09:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:47.125 263406 INFO neutron.agent.dhcp.agent [None req-9b0f8d69-dce7-4607-abc9-0f8fd56d2265 - - - - - -] DHCP configuration for ports {'e213d2f6-a81e-4742-91b5-9e4adb2a81c7'} is completed
Dec 02 10:09:47 np0005541913.localdomain podman[323906]: 2025-12-02 10:09:47.134536808 +0000 UTC m=+0.191556196 container remove a60738a1582b279a04499731949681e07ae5f4163407375a94b2bf434e0a3515 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:47 np0005541913.localdomain dnsmasq[323935]: exiting on receipt of SIGTERM
Dec 02 10:09:47 np0005541913.localdomain podman[323958]: 2025-12-02 10:09:47.253152939 +0000 UTC m=+0.062144676 container kill 22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ff380aa-d975-48ea-aada-9148640d9136, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:47 np0005541913.localdomain systemd[1]: libpod-22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044.scope: Deactivated successfully.
Dec 02 10:09:47 np0005541913.localdomain podman[323971]: 2025-12-02 10:09:47.315439119 +0000 UTC m=+0.048499403 container died 22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ff380aa-d975-48ea-aada-9148640d9136, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:47 np0005541913.localdomain podman[323971]: 2025-12-02 10:09:47.395305327 +0000 UTC m=+0.128365561 container cleanup 22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ff380aa-d975-48ea-aada-9148640d9136, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:47 np0005541913.localdomain systemd[1]: libpod-conmon-22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044.scope: Deactivated successfully.
Dec 02 10:09:47 np0005541913.localdomain podman[323973]: 2025-12-02 10:09:47.417371145 +0000 UTC m=+0.141521082 container remove 22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ff380aa-d975-48ea-aada-9148640d9136, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:47 np0005541913.localdomain kernel: device tap71626d7c-b9 left promiscuous mode
Dec 02 10:09:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:47.428 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:47.440 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:47.466 263406 INFO neutron.agent.dhcp.agent [None req-c3ff87fb-0d04-4261-bc95-972a4420b0ea - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:47.467 263406 INFO neutron.agent.dhcp.agent [None req-c3ff87fb-0d04-4261-bc95-972a4420b0ea - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:47.467 263406 INFO neutron.agent.dhcp.agent [None req-c3ff87fb-0d04-4261-bc95-972a4420b0ea - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:47Z|00404|binding|INFO|Releasing lport 0845b737-de43-4aef-bed0-c9dd0310ccc7 from this chassis (sb_readonly=0)
Dec 02 10:09:47 np0005541913.localdomain kernel: device tap0845b737-de left promiscuous mode
Dec 02 10:09:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:47Z|00405|binding|INFO|Setting lport 0845b737-de43-4aef-bed0-c9dd0310ccc7 down in Southbound
Dec 02 10:09:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:47.484 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:47.500 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:47.502 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:47.510 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::f816:3eff:fef2:2261/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0845b737-de43-4aef-bed0-c9dd0310ccc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:47.512 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0845b737-de43-4aef-bed0-c9dd0310ccc7 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:09:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:47.514 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:47.515 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[756864ba-a8f9-453b-a232-692e09fd918d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:47.553 263406 INFO neutron.agent.dhcp.agent [None req-9a676760-0409-4231-9eff-6533d601fa1e - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '0845b737-de43-4aef-bed0-c9dd0310ccc7'} is completed
Dec 02 10:09:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d91a0a58d34bd39dcc08a9cbec3014076ed043bc85e9c4b743578420c1d1c1a5-merged.mount: Deactivated successfully.
Dec 02 10:09:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-22d4491abb2099d1a4e03c8c4b14c710b2ffdbfc86fefd09e19d61438cac3044-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:48 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a71f2964b61897d8f2914a8e03c9b6a8f005672548d86055f884c131b4bf1b6c-merged.mount: Deactivated successfully.
Dec 02 10:09:48 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d1ff380aa\x2dd975\x2d48ea\x2daada\x2d9148640d9136.mount: Deactivated successfully.
Dec 02 10:09:48 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:09:48 np0005541913.localdomain ceph-mon[298296]: pgmap v357: 177 pgs: 177 active+clean; 146 MiB data, 870 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 62 op/s
Dec 02 10:09:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:09:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:09:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:09:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 02 10:09:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:48.547 263406 INFO neutron.agent.linux.ip_lib [None req-16efee1f-1beb-4ba5-a1a9-346a37ad8668 - - - - - -] Device tap92c576f1-7f cannot be used as it has no MAC address
Dec 02 10:09:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:48.575 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:48 np0005541913.localdomain kernel: device tap92c576f1-7f entered promiscuous mode
Dec 02 10:09:48 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670188.5831] manager: (tap92c576f1-7f): new Generic device (/org/freedesktop/NetworkManager/Devices/65)
Dec 02 10:09:48 np0005541913.localdomain systemd-udevd[323718]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:48.584 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:48Z|00406|binding|INFO|Claiming lport 92c576f1-7fec-4bf8-bac6-a5142e33525f for this chassis.
Dec 02 10:09:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:48Z|00407|binding|INFO|92c576f1-7fec-4bf8-bac6-a5142e33525f: Claiming unknown
Dec 02 10:09:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:48Z|00408|binding|INFO|Setting lport 92c576f1-7fec-4bf8-bac6-a5142e33525f ovn-installed in OVS
Dec 02 10:09:48 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:48Z|00409|binding|INFO|Setting lport 92c576f1-7fec-4bf8-bac6-a5142e33525f up in Southbound
Dec 02 10:09:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:48.595 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=92c576f1-7fec-4bf8-bac6-a5142e33525f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:48.601 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 92c576f1-7fec-4bf8-bac6-a5142e33525f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:48.601 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:48.603 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 13eaf550-7fb7-4bf7-a22d-4be7f3776b15 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:48.604 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:48 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:48.605 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[38c01e5d-54c4-4100-96fd-1b16e194ff32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:09:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:48.623 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:48.661 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:48.692 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:48 np0005541913.localdomain podman[324014]: 2025-12-02 10:09:48.714719088 +0000 UTC m=+0.085444587 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 02 10:09:48 np0005541913.localdomain podman[324014]: 2025-12-02 10:09:48.721179881 +0000 UTC m=+0.091905340 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:09:48 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:09:49 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:49.245 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:09:49 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:09:49 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:09:49 np0005541913.localdomain podman[324083]: 2025-12-02 10:09:49.525382472 +0000 UTC m=+0.096495693 container create 2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 02 10:09:49 np0005541913.localdomain systemd[1]: Started libpod-conmon-2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339.scope.
Dec 02 10:09:49 np0005541913.localdomain podman[324083]: 2025-12-02 10:09:49.478945004 +0000 UTC m=+0.050058245 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:49 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:49 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd5e0da3a9274120093985a24a538856f9ad9edfb441f144c923f70d7f4a7a93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:49 np0005541913.localdomain podman[324083]: 2025-12-02 10:09:49.60033512 +0000 UTC m=+0.171448311 container init 2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:09:49 np0005541913.localdomain podman[324083]: 2025-12-02 10:09:49.606824772 +0000 UTC m=+0.177937973 container start 2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:09:49 np0005541913.localdomain dnsmasq[324102]: started, version 2.85 cachesize 150
Dec 02 10:09:49 np0005541913.localdomain dnsmasq[324102]: DNS service limited to local subnets
Dec 02 10:09:49 np0005541913.localdomain dnsmasq[324102]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:49 np0005541913.localdomain dnsmasq[324102]: warning: no upstream servers configured
Dec 02 10:09:49 np0005541913.localdomain dnsmasq-dhcp[324102]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:09:49 np0005541913.localdomain dnsmasq[324102]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:49 np0005541913.localdomain dnsmasq-dhcp[324102]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:49 np0005541913.localdomain dnsmasq-dhcp[324102]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:49Z|00410|binding|INFO|Releasing lport 92c576f1-7fec-4bf8-bac6-a5142e33525f from this chassis (sb_readonly=0)
Dec 02 10:09:49 np0005541913.localdomain kernel: device tap92c576f1-7f left promiscuous mode
Dec 02 10:09:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:49Z|00411|binding|INFO|Setting lport 92c576f1-7fec-4bf8-bac6-a5142e33525f down in Southbound
Dec 02 10:09:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:49.690 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:49.697 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=92c576f1-7fec-4bf8-bac6-a5142e33525f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:49.698 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 92c576f1-7fec-4bf8-bac6-a5142e33525f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:09:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:49.699 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:49.700 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ef650d9a-d2d1-4507-be80-9513ea15722b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:49.715 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:49 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:49.730 263406 INFO neutron.agent.dhcp.agent [None req-817c9eb6-03c8-4368-8c16-46eb6b30f438 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:09:50 np0005541913.localdomain ceph-mon[298296]: pgmap v358: 177 pgs: 177 active+clean; 146 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 65 op/s
Dec 02 10:09:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:51.020 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:51.075 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:51.477 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:19:93 2001:db8:0:1:f816:3eff:fee6:1993'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a59d5a92-7a77-419d-a87f-fbb46ea78955) old=Port_Binding(mac=['fa:16:3e:e6:19:93 2001:db8::f816:3eff:fee6:1993'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee6:1993/64', 'neutron:device_id': 'ovnmeta-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:51.479 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a59d5a92-7a77-419d-a87f-fbb46ea78955 in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 updated
Dec 02 10:09:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:51.481 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:51.482 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6a9b1125-40d6-45e1-ad1e-31267c7212af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:09:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:09:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:51.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:51 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:09:51 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:51.998 263406 INFO neutron.agent.linux.ip_lib [None req-3703d572-c5ea-43bd-881a-ea84f051eddb - - - - - -] Device tapb35a7019-cd cannot be used as it has no MAC address
Dec 02 10:09:52 np0005541913.localdomain dnsmasq[324102]: exiting on receipt of SIGTERM
Dec 02 10:09:52 np0005541913.localdomain systemd[1]: libpod-2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339.scope: Deactivated successfully.
Dec 02 10:09:52 np0005541913.localdomain podman[324130]: 2025-12-02 10:09:52.008894105 +0000 UTC m=+0.078139534 container kill 2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.023 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain kernel: device tapb35a7019-cd entered promiscuous mode
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.031 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670192.0313] manager: (tapb35a7019-cd): new Generic device (/org/freedesktop/NetworkManager/Devices/66)
Dec 02 10:09:52 np0005541913.localdomain systemd-udevd[324170]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:52Z|00412|binding|INFO|Claiming lport b35a7019-cd13-49ba-ae0b-aa70d3ce3b27 for this chassis.
Dec 02 10:09:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:52Z|00413|binding|INFO|b35a7019-cd13-49ba-ae0b-aa70d3ce3b27: Claiming unknown
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.045 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-5f48cce7-247c-4b5d-8287-ac14f7453254', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f48cce7-247c-4b5d-8287-ac14f7453254', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bad680c763640dba71a7865b355817c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=beadeea7-0616-4ea7-b4f9-7f4239a4c055, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=b35a7019-cd13-49ba-ae0b-aa70d3ce3b27) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.047 160221 INFO neutron.agent.ovn.metadata.agent [-] Port b35a7019-cd13-49ba-ae0b-aa70d3ce3b27 in datapath 5f48cce7-247c-4b5d-8287-ac14f7453254 bound to our chassis
Dec 02 10:09:52 np0005541913.localdomain podman[324122]: 2025-12-02 10:09:51.99707956 +0000 UTC m=+0.081518043 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.052 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6d45c4c4-f9a4-4f81-8d66-a84620e4f8a7 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.052 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5f48cce7-247c-4b5d-8287-ac14f7453254, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.053 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[54de9c86-fa9e-43fa-8071-1a8a3441794c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:52Z|00414|binding|INFO|Setting lport b35a7019-cd13-49ba-ae0b-aa70d3ce3b27 ovn-installed in OVS
Dec 02 10:09:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:52Z|00415|binding|INFO|Setting lport b35a7019-cd13-49ba-ae0b-aa70d3ce3b27 up in Southbound
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.081 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain podman[324122]: 2025-12-02 10:09:52.083075992 +0000 UTC m=+0.167514455 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 02 10:09:52 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:09:52 np0005541913.localdomain podman[324169]: 2025-12-02 10:09:52.114761856 +0000 UTC m=+0.059103266 container died 2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.130 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.166 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain podman[324169]: 2025-12-02 10:09:52.205948377 +0000 UTC m=+0.150289787 container remove 2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:52 np0005541913.localdomain systemd[1]: libpod-conmon-2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339.scope: Deactivated successfully.
Dec 02 10:09:52 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:52.278 263406 INFO neutron.agent.linux.ip_lib [None req-92afd4b8-08b5-47b8-b188-6b3ae2f822da - - - - - -] Device tap92c576f1-7f cannot be used as it has no MAC address
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.294 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain kernel: device tap92c576f1-7f entered promiscuous mode
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.298 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670192.2984] manager: (tap92c576f1-7f): new Generic device (/org/freedesktop/NetworkManager/Devices/67)
Dec 02 10:09:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:52Z|00416|binding|INFO|Claiming lport 92c576f1-7fec-4bf8-bac6-a5142e33525f for this chassis.
Dec 02 10:09:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:52Z|00417|binding|INFO|92c576f1-7fec-4bf8-bac6-a5142e33525f: Claiming unknown
Dec 02 10:09:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:52Z|00418|binding|INFO|Setting lport 92c576f1-7fec-4bf8-bac6-a5142e33525f ovn-installed in OVS
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.303 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.305 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.333 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.363 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:52.378 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.412 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fed5:4cac/64 2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=92c576f1-7fec-4bf8-bac6-a5142e33525f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:52Z|00419|binding|INFO|Setting lport 92c576f1-7fec-4bf8-bac6-a5142e33525f up in Southbound
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.413 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 92c576f1-7fec-4bf8-bac6-a5142e33525f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.415 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 13eaf550-7fb7-4bf7-a22d-4be7f3776b15 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.416 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:52 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:52.416 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[d21532d6-968d-47bc-8c89-bc428e99735c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:09:52 np0005541913.localdomain ceph-mon[298296]: pgmap v359: 177 pgs: 177 active+clean; 146 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.8 MiB/s wr, 54 op/s
Dec 02 10:09:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-dd5e0da3a9274120093985a24a538856f9ad9edfb441f144c923f70d7f4a7a93-merged.mount: Deactivated successfully.
Dec 02 10:09:52 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2d078d0579b19647e101536e938f8941c1375245736bc9718b9a57e8363a4339-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:52 np0005541913.localdomain podman[324280]: 
Dec 02 10:09:53 np0005541913.localdomain podman[324280]: 2025-12-02 10:09:53.009508291 +0000 UTC m=+0.081193335 container create 9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:09:53 np0005541913.localdomain podman[324280]: 2025-12-02 10:09:52.961519351 +0000 UTC m=+0.033204435 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: Started libpod-conmon-9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f.scope.
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:53 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c861e4f12c023b4413be9bdb00f291501aad1d68a54fc61728f90b4aa4245df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:53 np0005541913.localdomain podman[324280]: 2025-12-02 10:09:53.085070694 +0000 UTC m=+0.156755738 container init 9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:53 np0005541913.localdomain podman[324280]: 2025-12-02 10:09:53.094104175 +0000 UTC m=+0.165789189 container start 9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324316]: started, version 2.85 cachesize 150
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324316]: DNS service limited to local subnets
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324316]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324316]: warning: no upstream servers configured
Dec 02 10:09:53 np0005541913.localdomain dnsmasq-dhcp[324316]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324316]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/addn_hosts - 0 addresses
Dec 02 10:09:53 np0005541913.localdomain dnsmasq-dhcp[324316]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/host
Dec 02 10:09:53 np0005541913.localdomain dnsmasq-dhcp[324316]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/opts
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324316]: exiting on receipt of SIGTERM
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: libpod-9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f.scope: Deactivated successfully.
Dec 02 10:09:53 np0005541913.localdomain podman[324323]: 2025-12-02 10:09:53.181796372 +0000 UTC m=+0.061856880 container died 9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:09:53 np0005541913.localdomain podman[324323]: 2025-12-02 10:09:53.214800111 +0000 UTC m=+0.094860599 container cleanup 9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:09:53 np0005541913.localdomain podman[324335]: 2025-12-02 10:09:53.240183708 +0000 UTC m=+0.059477036 container cleanup 9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:53 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:53.244 2 INFO neutron.agent.securitygroups_rpc [None req-2a02d4a7-eedb-47f7-975e-8a697d665d71 6a4701e292e04a82a827d127f0ef5b65 0b7e671d1f944c979f6feba0246d3141 - - default default] Security group member updated ['274309be-bd70-4043-9459-2a1d0784f871']
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: libpod-conmon-9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f.scope: Deactivated successfully.
Dec 02 10:09:53 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:53.304 2 INFO neutron.agent.securitygroups_rpc [None req-384a8cd4-c502-4296-9a0a-cda4da9440fe 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:53 np0005541913.localdomain podman[324350]: 2025-12-02 10:09:53.311852437 +0000 UTC m=+0.075275606 container remove 9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:53.313 263406 INFO neutron.agent.dhcp.agent [None req-9d31655c-bfc4-45b2-b15e-b46b658fb8d4 - - - - - -] DHCP configuration for ports {'aa0c09f4-e09c-448e-8cfa-94b6199246a9'} is completed
Dec 02 10:09:53 np0005541913.localdomain podman[324368]: 
Dec 02 10:09:53 np0005541913.localdomain podman[324368]: 2025-12-02 10:09:53.402308218 +0000 UTC m=+0.076291664 container create 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: Started libpod-conmon-79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b.scope.
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:53 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b078218d8ff167202cb16b55a3c321df829dc784041145cd6eb709858f6ad54d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:53 np0005541913.localdomain podman[324368]: 2025-12-02 10:09:53.358323176 +0000 UTC m=+0.032306692 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:53 np0005541913.localdomain podman[324368]: 2025-12-02 10:09:53.458437744 +0000 UTC m=+0.132421180 container init 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 02 10:09:53 np0005541913.localdomain podman[324368]: 2025-12-02 10:09:53.468639945 +0000 UTC m=+0.142623381 container start 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324387]: started, version 2.85 cachesize 150
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324387]: DNS service limited to local subnets
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324387]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324387]: warning: no upstream servers configured
Dec 02 10:09:53 np0005541913.localdomain dnsmasq-dhcp[324387]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:53 np0005541913.localdomain dnsmasq-dhcp[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:53 np0005541913.localdomain dnsmasq-dhcp[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:53.513 263406 INFO neutron.agent.dhcp.agent [None req-92afd4b8-08b5-47b8-b188-6b3ae2f822da - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:52Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d5520>, <neutron.agent.linux.dhcp.DictModel object at 0x7f99087d57c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d5820>, <neutron.agent.linux.dhcp.DictModel object at 0x7f99087d50a0>], id=8dce7b35-4d0d-431f-9d67-447f953f0572, ip_allocation=immediate, mac_address=fa:16:3e:53:62:1e, name=tempest-NetworksTestDHCPv6-568989, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['daf22fec-c798-4dca-81bd-963fd98c882c', 'f425644c-747f-4698-a225-0d467296fbc7'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:49Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], 
standard_attr_id=2344, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:52Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:09:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:09:53 np0005541913.localdomain dnsmasq[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:09:53 np0005541913.localdomain podman[324406]: 2025-12-02 10:09:53.680228285 +0000 UTC m=+0.060185496 container kill 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:09:53 np0005541913.localdomain dnsmasq-dhcp[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:53 np0005541913.localdomain dnsmasq-dhcp[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:53.814 263406 INFO neutron.agent.dhcp.agent [None req-4f9397d6-fafc-476f-99f1-4de003c9b3ac - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '92c576f1-7fec-4bf8-bac6-a5142e33525f'} is completed
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:09:53 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:53.870 2 INFO neutron.agent.securitygroups_rpc [None req-552eb951-c19a-4f29-a133-451809159dee 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7c861e4f12c023b4413be9bdb00f291501aad1d68a54fc61728f90b4aa4245df-merged.mount: Deactivated successfully.
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c73790160d60417a1f35139c8e30d45abddb173af456abed2dccc0b2c16973f-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:53 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:53.952 2 INFO neutron.agent.securitygroups_rpc [None req-793d7f6f-bcaa-4aba-a1ec-f239eb834fe6 6a4701e292e04a82a827d127f0ef5b65 0b7e671d1f944c979f6feba0246d3141 - - default default] Security group member updated ['274309be-bd70-4043-9459-2a1d0784f871']
Dec 02 10:09:53 np0005541913.localdomain systemd[1]: tmp-crun.26EMWx.mount: Deactivated successfully.
Dec 02 10:09:53 np0005541913.localdomain podman[324428]: 2025-12-02 10:09:53.966480063 +0000 UTC m=+0.102545353 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 10:09:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:53.975 263406 INFO neutron.agent.dhcp.agent [None req-bc22bd2e-6d7c-4680-8562-b571d99bfecc - - - - - -] DHCP configuration for ports {'8dce7b35-4d0d-431f-9d67-447f953f0572'} is completed
Dec 02 10:09:54 np0005541913.localdomain podman[324429]: 2025-12-02 10:09:54.014068171 +0000 UTC m=+0.147637646 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:09:54 np0005541913.localdomain podman[324429]: 2025-12-02 10:09:54.020825532 +0000 UTC m=+0.154395007 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:09:54 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:09:54 np0005541913.localdomain podman[324428]: 2025-12-02 10:09:54.036970341 +0000 UTC m=+0.173035631 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:09:54 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:09:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:54.117 263406 INFO neutron.agent.linux.ip_lib [None req-70e9519e-ce9b-440a-b4cf-04deff565c9c - - - - - -] Device tapb822caec-0f cannot be used as it has no MAC address
Dec 02 10:09:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:54.144 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:54 np0005541913.localdomain kernel: device tapb822caec-0f entered promiscuous mode
Dec 02 10:09:54 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670194.1496] manager: (tapb822caec-0f): new Generic device (/org/freedesktop/NetworkManager/Devices/68)
Dec 02 10:09:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:54Z|00420|binding|INFO|Claiming lport b822caec-0ff2-45ee-8f19-f63f7afb253f for this chassis.
Dec 02 10:09:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:54Z|00421|binding|INFO|b822caec-0ff2-45ee-8f19-f63f7afb253f: Claiming unknown
Dec 02 10:09:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:54.152 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:54.164 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-5198bb66-dd27-48f3-9334-ab53b7335bc8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5198bb66-dd27-48f3-9334-ab53b7335bc8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e4dc3a6-b635-4672-b1fb-6fb25996b32e, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=b822caec-0ff2-45ee-8f19-f63f7afb253f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:54.167 160221 INFO neutron.agent.ovn.metadata.agent [-] Port b822caec-0ff2-45ee-8f19-f63f7afb253f in datapath 5198bb66-dd27-48f3-9334-ab53b7335bc8 bound to our chassis
Dec 02 10:09:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:54.168 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5198bb66-dd27-48f3-9334-ab53b7335bc8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:09:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:54.169 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[aeab8ae4-840f-48c0-8995-378a8668dc1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:54 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb822caec-0f: No such device
Dec 02 10:09:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:54Z|00422|binding|INFO|Setting lport b822caec-0ff2-45ee-8f19-f63f7afb253f ovn-installed in OVS
Dec 02 10:09:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:54Z|00423|binding|INFO|Setting lport b822caec-0ff2-45ee-8f19-f63f7afb253f up in Southbound
Dec 02 10:09:54 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb822caec-0f: No such device
Dec 02 10:09:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:54.186 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:54 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb822caec-0f: No such device
Dec 02 10:09:54 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb822caec-0f: No such device
Dec 02 10:09:54 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb822caec-0f: No such device
Dec 02 10:09:54 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb822caec-0f: No such device
Dec 02 10:09:54 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb822caec-0f: No such device
Dec 02 10:09:54 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapb822caec-0f: No such device
Dec 02 10:09:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:54.228 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:54.255 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:54 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:54.534 2 INFO neutron.agent.securitygroups_rpc [None req-ecc73d10-d9a3-477f-859a-88e3d0a4a336 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:09:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:54.568 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cc6380cd-d1fe-41c0-9f77-54a6bc7687ef", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:09:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cc6380cd-d1fe-41c0-9f77-54a6bc7687ef", "format": "json"}]: dispatch
Dec 02 10:09:54 np0005541913.localdomain ceph-mon[298296]: pgmap v360: 177 pgs: 177 active+clean; 146 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.8 MiB/s wr, 57 op/s
Dec 02 10:09:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:09:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:09:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:09:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 02 10:09:54 np0005541913.localdomain dnsmasq[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:54 np0005541913.localdomain dnsmasq-dhcp[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:09:54 np0005541913.localdomain dnsmasq-dhcp[324387]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:09:54 np0005541913.localdomain podman[324540]: 2025-12-02 10:09:54.758181261 +0000 UTC m=+0.059322402 container kill 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:09:55 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:55.009 2 INFO neutron.agent.securitygroups_rpc [None req-c4292fab-d4f2-45ec-8373-3372677610e3 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:55 np0005541913.localdomain podman[324586]: 
Dec 02 10:09:55 np0005541913.localdomain podman[324586]: 2025-12-02 10:09:55.158104558 +0000 UTC m=+0.092459375 container create c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:09:55 np0005541913.localdomain systemd[1]: Started libpod-conmon-c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5.scope.
Dec 02 10:09:55 np0005541913.localdomain podman[324586]: 2025-12-02 10:09:55.111885577 +0000 UTC m=+0.046240464 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:55 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:55 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1c0a1d963187de3e3c282bd1aa2fa59ad7e13adf0d6aa09786d03fa9f921cd1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:55 np0005541913.localdomain podman[324586]: 2025-12-02 10:09:55.23244382 +0000 UTC m=+0.166798637 container init c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:55 np0005541913.localdomain podman[324586]: 2025-12-02 10:09:55.241650775 +0000 UTC m=+0.176005592 container start c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:55 np0005541913.localdomain dnsmasq[324604]: started, version 2.85 cachesize 150
Dec 02 10:09:55 np0005541913.localdomain dnsmasq[324604]: DNS service limited to local subnets
Dec 02 10:09:55 np0005541913.localdomain dnsmasq[324604]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:55 np0005541913.localdomain dnsmasq[324604]: warning: no upstream servers configured
Dec 02 10:09:55 np0005541913.localdomain dnsmasq-dhcp[324604]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:09:55 np0005541913.localdomain dnsmasq[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/addn_hosts - 0 addresses
Dec 02 10:09:55 np0005541913.localdomain dnsmasq-dhcp[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/host
Dec 02 10:09:55 np0005541913.localdomain dnsmasq-dhcp[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/opts
Dec 02 10:09:55 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:55.252 2 INFO neutron.agent.securitygroups_rpc [None req-4940f51c-3349-4656-978b-9a0b4cd29cb9 2903ef7b8c704dc09be34f96aeda2cff 6d11f96a2f644a22a82a6af9a2a1e5d2 - - default default] Security group member updated ['2e0224f5-51f6-419e-8240-7e06ddf53ec7']
Dec 02 10:09:55 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:55.290 2 INFO neutron.agent.securitygroups_rpc [None req-d1fab671-1814-41db-9614-65c239fa9e70 6a4701e292e04a82a827d127f0ef5b65 0b7e671d1f944c979f6feba0246d3141 - - default default] Security group member updated ['274309be-bd70-4043-9459-2a1d0784f871']
Dec 02 10:09:55 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:55.431 2 INFO neutron.agent.securitygroups_rpc [None req-4940f51c-3349-4656-978b-9a0b4cd29cb9 2903ef7b8c704dc09be34f96aeda2cff 6d11f96a2f644a22a82a6af9a2a1e5d2 - - default default] Security group member updated ['2e0224f5-51f6-419e-8240-7e06ddf53ec7']
Dec 02 10:09:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:55.432 263406 INFO neutron.agent.dhcp.agent [None req-f7d0b580-ff82-41b4-865d-1e55b256aa67 - - - - - -] DHCP configuration for ports {'e183ec07-efa1-48df-8f31-d919e8b5836e'} is completed
Dec 02 10:09:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:09:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:09:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:56.024 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:56.077 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:09:56 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:56.308 2 INFO neutron.agent.securitygroups_rpc [None req-2709b5dd-db11-4508-a989-29103dd3702e 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:56 np0005541913.localdomain dnsmasq[324387]: exiting on receipt of SIGTERM
Dec 02 10:09:56 np0005541913.localdomain podman[324623]: 2025-12-02 10:09:56.431037951 +0000 UTC m=+0.061598672 container kill 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:09:56 np0005541913.localdomain systemd[1]: libpod-79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b.scope: Deactivated successfully.
Dec 02 10:09:56 np0005541913.localdomain podman[324635]: 2025-12-02 10:09:56.501670363 +0000 UTC m=+0.056676601 container died 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:09:56 np0005541913.localdomain podman[324635]: 2025-12-02 10:09:56.5398008 +0000 UTC m=+0.094806998 container cleanup 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 10:09:56 np0005541913.localdomain systemd[1]: libpod-conmon-79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b.scope: Deactivated successfully.
Dec 02 10:09:56 np0005541913.localdomain podman[324642]: 2025-12-02 10:09:56.588068396 +0000 UTC m=+0.128653940 container remove 79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:09:56 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:56.640 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:55Z, description=, device_id=7059c3fd-a028-4cdb-9894-b6db3dc33369, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d0b80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d0b50>], id=80164a20-3f5d-4eea-94e1-e26ceeea882c, ip_allocation=immediate, mac_address=fa:16:3e:04:14:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:09:47Z, description=, dns_domain=, id=5f48cce7-247c-4b5d-8287-ac14f7453254, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-897890507-network, port_security_enabled=True, project_id=5bad680c763640dba71a7865b355817c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30386, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2315, status=ACTIVE, subnets=['424417bc-1ee7-4d11-9ebf-680585d829a5'], tags=[], tenant_id=5bad680c763640dba71a7865b355817c, updated_at=2025-12-02T10:09:49Z, vlan_transparent=None, network_id=5f48cce7-247c-4b5d-8287-ac14f7453254, port_security_enabled=False, project_id=5bad680c763640dba71a7865b355817c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2357, status=DOWN, tags=[], tenant_id=5bad680c763640dba71a7865b355817c, updated_at=2025-12-02T10:09:56Z on network 5f48cce7-247c-4b5d-8287-ac14f7453254
Dec 02 10:09:56 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:56.648 2 INFO neutron.agent.securitygroups_rpc [None req-4752b673-ecc5-46d9-8169-f464ead4adc9 6a4701e292e04a82a827d127f0ef5b65 0b7e671d1f944c979f6feba0246d3141 - - default default] Security group member updated ['274309be-bd70-4043-9459-2a1d0784f871']
Dec 02 10:09:56 np0005541913.localdomain ceph-mon[298296]: pgmap v361: 177 pgs: 177 active+clean; 146 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 16 KiB/s wr, 5 op/s
Dec 02 10:09:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cc6380cd-d1fe-41c0-9f77-54a6bc7687ef", "format": "json"}]: dispatch
Dec 02 10:09:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cc6380cd-d1fe-41c0-9f77-54a6bc7687ef", "force": true, "format": "json"}]: dispatch
Dec 02 10:09:56 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:56.666 2 INFO neutron.agent.securitygroups_rpc [None req-237e1b13-6077-411f-87b4-3c14ff8061ce 2903ef7b8c704dc09be34f96aeda2cff 6d11f96a2f644a22a82a6af9a2a1e5d2 - - default default] Security group member updated ['2e0224f5-51f6-419e-8240-7e06ddf53ec7']
Dec 02 10:09:56 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:56.770 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:56Z, description=, device_id=f6d749d1-1fc5-4651-88a9-dade37b0c49d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a0ff70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087ee790>], id=49e6ac27-4822-4f2e-9d02-eb159ad3a2f0, ip_allocation=immediate, mac_address=fa:16:3e:b3:4f:dd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:09:50Z, description=, dns_domain=, id=5198bb66-dd27-48f3-9334-ab53b7335bc8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1795840910, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64348, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2327, status=ACTIVE, subnets=['b62f2cc5-7e42-4784-b31d-7caa26c4d241'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:52Z, vlan_transparent=None, network_id=5198bb66-dd27-48f3-9334-ab53b7335bc8, port_security_enabled=False, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2358, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:56Z on network 5198bb66-dd27-48f3-9334-ab53b7335bc8
Dec 02 10:09:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:56.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:56.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:09:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b078218d8ff167202cb16b55a3c321df829dc784041145cd6eb709858f6ad54d-merged.mount: Deactivated successfully.
Dec 02 10:09:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79150746a9216edc8205f684f273bce18a18959b91236e548172f664aa0c724b-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:56 np0005541913.localdomain dnsmasq[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/addn_hosts - 1 addresses
Dec 02 10:09:56 np0005541913.localdomain dnsmasq-dhcp[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/host
Dec 02 10:09:56 np0005541913.localdomain dnsmasq-dhcp[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/opts
Dec 02 10:09:56 np0005541913.localdomain podman[324702]: 2025-12-02 10:09:56.996791908 +0000 UTC m=+0.066578866 container kill c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:09:57 np0005541913.localdomain podman[324743]: 
Dec 02 10:09:57 np0005541913.localdomain podman[324743]: 2025-12-02 10:09:57.155566829 +0000 UTC m=+0.105217375 container create ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:57 np0005541913.localdomain systemd[1]: Started libpod-conmon-ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829.scope.
Dec 02 10:09:57 np0005541913.localdomain podman[324743]: 2025-12-02 10:09:57.113762935 +0000 UTC m=+0.063413461 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:57 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:57 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/615057096d3c152d4d1f671a5bd383cdd6ec5d4ac33268b4ace59e5fc7761d1b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:57 np0005541913.localdomain podman[324743]: 2025-12-02 10:09:57.233803063 +0000 UTC m=+0.183453589 container init ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:09:57 np0005541913.localdomain podman[324743]: 2025-12-02 10:09:57.243482382 +0000 UTC m=+0.193132998 container start ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324773]: started, version 2.85 cachesize 150
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324773]: DNS service limited to local subnets
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324773]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324773]: warning: no upstream servers configured
Dec 02 10:09:57 np0005541913.localdomain dnsmasq-dhcp[324773]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/addn_hosts - 1 addresses
Dec 02 10:09:57 np0005541913.localdomain dnsmasq-dhcp[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/host
Dec 02 10:09:57 np0005541913.localdomain dnsmasq-dhcp[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/opts
Dec 02 10:09:57 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:57.291 263406 INFO neutron.agent.dhcp.agent [None req-2336ef4e-d482-43e2-8a28-f574fcaa88f4 - - - - - -] DHCP configuration for ports {'49e6ac27-4822-4f2e-9d02-eb159ad3a2f0'} is completed
Dec 02 10:09:57 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:57.442 2 INFO neutron.agent.securitygroups_rpc [None req-84ee1250-9ce8-4943-9e17-d2eb70522c28 2903ef7b8c704dc09be34f96aeda2cff 6d11f96a2f644a22a82a6af9a2a1e5d2 - - default default] Security group member updated ['2e0224f5-51f6-419e-8240-7e06ddf53ec7']
Dec 02 10:09:57 np0005541913.localdomain podman[324804]: 
Dec 02 10:09:57 np0005541913.localdomain podman[324804]: 2025-12-02 10:09:57.567005333 +0000 UTC m=+0.092235629 container create e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:09:57 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:57.576 263406 INFO neutron.agent.dhcp.agent [None req-5bf11345-04cb-4b1b-b13f-a78f9fed4276 - - - - - -] DHCP configuration for ports {'80164a20-3f5d-4eea-94e1-e26ceeea882c'} is completed
Dec 02 10:09:57 np0005541913.localdomain systemd[1]: Started libpod-conmon-e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978.scope.
Dec 02 10:09:57 np0005541913.localdomain podman[324804]: 2025-12-02 10:09:57.522001144 +0000 UTC m=+0.047231470 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:09:57 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:09:57 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6c5b19e6915c8546eb393f0137d1cfffca610aac4aee4335fd5750356f1d42d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:09:57 np0005541913.localdomain podman[324804]: 2025-12-02 10:09:57.637522873 +0000 UTC m=+0.162753169 container init e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:09:57 np0005541913.localdomain podman[324804]: 2025-12-02 10:09:57.647603331 +0000 UTC m=+0.172833627 container start e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324822]: started, version 2.85 cachesize 150
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324822]: DNS service limited to local subnets
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324822]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324822]: warning: no upstream servers configured
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324822]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:09:57 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:57.653 2 INFO neutron.agent.securitygroups_rpc [None req-f449fe39-4274-4abb-aff3-e3ba219c9fe2 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:57 np0005541913.localdomain ceph-mon[298296]: pgmap v362: 177 pgs: 177 active+clean; 146 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 16 KiB/s wr, 5 op/s
Dec 02 10:09:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:57.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:57.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:09:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:57.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:09:57 np0005541913.localdomain systemd[1]: tmp-crun.Ai3RmK.mount: Deactivated successfully.
Dec 02 10:09:57 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:57.898 263406 INFO neutron.agent.dhcp.agent [None req-a19147eb-1323-426a-9512-386b3545cb86 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '92c576f1-7fec-4bf8-bac6-a5142e33525f'} is completed
Dec 02 10:09:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:57.947 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:09:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:57.948 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:09:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:57.949 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:09:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:57.949 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:09:57 np0005541913.localdomain dnsmasq[324822]: exiting on receipt of SIGTERM
Dec 02 10:09:57 np0005541913.localdomain systemd[1]: tmp-crun.Xzsmy8.mount: Deactivated successfully.
Dec 02 10:09:57 np0005541913.localdomain systemd[1]: libpod-e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978.scope: Deactivated successfully.
Dec 02 10:09:57 np0005541913.localdomain podman[324840]: 2025-12-02 10:09:57.974491963 +0000 UTC m=+0.070321155 container kill e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:09:58 np0005541913.localdomain podman[324852]: 2025-12-02 10:09:58.047119618 +0000 UTC m=+0.060854773 container died e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:09:58 np0005541913.localdomain podman[324852]: 2025-12-02 10:09:58.110315312 +0000 UTC m=+0.124050407 container cleanup e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:09:58 np0005541913.localdomain systemd[1]: libpod-conmon-e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978.scope: Deactivated successfully.
Dec 02 10:09:58 np0005541913.localdomain podman[324859]: 2025-12-02 10:09:58.174522953 +0000 UTC m=+0.175244780 container remove e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:09:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:58Z|00424|binding|INFO|Releasing lport 92c576f1-7fec-4bf8-bac6-a5142e33525f from this chassis (sb_readonly=0)
Dec 02 10:09:58 np0005541913.localdomain kernel: device tap92c576f1-7f left promiscuous mode
Dec 02 10:09:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:58Z|00425|binding|INFO|Setting lport 92c576f1-7fec-4bf8-bac6-a5142e33525f down in Southbound
Dec 02 10:09:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:58.189 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:58.198 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fed5:4cac/64 2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=92c576f1-7fec-4bf8-bac6-a5142e33525f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:58.200 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 92c576f1-7fec-4bf8-bac6-a5142e33525f in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:09:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:58.203 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:58.204 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[94fec0b5-18a0-42f7-bfd6-78a187b4c10c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:58.217 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:58 np0005541913.localdomain sshd[324883]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:09:58 np0005541913.localdomain sshd[324883]: Connection closed by 172.104.11.34 port 40326 [preauth]
Dec 02 10:09:58 np0005541913.localdomain sshd[324885]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:09:58 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:09:58.481 2 INFO neutron.agent.securitygroups_rpc [None req-c9840334-22ed-4fcf-9fb8-d440584d45ac 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:09:58 np0005541913.localdomain sshd[324885]: Connection closed by 172.104.11.34 port 40332 [preauth]
Dec 02 10:09:58 np0005541913.localdomain sshd[324887]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:09:58 np0005541913.localdomain sshd[324887]: Connection closed by 172.104.11.34 port 40340 [preauth]
Dec 02 10:09:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:09:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:09:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:09:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:09:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e6c5b19e6915c8546eb393f0137d1cfffca610aac4aee4335fd5750356f1d42d-merged.mount: Deactivated successfully.
Dec 02 10:09:58 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1bfd94cb54279b9c7451b5402b6c2019cb751bd2299790659c1e363c08b6978-userdata-shm.mount: Deactivated successfully.
Dec 02 10:09:58 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:09:58 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:58.950 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:56Z, description=, device_id=f6d749d1-1fc5-4651-88a9-dade37b0c49d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088b0490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088b06a0>], id=49e6ac27-4822-4f2e-9d02-eb159ad3a2f0, ip_allocation=immediate, mac_address=fa:16:3e:b3:4f:dd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:09:50Z, description=, dns_domain=, id=5198bb66-dd27-48f3-9334-ab53b7335bc8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1795840910, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64348, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2327, status=ACTIVE, subnets=['b62f2cc5-7e42-4784-b31d-7caa26c4d241'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:52Z, vlan_transparent=None, network_id=5198bb66-dd27-48f3-9334-ab53b7335bc8, port_security_enabled=False, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2358, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:09:56Z on network 5198bb66-dd27-48f3-9334-ab53b7335bc8
Dec 02 10:09:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:58.969 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:09:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:58.993 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:09:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:58.993 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:09:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:58.993 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.014 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.015 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.016 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.016 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.017 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:09:59 np0005541913.localdomain dnsmasq[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/addn_hosts - 1 addresses
Dec 02 10:09:59 np0005541913.localdomain dnsmasq-dhcp[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/host
Dec 02 10:09:59 np0005541913.localdomain podman[324907]: 2025-12-02 10:09:59.178009936 +0000 UTC m=+0.063005440 container kill c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:09:59 np0005541913.localdomain dnsmasq-dhcp[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/opts
Dec 02 10:09:59 np0005541913.localdomain systemd[1]: tmp-crun.fgOlCr.mount: Deactivated successfully.
Dec 02 10:09:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:59.286 263406 INFO neutron.agent.linux.ip_lib [None req-546f31c4-5bf8-41b9-9961-fb9737511c5f - - - - - -] Device tap6cf9ab6e-02 cannot be used as it has no MAC address
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.367 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:59 np0005541913.localdomain kernel: device tap6cf9ab6e-02 entered promiscuous mode
Dec 02 10:09:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:59Z|00426|binding|INFO|Claiming lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b for this chassis.
Dec 02 10:09:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:59Z|00427|binding|INFO|6cf9ab6e-0266-437d-b907-6e6749aa6e0b: Claiming unknown
Dec 02 10:09:59 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670199.3749] manager: (tap6cf9ab6e-02): new Generic device (/org/freedesktop/NetworkManager/Devices/69)
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.377 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:59 np0005541913.localdomain systemd-udevd[324956]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:09:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:59Z|00428|binding|INFO|Setting lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b ovn-installed in OVS
Dec 02 10:09:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:09:59Z|00429|binding|INFO|Setting lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b up in Southbound
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.386 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:59 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:59.387 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=6cf9ab6e-0266-437d-b907-6e6749aa6e0b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.389 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:59 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:59.391 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 6cf9ab6e-0266-437d-b907-6e6749aa6e0b in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:09:59 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:59.394 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1ac3f767-f224-4e0f-9781-32978a5bc943 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:09:59 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:59.395 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:09:59 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:09:59.399 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc56732-e173-4a4a-8f62-c3e78a222e46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.421 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:59 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:09:59 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2398904436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.464 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.466 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:09:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:59.468 263406 INFO neutron.agent.dhcp.agent [None req-dc9d0775-f91f-4900-abb7-f0a752e25a49 - - - - - -] DHCP configuration for ports {'49e6ac27-4822-4f2e-9d02-eb159ad3a2f0'} is completed
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.503 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.539 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.539 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.702 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.703 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11207MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.703 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.703 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:09:59 np0005541913.localdomain ceph-mon[298296]: pgmap v363: 177 pgs: 177 active+clean; 147 MiB data, 816 MiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 30 KiB/s wr, 9 op/s
Dec 02 10:09:59 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2398904436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:09:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:09:59.819 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:09:55Z, description=, device_id=7059c3fd-a028-4cdb-9894-b6db3dc33369, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908781880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908781310>], id=80164a20-3f5d-4eea-94e1-e26ceeea882c, ip_allocation=immediate, mac_address=fa:16:3e:04:14:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:09:47Z, description=, dns_domain=, id=5f48cce7-247c-4b5d-8287-ac14f7453254, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-897890507-network, port_security_enabled=True, project_id=5bad680c763640dba71a7865b355817c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30386, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2315, status=ACTIVE, subnets=['424417bc-1ee7-4d11-9ebf-680585d829a5'], tags=[], tenant_id=5bad680c763640dba71a7865b355817c, updated_at=2025-12-02T10:09:49Z, vlan_transparent=None, network_id=5f48cce7-247c-4b5d-8287-ac14f7453254, port_security_enabled=False, project_id=5bad680c763640dba71a7865b355817c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2357, status=DOWN, tags=[], tenant_id=5bad680c763640dba71a7865b355817c, updated_at=2025-12-02T10:09:56Z on network 5f48cce7-247c-4b5d-8287-ac14f7453254
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.827 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.828 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.828 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:09:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:09:59.868 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:10:00 np0005541913.localdomain systemd[1]: tmp-crun.zNIkIb.mount: Deactivated successfully.
Dec 02 10:10:00 np0005541913.localdomain dnsmasq[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/addn_hosts - 1 addresses
Dec 02 10:10:00 np0005541913.localdomain dnsmasq-dhcp[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/host
Dec 02 10:10:00 np0005541913.localdomain podman[325009]: 2025-12-02 10:10:00.032477296 +0000 UTC m=+0.069080871 container kill ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:10:00 np0005541913.localdomain dnsmasq-dhcp[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/opts
Dec 02 10:10:00 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:00.213 263406 INFO neutron.agent.dhcp.agent [None req-50696e38-84b6-4013-a7df-6126c21f36f8 - - - - - -] DHCP configuration for ports {'80164a20-3f5d-4eea-94e1-e26ceeea882c'} is completed
Dec 02 10:10:00 np0005541913.localdomain podman[325072]: 
Dec 02 10:10:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:00.322 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:10:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:00.328 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:10:00 np0005541913.localdomain podman[325072]: 2025-12-02 10:10:00.32905001 +0000 UTC m=+0.075174073 container create dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:10:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:00.344 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:10:00 np0005541913.localdomain systemd[1]: Started libpod-conmon-dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b.scope.
Dec 02 10:10:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:00.370 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:10:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:00.371 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:10:00 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:00 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b4a62014718597189450d4d03bb09a5957565165ae07f83f474537569bc374c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:00 np0005541913.localdomain podman[325072]: 2025-12-02 10:10:00.382987758 +0000 UTC m=+0.129111791 container init dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:00 np0005541913.localdomain podman[325072]: 2025-12-02 10:10:00.388814853 +0000 UTC m=+0.134938886 container start dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:10:00 np0005541913.localdomain podman[325072]: 2025-12-02 10:10:00.29901824 +0000 UTC m=+0.045142293 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:00 np0005541913.localdomain dnsmasq[325091]: started, version 2.85 cachesize 150
Dec 02 10:10:00 np0005541913.localdomain dnsmasq[325091]: DNS service limited to local subnets
Dec 02 10:10:00 np0005541913.localdomain dnsmasq[325091]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:00 np0005541913.localdomain dnsmasq[325091]: warning: no upstream servers configured
Dec 02 10:10:00 np0005541913.localdomain dnsmasq-dhcp[325091]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:10:00 np0005541913.localdomain dnsmasq[325091]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:10:00 np0005541913.localdomain dnsmasq-dhcp[325091]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:10:00 np0005541913.localdomain dnsmasq-dhcp[325091]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:10:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:00Z|00430|binding|INFO|Releasing lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b from this chassis (sb_readonly=0)
Dec 02 10:10:00 np0005541913.localdomain kernel: device tap6cf9ab6e-02 left promiscuous mode
Dec 02 10:10:00 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:00Z|00431|binding|INFO|Setting lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b down in Southbound
Dec 02 10:10:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:00.524 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:00.531 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=6cf9ab6e-0266-437d-b907-6e6749aa6e0b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:00.532 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 6cf9ab6e-0266-437d-b907-6e6749aa6e0b in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:10:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:00.534 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:10:00 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:00.535 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[21ba40b4-a4bc-43c9-88f4-15ea42832350]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:00.544 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:00.545 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:00 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:00.551 263406 INFO neutron.agent.dhcp.agent [None req-a4f6292e-605d-4f4d-bb3c-256e0eaab931 - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955'} is completed
Dec 02 10:10:00 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:00.707 2 INFO neutron.agent.securitygroups_rpc [None req-3e03e95a-f561-4345-b792-65b4ec75916c 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "05ca7661-f391-4234-9c50-a2000ddc14bd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:10:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "05ca7661-f391-4234-9c50-a2000ddc14bd", "format": "json"}]: dispatch
Dec 02 10:10:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:10:00 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 10:10:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/740472713' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:10:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:01.028 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:01.078 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:10:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:10:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:10:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:10:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 02 10:10:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:10:01 np0005541913.localdomain ceph-mon[298296]: pgmap v364: 177 pgs: 177 active+clean; 147 MiB data, 816 MiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 22 KiB/s wr, 6 op/s
Dec 02 10:10:02 np0005541913.localdomain dnsmasq[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/addn_hosts - 0 addresses
Dec 02 10:10:02 np0005541913.localdomain dnsmasq-dhcp[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/host
Dec 02 10:10:02 np0005541913.localdomain dnsmasq-dhcp[324604]: read /var/lib/neutron/dhcp/5198bb66-dd27-48f3-9334-ab53b7335bc8/opts
Dec 02 10:10:02 np0005541913.localdomain podman[325111]: 2025-12-02 10:10:02.270806755 +0000 UTC m=+0.040578772 container kill c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:02 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:10:02 np0005541913.localdomain systemd[1]: tmp-crun.10D4YQ.mount: Deactivated successfully.
Dec 02 10:10:02 np0005541913.localdomain podman[325125]: 2025-12-02 10:10:02.346738939 +0000 UTC m=+0.059168078 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:02 np0005541913.localdomain podman[325125]: 2025-12-02 10:10:02.359041307 +0000 UTC m=+0.071470406 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:02 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:10:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:02.438 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:02 np0005541913.localdomain kernel: device tapb822caec-0f left promiscuous mode
Dec 02 10:10:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:02Z|00432|binding|INFO|Releasing lport b822caec-0ff2-45ee-8f19-f63f7afb253f from this chassis (sb_readonly=0)
Dec 02 10:10:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:02Z|00433|binding|INFO|Setting lport b822caec-0ff2-45ee-8f19-f63f7afb253f down in Southbound
Dec 02 10:10:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:02.445 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-5198bb66-dd27-48f3-9334-ab53b7335bc8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5198bb66-dd27-48f3-9334-ab53b7335bc8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e4dc3a6-b635-4672-b1fb-6fb25996b32e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=b822caec-0ff2-45ee-8f19-f63f7afb253f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:02.447 160221 INFO neutron.agent.ovn.metadata.agent [-] Port b822caec-0ff2-45ee-8f19-f63f7afb253f in datapath 5198bb66-dd27-48f3-9334-ab53b7335bc8 unbound from our chassis
Dec 02 10:10:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:02.448 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5198bb66-dd27-48f3-9334-ab53b7335bc8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:10:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:02.449 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[04701ea4-523b-4921-9730-f970fa223c05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:02.460 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:02 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:02.847 263406 INFO neutron.agent.linux.ip_lib [None req-4c811050-eaf6-4394-8106-82618af8c12d - - - - - -] Device tapc38cde41-ba cannot be used as it has no MAC address
Dec 02 10:10:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:02.921 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:02 np0005541913.localdomain kernel: device tapc38cde41-ba entered promiscuous mode
Dec 02 10:10:02 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670202.9314] manager: (tapc38cde41-ba): new Generic device (/org/freedesktop/NetworkManager/Devices/70)
Dec 02 10:10:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:02.930 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:02Z|00434|binding|INFO|Claiming lport c38cde41-bad1-4f80-97fd-24542d25f8b3 for this chassis.
Dec 02 10:10:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:02Z|00435|binding|INFO|c38cde41-bad1-4f80-97fd-24542d25f8b3: Claiming unknown
Dec 02 10:10:02 np0005541913.localdomain systemd-udevd[325162]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:10:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:02.946 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:02.945 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d11f96a2f644a22a82a6af9a2a1e5d2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5d7f4c8-6f79-49e1-9d65-a5856ac6b73d, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c38cde41-bad1-4f80-97fd-24542d25f8b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:02.948 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c38cde41-bad1-4f80-97fd-24542d25f8b3 in datapath 41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd bound to our chassis
Dec 02 10:10:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:02Z|00436|binding|INFO|Setting lport c38cde41-bad1-4f80-97fd-24542d25f8b3 ovn-installed in OVS
Dec 02 10:10:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:02Z|00437|binding|INFO|Setting lport c38cde41-bad1-4f80-97fd-24542d25f8b3 up in Southbound
Dec 02 10:10:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:02.951 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:10:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:02.952 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:02.952 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[2a257ef1-5566-46ca-a63d-e17977fa0829]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:02 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc38cde41-ba: No such device
Dec 02 10:10:02 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc38cde41-ba: No such device
Dec 02 10:10:02 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc38cde41-ba: No such device
Dec 02 10:10:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:02.968 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:02 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc38cde41-ba: No such device
Dec 02 10:10:02 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc38cde41-ba: No such device
Dec 02 10:10:02 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc38cde41-ba: No such device
Dec 02 10:10:02 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc38cde41-ba: No such device
Dec 02 10:10:02 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc38cde41-ba: No such device
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.005 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.035 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:03.053 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:10:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:03.053 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:10:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:03.054 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:10:03 np0005541913.localdomain dnsmasq[325091]: exiting on receipt of SIGTERM
Dec 02 10:10:03 np0005541913.localdomain podman[325209]: 2025-12-02 10:10:03.175773472 +0000 UTC m=+0.064403067 container kill dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:10:03 np0005541913.localdomain systemd[1]: libpod-dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b.scope: Deactivated successfully.
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.205 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:03 np0005541913.localdomain podman[325228]: 2025-12-02 10:10:03.244933985 +0000 UTC m=+0.046924962 container died dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:10:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-4b4a62014718597189450d4d03bb09a5957565165ae07f83f474537569bc374c-merged.mount: Deactivated successfully.
Dec 02 10:10:03 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b-userdata-shm.mount: Deactivated successfully.
Dec 02 10:10:03 np0005541913.localdomain podman[325228]: 2025-12-02 10:10:03.29391674 +0000 UTC m=+0.095907687 container remove dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:10:03 np0005541913.localdomain systemd[1]: libpod-conmon-dae6d72e9c18774b5e51d9e39587d0d8c31c301c25915e3b0f4617851eb2357b.scope: Deactivated successfully.
Dec 02 10:10:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:03.385 263406 INFO neutron.agent.linux.ip_lib [None req-9ae0b500-4ffd-4e9d-8c95-6328544983eb - - - - - -] Device tap6cf9ab6e-02 cannot be used as it has no MAC address
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.417 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain kernel: device tap6cf9ab6e-02 entered promiscuous mode
Dec 02 10:10:03 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670203.4224] manager: (tap6cf9ab6e-02): new Generic device (/org/freedesktop/NetworkManager/Devices/71)
Dec 02 10:10:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:03Z|00438|binding|INFO|Claiming lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b for this chassis.
Dec 02 10:10:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:03Z|00439|binding|INFO|6cf9ab6e-0266-437d-b907-6e6749aa6e0b: Claiming unknown
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.422 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain systemd-udevd[325165]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:10:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:03Z|00440|binding|INFO|Setting lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b ovn-installed in OVS
Dec 02 10:10:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:03Z|00441|binding|INFO|Setting lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b up in Southbound
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.429 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.432 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:03.432 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feed:7a24/64 2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=6cf9ab6e-0266-437d-b907-6e6749aa6e0b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:03.435 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 6cf9ab6e-0266-437d-b907-6e6749aa6e0b in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 bound to our chassis
Dec 02 10:10:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:03.439 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1ac3f767-f224-4e0f-9781-32978a5bc943 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:10:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:03.440 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:10:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:03.441 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[94adb680-4c97-4d91-8410-2cbb60f5ca8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.469 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.517 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.546 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:03 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:03.550 2 INFO neutron.agent.securitygroups_rpc [None req-7ff6a690-1608-4241-962d-cf0eb5f2eb30 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:10:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:03.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:04 np0005541913.localdomain podman[325335]: 
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[324604]: exiting on receipt of SIGTERM
Dec 02 10:10:04 np0005541913.localdomain podman[325343]: 2025-12-02 10:10:04.027267993 +0000 UTC m=+0.077875646 container kill c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: libpod-c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5.scope: Deactivated successfully.
Dec 02 10:10:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:10:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:10:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:10:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:10:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:10:04 np0005541913.localdomain podman[325335]: 2025-12-02 10:10:04.060404556 +0000 UTC m=+0.136144839 container create b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:04 np0005541913.localdomain podman[325335]: 2025-12-02 10:10:03.972081573 +0000 UTC m=+0.047821926 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:10:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: Started libpod-conmon-b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7.scope.
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:04 np0005541913.localdomain podman[325370]: 2025-12-02 10:10:04.1363794 +0000 UTC m=+0.085129759 container died c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:10:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ead580261bee1ee4aad75a4d864a42a4a9d764055292bd9533e5e426eea53df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:04 np0005541913.localdomain podman[325335]: 2025-12-02 10:10:04.144934628 +0000 UTC m=+0.220674901 container init b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 02 10:10:04 np0005541913.localdomain podman[325335]: 2025-12-02 10:10:04.15734909 +0000 UTC m=+0.233089373 container start b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325398]: started, version 2.85 cachesize 150
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325398]: DNS service limited to local subnets
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325398]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325398]: warning: no upstream servers configured
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325398]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325398]: read /var/lib/neutron/dhcp/41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd/addn_hosts - 0 addresses
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325398]: read /var/lib/neutron/dhcp/41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd/host
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325398]: read /var/lib/neutron/dhcp/41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd/opts
Dec 02 10:10:04 np0005541913.localdomain podman[325370]: 2025-12-02 10:10:04.179147541 +0000 UTC m=+0.127897870 container remove c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5198bb66-dd27-48f3-9334-ab53b7335bc8, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: libpod-conmon-c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5.scope: Deactivated successfully.
Dec 02 10:10:04 np0005541913.localdomain ceph-mon[298296]: pgmap v365: 177 pgs: 177 active+clean; 147 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 33 KiB/s wr, 9 op/s
Dec 02 10:10:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "05ca7661-f391-4234-9c50-a2000ddc14bd", "format": "json"}]: dispatch
Dec 02 10:10:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "05ca7661-f391-4234-9c50-a2000ddc14bd", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c1c0a1d963187de3e3c282bd1aa2fa59ad7e13adf0d6aa09786d03fa9f921cd1-merged.mount: Deactivated successfully.
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3b4269865201f3fcf2d79f914e08f06684ae4410385bd9261992e76dad564f5-userdata-shm.mount: Deactivated successfully.
Dec 02 10:10:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:04.366 263406 INFO neutron.agent.dhcp.agent [None req-3fc1f4d3-a5f2-44fd-badd-d1878b0bc4b8 - - - - - -] DHCP configuration for ports {'c5921849-40fe-4620-9d7d-bf845e6fc8f7'} is completed
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d5198bb66\x2ddd27\x2d48f3\x2d9334\x2dab53b7335bc8.mount: Deactivated successfully.
Dec 02 10:10:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:04.404 263406 INFO neutron.agent.dhcp.agent [None req-b8fb038f-ce8a-4e7b-9b19-e65c62cf9f81 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:04.405 263406 INFO neutron.agent.dhcp.agent [None req-b8fb038f-ce8a-4e7b-9b19-e65c62cf9f81 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:04 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:04.407 2 INFO neutron.agent.securitygroups_rpc [None req-90a76f3a-a979-402e-97fd-700e856a8199 7602b6bff04a41118e902187d8f95daa 39113116e26e4da3a6194d2f44d952a8 - - default default] Security group member updated ['062c5d07-6a15-41a5-85bf-27aede3f5276']
Dec 02 10:10:04 np0005541913.localdomain podman[325421]: 
Dec 02 10:10:04 np0005541913.localdomain podman[325421]: 2025-12-02 10:10:04.425727922 +0000 UTC m=+0.099820252 container create 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: Started libpod-conmon-23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050.scope.
Dec 02 10:10:04 np0005541913.localdomain podman[325421]: 2025-12-02 10:10:04.378318838 +0000 UTC m=+0.052411238 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:04 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:04 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f8d065ef74c831797f48f3e133ab46edcfa70b7eeb8c3d98871e70265859073/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:04 np0005541913.localdomain podman[325421]: 2025-12-02 10:10:04.497306259 +0000 UTC m=+0.171398589 container init 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:10:04 np0005541913.localdomain podman[325421]: 2025-12-02 10:10:04.503640548 +0000 UTC m=+0.177732888 container start 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325439]: started, version 2.85 cachesize 150
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325439]: DNS service limited to local subnets
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325439]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325439]: warning: no upstream servers configured
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325439]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325439]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:10:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:04.549 263406 INFO neutron.agent.dhcp.agent [None req-9ae0b500-4ffd-4e9d-8c95-6328544983eb - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:10:03Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088bc040>, <neutron.agent.linux.dhcp.DictModel object at 0x7f99088bcaf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990882e700>, <neutron.agent.linux.dhcp.DictModel object at 0x7f990882e9a0>], id=e29fa419-e9ff-497c-948a-66ee2d0016ec, ip_allocation=immediate, mac_address=fa:16:3e:0f:cf:b4, name=tempest-NetworksTestDHCPv6-219671849, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:07:29Z, description=, dns_domain=, id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-53840882, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13733, qos_policy_id=None, revision_number=61, router:external=False, shared=False, standard_attr_id=1669, status=ACTIVE, subnets=['d338963b-de10-4ec8-85df-80dcebdfe976', 'f0b3b782-77ce-4779-87aa-9e999cbd999a'], tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:09:58Z, vlan_transparent=None, network_id=7d517d9d-ba68-4c0f-b344-6c3be9d614a4, port_security_enabled=True, project_id=39113116e26e4da3a6194d2f44d952a8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['062c5d07-6a15-41a5-85bf-27aede3f5276'], standard_attr_id=2382, status=DOWN, tags=[], tenant_id=39113116e26e4da3a6194d2f44d952a8, updated_at=2025-12-02T10:10:03Z on network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4
Dec 02 10:10:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:04.645 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:04 np0005541913.localdomain podman[325458]: 2025-12-02 10:10:04.750923578 +0000 UTC m=+0.059155347 container kill 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:10:04 np0005541913.localdomain dnsmasq[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 2 addresses
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:10:04 np0005541913.localdomain dnsmasq-dhcp[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:10:04 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:04.928 2 INFO neutron.agent.securitygroups_rpc [None req-c77326cb-1074-4872-8ee0-28f281df7dfe 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:04.969 263406 INFO neutron.agent.dhcp.agent [None req-cd4d733a-b346-46d0-b65e-141f452d51fe - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '6cf9ab6e-0266-437d-b907-6e6749aa6e0b'} is completed
Dec 02 10:10:05 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:05.084 263406 INFO neutron.agent.dhcp.agent [None req-b2c039fa-be99-4594-bbf1-d5d7ab02bb29 - - - - - -] DHCP configuration for ports {'e29fa419-e9ff-497c-948a-66ee2d0016ec'} is completed
Dec 02 10:10:05 np0005541913.localdomain dnsmasq[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:10:05 np0005541913.localdomain dnsmasq-dhcp[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:10:05 np0005541913.localdomain dnsmasq-dhcp[325439]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:10:05 np0005541913.localdomain podman[325495]: 2025-12-02 10:10:05.119761026 +0000 UTC m=+0.059357272 container kill 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:10:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:10:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:10:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:05 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:10:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1389731968' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:10:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1389731968' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:10:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:05Z|00442|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:10:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:05.438 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:05 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:05.448 2 INFO neutron.agent.securitygroups_rpc [None req-9cd60b66-d893-4669-ab17-1eefaaf90d0c 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:05 np0005541913.localdomain dnsmasq[325439]: exiting on receipt of SIGTERM
Dec 02 10:10:05 np0005541913.localdomain podman[325534]: 2025-12-02 10:10:05.635302485 +0000 UTC m=+0.056183258 container kill 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:05 np0005541913.localdomain systemd[1]: libpod-23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050.scope: Deactivated successfully.
Dec 02 10:10:05 np0005541913.localdomain podman[325548]: 2025-12-02 10:10:05.689572422 +0000 UTC m=+0.040895091 container died 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:05 np0005541913.localdomain podman[325548]: 2025-12-02 10:10:05.771387262 +0000 UTC m=+0.122709911 container cleanup 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:10:05 np0005541913.localdomain systemd[1]: libpod-conmon-23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050.scope: Deactivated successfully.
Dec 02 10:10:05 np0005541913.localdomain podman[325550]: 2025-12-02 10:10:05.790544822 +0000 UTC m=+0.131246678 container remove 23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:10:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:05.822 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:05.852 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:05.853 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:05.854 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:06.031 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:10:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:10:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:10:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157931 "" "Go-http-client/1.1"
Dec 02 10:10:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:06.081 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:10:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19722 "" "Go-http-client/1.1"
Dec 02 10:10:06 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:06.090 2 INFO neutron.agent.securitygroups_rpc [None req-9685cf0d-187e-491c-a3b3-f6b6113116e3 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:06 np0005541913.localdomain systemd[1]: tmp-crun.DXkfrn.mount: Deactivated successfully.
Dec 02 10:10:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-9f8d065ef74c831797f48f3e133ab46edcfa70b7eeb8c3d98871e70265859073-merged.mount: Deactivated successfully.
Dec 02 10:10:06 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23bf4b2b62bb0b900376b9d626ea3d411d4402f6fd716a4f4717d78a8a152050-userdata-shm.mount: Deactivated successfully.
Dec 02 10:10:06 np0005541913.localdomain ceph-mon[298296]: pgmap v366: 177 pgs: 177 active+clean; 147 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 24 KiB/s wr, 7 op/s
Dec 02 10:10:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3363842838' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:10:06 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:06.719 2 INFO neutron.agent.securitygroups_rpc [None req-3503b005-9d9f-48d0-a8b4-a51ac4fa455f 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:06 np0005541913.localdomain podman[325626]: 2025-12-02 10:10:06.634045441 +0000 UTC m=+0.046540942 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:07 np0005541913.localdomain podman[325626]: 2025-12-02 10:10:07.022913994 +0000 UTC m=+0.435409475 container create 35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: Started libpod-conmon-35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462.scope.
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:07 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/641192bd43133d9a6db497be22fcb6a0a7e6bed3bb5a2e42ffba83b803902053/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:07 np0005541913.localdomain podman[325626]: 2025-12-02 10:10:07.084998868 +0000 UTC m=+0.497494339 container init 35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:10:07 np0005541913.localdomain podman[325626]: 2025-12-02 10:10:07.094777889 +0000 UTC m=+0.507273370 container start 35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:07 np0005541913.localdomain dnsmasq[325657]: started, version 2.85 cachesize 150
Dec 02 10:10:07 np0005541913.localdomain dnsmasq[325657]: DNS service limited to local subnets
Dec 02 10:10:07 np0005541913.localdomain dnsmasq[325657]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:07 np0005541913.localdomain dnsmasq[325657]: warning: no upstream servers configured
Dec 02 10:10:07 np0005541913.localdomain dnsmasq-dhcp[325657]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 02 10:10:07 np0005541913.localdomain dnsmasq[325657]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/addn_hosts - 0 addresses
Dec 02 10:10:07 np0005541913.localdomain dnsmasq-dhcp[325657]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/host
Dec 02 10:10:07 np0005541913.localdomain dnsmasq-dhcp[325657]: read /var/lib/neutron/dhcp/7d517d9d-ba68-4c0f-b344-6c3be9d614a4/opts
Dec 02 10:10:07 np0005541913.localdomain podman[325641]: 2025-12-02 10:10:07.153045481 +0000 UTC m=+0.093958324 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:10:07 np0005541913.localdomain podman[325641]: 2025-12-02 10:10:07.189320748 +0000 UTC m=+0.130233481 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:10:07 np0005541913.localdomain podman[325668]: 2025-12-02 10:10:07.277868448 +0000 UTC m=+0.082504510 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:10:07 np0005541913.localdomain podman[325668]: 2025-12-02 10:10:07.350277078 +0000 UTC m=+0.154913130 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:10:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/4072896173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:10:07 np0005541913.localdomain dnsmasq[325398]: exiting on receipt of SIGTERM
Dec 02 10:10:07 np0005541913.localdomain podman[325708]: 2025-12-02 10:10:07.44867061 +0000 UTC m=+0.032173379 container kill b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: libpod-b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7.scope: Deactivated successfully.
Dec 02 10:10:07 np0005541913.localdomain podman[325722]: 2025-12-02 10:10:07.506745898 +0000 UTC m=+0.045455103 container died b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:10:07 np0005541913.localdomain podman[325722]: 2025-12-02 10:10:07.540539488 +0000 UTC m=+0.079248653 container cleanup b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: libpod-conmon-b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7.scope: Deactivated successfully.
Dec 02 10:10:07 np0005541913.localdomain podman[325723]: 2025-12-02 10:10:07.580154714 +0000 UTC m=+0.115662064 container remove b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:10:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:07Z|00443|binding|INFO|Releasing lport c38cde41-bad1-4f80-97fd-24542d25f8b3 from this chassis (sb_readonly=0)
Dec 02 10:10:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:07Z|00444|binding|INFO|Setting lport c38cde41-bad1-4f80-97fd-24542d25f8b3 down in Southbound
Dec 02 10:10:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:07.590 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:07 np0005541913.localdomain kernel: device tapc38cde41-ba left promiscuous mode
Dec 02 10:10:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:07.616 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:07.675 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d11f96a2f644a22a82a6af9a2a1e5d2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5d7f4c8-6f79-49e1-9d65-a5856ac6b73d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c38cde41-bad1-4f80-97fd-24542d25f8b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:07.677 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c38cde41-bad1-4f80-97fd-24542d25f8b3 in datapath 41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd unbound from our chassis
Dec 02 10:10:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:07.680 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41cab9a4-eb0b-40d9-b339-ce18b4e6b4dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:10:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:07.682 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6e07aad2-48c5-400f-8df8-69f0fba28507]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:07 np0005541913.localdomain dnsmasq[325657]: exiting on receipt of SIGTERM
Dec 02 10:10:07 np0005541913.localdomain podman[325766]: 2025-12-02 10:10:07.708029601 +0000 UTC m=+0.043704015 container kill 35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: libpod-35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462.scope: Deactivated successfully.
Dec 02 10:10:07 np0005541913.localdomain podman[325779]: 2025-12-02 10:10:07.749817765 +0000 UTC m=+0.031317616 container died 35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:10:07 np0005541913.localdomain podman[325779]: 2025-12-02 10:10:07.774935314 +0000 UTC m=+0.056435165 container cleanup 35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:10:07 np0005541913.localdomain systemd[1]: libpod-conmon-35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462.scope: Deactivated successfully.
Dec 02 10:10:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:07.786 263406 INFO neutron.agent.dhcp.agent [None req-854a9495-18ab-423d-9718-9bc823f2244e - - - - - -] DHCP configuration for ports {'a59d5a92-7a77-419d-a87f-fbb46ea78955', '6cf9ab6e-0266-437d-b907-6e6749aa6e0b'} is completed
Dec 02 10:10:07 np0005541913.localdomain podman[325781]: 2025-12-02 10:10:07.862577481 +0000 UTC m=+0.134202528 container remove 35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d517d9d-ba68-4c0f-b344-6c3be9d614a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:10:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:07Z|00445|binding|INFO|Releasing lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b from this chassis (sb_readonly=0)
Dec 02 10:10:07 np0005541913.localdomain kernel: device tap6cf9ab6e-02 left promiscuous mode
Dec 02 10:10:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:07Z|00446|binding|INFO|Setting lport 6cf9ab6e-0266-437d-b907-6e6749aa6e0b down in Southbound
Dec 02 10:10:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:07.874 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:07.894 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feed:7a24/64 2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d517d9d-ba68-4c0f-b344-6c3be9d614a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '39113116e26e4da3a6194d2f44d952a8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55679031-13ed-4a23-9c9d-18d3c58230be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=6cf9ab6e-0266-437d-b907-6e6749aa6e0b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:07.897 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:07.900 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 6cf9ab6e-0266-437d-b907-6e6749aa6e0b in datapath 7d517d9d-ba68-4c0f-b344-6c3be9d614a4 unbound from our chassis
Dec 02 10:10:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:07.905 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d517d9d-ba68-4c0f-b344-6c3be9d614a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:10:07 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:07.907 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d34813-ace5-4d72-bfa7-191c89f505e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-641192bd43133d9a6db497be22fcb6a0a7e6bed3bb5a2e42ffba83b803902053-merged.mount: Deactivated successfully.
Dec 02 10:10:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35466a8f33ac14ef1e8b8b36fdea58085e9237966f543dcc97b2b64670e27462-userdata-shm.mount: Deactivated successfully.
Dec 02 10:10:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6ead580261bee1ee4aad75a4d864a42a4a9d764055292bd9533e5e426eea53df-merged.mount: Deactivated successfully.
Dec 02 10:10:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b27895503a52511bce829e24c9171e68a5eb020749f8c188e7e17280d3ae12d7-userdata-shm.mount: Deactivated successfully.
Dec 02 10:10:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:08.318 263406 INFO neutron.agent.dhcp.agent [None req-83a49769-21fc-4e0e-bbb5-d1d5644746ac - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:08 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d41cab9a4\x2deb0b\x2d40d9\x2db339\x2dce18b4e6b4dd.mount: Deactivated successfully.
Dec 02 10:10:08 np0005541913.localdomain ceph-mon[298296]: pgmap v367: 177 pgs: 177 active+clean; 147 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 24 KiB/s wr, 7 op/s
Dec 02 10:10:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:10:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:10:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:10:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 02 10:10:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:08.544 263406 INFO neutron.agent.dhcp.agent [None req-5d6ed0a0-fb71-407d-b47c-f30eb07d514f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:08 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7d517d9d\x2dba68\x2d4c0f\x2db344\x2d6c3be9d614a4.mount: Deactivated successfully.
Dec 02 10:10:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:10:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:10:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/24049623' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:10:10 np0005541913.localdomain ceph-mon[298296]: pgmap v368: 177 pgs: 177 active+clean; 163 MiB data, 837 MiB used, 41 GiB / 42 GiB avail; 9.5 KiB/s rd, 1.4 MiB/s wr, 24 op/s
Dec 02 10:10:10 np0005541913.localdomain ceph-mon[298296]: mgrmap e48: np0005541914.lljzmk(active, since 10m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:10:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/843803668' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:10:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:11.034 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:11.083 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:11 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:11.188 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:11 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:11.363 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:11Z|00447|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:10:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:11.671 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:10:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:10:12 np0005541913.localdomain ceph-mon[298296]: pgmap v369: 177 pgs: 177 active+clean; 163 MiB data, 837 MiB used, 41 GiB / 42 GiB avail; 9.4 KiB/s rd, 1.4 MiB/s wr, 20 op/s
Dec 02 10:10:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:10:13 np0005541913.localdomain ceph-mon[298296]: pgmap v370: 177 pgs: 177 active+clean; 291 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 12 MiB/s wr, 40 op/s
Dec 02 10:10:14 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:14.383 2 INFO neutron.agent.securitygroups_rpc [None req-aa29d14d-f8b7-4441-acc0-85287ab48c6d 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:10:15 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:15.079 2 INFO neutron.agent.securitygroups_rpc [None req-976c47c9-ae22-4daa-9b62-4c3bf838f3e2 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:15 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:15.590 2 INFO neutron.agent.securitygroups_rpc [None req-90e76c3f-20a5-47a4-903d-78b983867e31 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:10:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:10:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:10:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 02 10:10:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:10:15 np0005541913.localdomain ceph-mon[298296]: pgmap v371: 177 pgs: 177 active+clean; 291 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 12 MiB/s wr, 37 op/s
Dec 02 10:10:15 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:15.917 2 INFO neutron.agent.securitygroups_rpc [None req-5542fe4c-166b-4ffa-972c-029f68af7f10 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:16 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:16.016 263406 INFO neutron.agent.linux.ip_lib [None req-5ea4ad5d-c409-4169-84b1-84707fe2e0ae - - - - - -] Device tap32986807-4a cannot be used as it has no MAC address
Dec 02 10:10:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:16.036 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:16.044 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:16 np0005541913.localdomain kernel: device tap32986807-4a entered promiscuous mode
Dec 02 10:10:16 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670216.0530] manager: (tap32986807-4a): new Generic device (/org/freedesktop/NetworkManager/Devices/72)
Dec 02 10:10:16 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:16Z|00448|binding|INFO|Claiming lport 32986807-4a62-4af8-ad03-9336f56fbec0 for this chassis.
Dec 02 10:10:16 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:16Z|00449|binding|INFO|32986807-4a62-4af8-ad03-9336f56fbec0: Claiming unknown
Dec 02 10:10:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:16.056 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:16 np0005541913.localdomain systemd-udevd[325818]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:10:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:16.064 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d25b6f5f-b086-4558-a0fb-fc54d0ecba34, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=32986807-4a62-4af8-ad03-9336f56fbec0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:16.066 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 32986807-4a62-4af8-ad03-9336f56fbec0 in datapath 39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07 bound to our chassis
Dec 02 10:10:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:16.068 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:10:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:16.069 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[90f18f69-a4f5-451e-a0e7-95ab5d079358]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:16 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap32986807-4a: No such device
Dec 02 10:10:16 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:16Z|00450|binding|INFO|Setting lport 32986807-4a62-4af8-ad03-9336f56fbec0 ovn-installed in OVS
Dec 02 10:10:16 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:16Z|00451|binding|INFO|Setting lport 32986807-4a62-4af8-ad03-9336f56fbec0 up in Southbound
Dec 02 10:10:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:16.084 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:16 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap32986807-4a: No such device
Dec 02 10:10:16 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap32986807-4a: No such device
Dec 02 10:10:16 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap32986807-4a: No such device
Dec 02 10:10:16 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap32986807-4a: No such device
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.107 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.109 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap32986807-4a: No such device
Dec 02 10:10:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:16.117 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:16 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap32986807-4a: No such device
Dec 02 10:10:16 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap32986807-4a: No such device
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.133 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc33439d-fe6e-4e8a-86e3-91b7272799f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:10:16.109437', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1949f2fe-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.352466151, 'message_signature': 'c37427566a9770c4266d5c3a04620520137d80e6dfdfb3e0266cd49485214ef5'}]}, 'timestamp': '2025-12-02 10:10:16.134638', '_unique_id': 'ecdcb95a34104229bc9462104c1b3798'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.136 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.142 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30883092-0aed-4957-9b36-1c1308c7e09e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.138479', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '194b423a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': '10df04daffd25d6e4fa60f09e165f14f29f3d118beba6387b069a99feda8bdf5'}]}, 'timestamp': '2025-12-02 10:10:16.143154', '_unique_id': '24c4280373934464a997088423d5a51c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:16.157 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.173 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.174 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fff54991-3146-4542-9ab0-a53826a9dde4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.146212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '194ffdca-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': 'e80cd05c0b70a3cbadf8d806c4d8f5fa022fe085a1019854ade5b3164a78dcee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.146212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19500d4c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': '2a3a102079be4a20ad0e6467e2cd70155d5af12269c4d1e4dae5ac12809bfecd'}]}, 'timestamp': '2025-12-02 10:10:16.174463', '_unique_id': 'a12ef04f78e14a20979100bacbb3704e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.177 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.177 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d7ec6b0-2a5c-443a-9c6a-c49eeed50c07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.177074', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19508100-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': 'dcec819b26e36d08be7be256b777e606e2b9fc2c79f9d494235f4abd1da696be'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.177074', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19509028-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': '8b75786dabbd1ccc028340d1dbdac2ceef6ce6e814087a006703b3bdf9028bd2'}]}, 'timestamp': '2025-12-02 10:10:16.177805', '_unique_id': '927c3925dd2749f4b862da18523fe247'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.178 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.179 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0215f8a0-276c-4163-85a3-86573be209df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.179572', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '1950e424-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': '8fe27ed67fa14abee8e901c53bfb6b91b775b8fee7a016b6d7baee48529fd85d'}]}, 'timestamp': '2025-12-02 10:10:16.179969', '_unique_id': 'cb4a898386da452fa2396bbcd5ac8bd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.182 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.182 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fae34dab-ea2b-42ce-8538-47761b74778c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.182071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19514478-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': '9d7e4ab09f672e95a4abdadb93afb7f4b93ca0cb535a5ed5dd0c31732106701f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.182071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '195154ae-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': 'dad63abdda738de45c0be74908f510335b796b5340fe24c7ee9a0068ddab1533'}]}, 'timestamp': '2025-12-02 10:10:16.182854', '_unique_id': '0e92621fccc04a87b7bce6ebafbdf1be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.184 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bc219c4-0aae-4d14-960e-249d4a31f7dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.184843', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '1951b20a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': '5b29dc4b93340cd2a5d677ae27669aa2ffd366bbfb846d2a8361014f97cb7656'}]}, 'timestamp': '2025-12-02 10:10:16.185288', '_unique_id': '4a8af8df0eb3410baaafe2346c464ecf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.187 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.187 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5353a060-5a4c-41ff-92c6-b081728be49f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.187238', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19520da4-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': '6562a5d84e0bd2de423a3346d3b7e3916ddbf8b4f15f6353f8891392d5297601'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.187238', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19521aec-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': 'a527a540c363c2ebd881e938ec7d4d2be7bf53ea3be8ebf6d74c53c9dc850deb'}]}, 'timestamp': '2025-12-02 10:10:16.187915', '_unique_id': '1e8061a8f90c4560adc6b4625f3818f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.189 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.190 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fa5396e-2932-4316-a8fd-db04cdf9f523', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.189867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1952755a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': 'c55bbc5fedfd297d4def519f52153d6407ab09074dd35f3ef2bffc1e102ba589'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.189867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '195281b2-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': '01155353d5cb9a0a5857925e73e70958f3508df55d63e1ca17af558489ca9e89'}]}, 'timestamp': '2025-12-02 10:10:16.190535', '_unique_id': 'd07e39db30a54707bc027b5bfa462b7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.201 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.202 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b37d9ffa-98cc-4e25-a181-309a1f2ea829', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.192298', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19544f06-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.411375591, 'message_signature': '9b2d22d5b1251c195302c882236f76c5181785b5ec0960c5e9ca4c996361c837'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.192298', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19545bf4-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.411375591, 'message_signature': '168ffcd4cb49466cc1c5c7c60f77fe6b94dea8bf48899d2353d4ab8082e2d522'}]}, 'timestamp': '2025-12-02 10:10:16.202707', '_unique_id': '5eedb5e2d05042a9939ad20bd6a97ff3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.203 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.204 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53c5030e-7ccc-45ea-a5e4-14e87a869938', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.204520', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '1954b1f8-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': 'ec6d517dba794707c86df6d3d96e3ebc80e4cf56dd10750ff022bfd900993c04'}]}, 'timestamp': '2025-12-02 10:10:16.204905', '_unique_id': 'd333e085e55f499fadbc234004ed6d54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df424f8f-b43b-45c3-91a6-b33d8bb13ee0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.206669', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1955048c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.411375591, 'message_signature': '8f64f121a5f8f136a4fd176bfc7b9f306f7f54edca43c4b0d83a31246cbf5222'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.206669', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19551094-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.411375591, 'message_signature': '0bdfdf1ec6f18e86075e9fbda7f8789d6a2d5a65d3f1144e8bcc0c7ae6b0a7ce'}]}, 'timestamp': '2025-12-02 10:10:16.207306', '_unique_id': '3246ea5dc23d461c92bcba075c706eef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a79b561-8f99-488a-8d6a-dcb44ebe3667', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.209062', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '19556210-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': 'cd439f880027198127fc70e969b9b7d01305a0f62c1bd17e505c98113ee07757'}]}, 'timestamp': '2025-12-02 10:10:16.209410', '_unique_id': '5af9199f1182440090a092eb7393a67b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.211 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.211 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82563115-f345-4e5d-bbd4-ca1fc289155c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.211180', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1955b4a4-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.411375591, 'message_signature': '62cd4db41904eb24d16af2d7f04e9cbe5c68cbb706a7729b2055f868adf9944c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.211180', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1955c1ce-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.411375591, 'message_signature': '2b9933e03a3c07178d477e1608893d03de38c64bf5405756a2f1cb115e803d4e'}]}, 'timestamp': '2025-12-02 10:10:16.211843', '_unique_id': '907c7fe5ad8d4c19b0a56aeb89ebb6dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.213 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.213 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.214 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5803ee1-3a98-433b-b77e-8dc351808aa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:10:16.213763', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19561980-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': '8dddafe28d3db9fbfbdb8df783c0d1b176281c59b99d1712a546810d09cc3c6f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:10:16.213763', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19562588-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.365295593, 'message_signature': 'f6d285573e50887393a753ea10e656b13d08642676cd161f150c3117983c38e1'}]}, 'timestamp': '2025-12-02 10:10:16.214400', '_unique_id': 'aab4a98b5bfd4512b0ab8691b796b759'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.216 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e05347e7-99af-41fd-b47d-ac50f2a9eb86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.216128', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '195675f6-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': '756642f5eb1d8a3293c40eef4444a4455a1477545c652bb1b07a2ec2417e7471'}]}, 'timestamp': '2025-12-02 10:10:16.216475', '_unique_id': '02b468c066004b8fa148fd585bc334f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.218 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3207bf1-1d67-4835-ae26-adfc0c91c59f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.218669', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '1956d974-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': 'b2037c26d5ed59e4cf3ac64e3593c1dc07b7087cb7853fe15a9f4bec2ebf1407'}]}, 'timestamp': '2025-12-02 10:10:16.219018', '_unique_id': '6acb1915bf534975ad99243255ba02cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 18350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '595557c5-81bc-4f96-b5e2-83db8d91fd6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18350000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:10:16.220711', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '195728fc-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.352466151, 'message_signature': 'a88c92a94ec3dacdb023d2ed4d4253147511e4c47af27050cb4064b4d68d2219'}]}, 'timestamp': '2025-12-02 10:10:16.221045', '_unique_id': '4cadc3cf70fe47168a040646145d0ac6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.222 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74e5ce43-8e94-443d-b3c7-166cec3b6c93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.222804', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '19577ac8-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': 'f365758e86ee1318a75ce5baeec0f085071b2ee4c8a512948bed260afbfd5229'}]}, 'timestamp': '2025-12-02 10:10:16.223149', '_unique_id': 'ed5c709234a44f759c3167f7cf26c4f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.224 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bc7d9d6-d016-47a4-8c33-d5f58031cd3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.224981', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '1957cfb4-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': '216c95df45552602df9ea3fc209e156b2ffad9c4017290baad6c6ea97fc07ead'}]}, 'timestamp': '2025-12-02 10:10:16.225324', '_unique_id': '3fa9a0b3c7484aa59269f5467c3efb70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2661f453-9c28-4642-938f-5e830e2bc664', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:10:16.227047', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '19582072-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12378.357573517, 'message_signature': 'b52d7ca92e2c7b415a71ee804bfe3dc7a16b8c996554489109173ec874d30510'}]}, 'timestamp': '2025-12-02 10:10:16.227393', '_unique_id': '219f45f592b7440395ece93256e1d984'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:10:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:10:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:10:16 np0005541913.localdomain podman[325889]: 
Dec 02 10:10:16 np0005541913.localdomain podman[325889]: 2025-12-02 10:10:16.997827267 +0000 UTC m=+0.115258394 container create 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:17 np0005541913.localdomain podman[325889]: 2025-12-02 10:10:16.913518039 +0000 UTC m=+0.030949176 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:17 np0005541913.localdomain systemd[1]: Started libpod-conmon-9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c.scope.
Dec 02 10:10:17 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:17 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47b13ae10e11528f1ef963d551ebcf95dfcf3157f75d67b00cb1cbdcb5fd574f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:17 np0005541913.localdomain podman[325889]: 2025-12-02 10:10:17.068285634 +0000 UTC m=+0.185716761 container init 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:17 np0005541913.localdomain dnsmasq[325907]: started, version 2.85 cachesize 150
Dec 02 10:10:17 np0005541913.localdomain dnsmasq[325907]: DNS service limited to local subnets
Dec 02 10:10:17 np0005541913.localdomain podman[325889]: 2025-12-02 10:10:17.078877276 +0000 UTC m=+0.196308443 container start 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:10:17 np0005541913.localdomain dnsmasq[325907]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:17 np0005541913.localdomain dnsmasq[325907]: warning: no upstream servers configured
Dec 02 10:10:17 np0005541913.localdomain dnsmasq-dhcp[325907]: DHCP, static leases only on 10.101.0.0, lease time 1d
Dec 02 10:10:17 np0005541913.localdomain dnsmasq[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/addn_hosts - 0 addresses
Dec 02 10:10:17 np0005541913.localdomain dnsmasq-dhcp[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/host
Dec 02 10:10:17 np0005541913.localdomain dnsmasq-dhcp[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/opts
Dec 02 10:10:17 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:17.242 263406 INFO neutron.agent.dhcp.agent [None req-19ca4539-dc74-4ccb-8f45-f7f7345f2b18 - - - - - -] DHCP configuration for ports {'38d8cd53-d8f6-48a6-8f6b-757990efd71c'} is completed
Dec 02 10:10:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:18.134 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:10:17Z, description=, device_id=ccc84569-b123-40e7-b4b7-ca02e8eac496, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089969d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908806640>], id=70c458f3-a908-4c47-aabc-6babc0f38a51, ip_allocation=immediate, mac_address=fa:16:3e:37:d6:0e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:10:13Z, description=, dns_domain=, id=39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-517243776, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41799, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2424, status=ACTIVE, subnets=['62baa615-eade-454c-b324-6bcc3b621528'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:10:14Z, vlan_transparent=None, network_id=39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, port_security_enabled=False, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2453, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:10:17Z on network 39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07
Dec 02 10:10:18 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:18.245 2 INFO neutron.agent.securitygroups_rpc [None req-46c6c36b-2a47-445c-9836-d1e79e5b14a9 8d2b383649fa45f2821f6e290127374a 84fd536b8b4d489f944ed3e4bbfaeb5b - - default default] Security group rule updated ['d6dcbb7b-b610-4062-87d4-37eec03c1ecf']
Dec 02 10:10:18 np0005541913.localdomain ceph-mon[298296]: pgmap v372: 177 pgs: 177 active+clean; 291 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 12 MiB/s wr, 37 op/s
Dec 02 10:10:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:10:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:10:18 np0005541913.localdomain dnsmasq[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/addn_hosts - 1 addresses
Dec 02 10:10:18 np0005541913.localdomain dnsmasq-dhcp[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/host
Dec 02 10:10:18 np0005541913.localdomain dnsmasq-dhcp[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/opts
Dec 02 10:10:18 np0005541913.localdomain podman[325926]: 2025-12-02 10:10:18.406779834 +0000 UTC m=+0.072552914 container kill 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:10:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:18.754 263406 INFO neutron.agent.dhcp.agent [None req-c78c9559-dad7-4d8d-bd13-089df71025af - - - - - -] DHCP configuration for ports {'70c458f3-a908-4c47-aabc-6babc0f38a51'} is completed
Dec 02 10:10:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:10:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:10:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:10:19 np0005541913.localdomain systemd[1]: tmp-crun.T30d33.mount: Deactivated successfully.
Dec 02 10:10:19 np0005541913.localdomain podman[325948]: 2025-12-02 10:10:19.459283032 +0000 UTC m=+0.094948761 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:10:19 np0005541913.localdomain podman[325948]: 2025-12-02 10:10:19.526256507 +0000 UTC m=+0.161922196 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm)
Dec 02 10:10:19 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:10:20 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:20.233 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:10:17Z, description=, device_id=ccc84569-b123-40e7-b4b7-ca02e8eac496, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087ee5e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087eea30>], id=70c458f3-a908-4c47-aabc-6babc0f38a51, ip_allocation=immediate, mac_address=fa:16:3e:37:d6:0e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:10:13Z, description=, dns_domain=, id=39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-517243776, port_security_enabled=True, project_id=7dffef2e74844a7ebb6ee68826fb7e57, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41799, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2424, status=ACTIVE, subnets=['62baa615-eade-454c-b324-6bcc3b621528'], tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:10:14Z, vlan_transparent=None, network_id=39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, port_security_enabled=False, project_id=7dffef2e74844a7ebb6ee68826fb7e57, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2453, status=DOWN, tags=[], tenant_id=7dffef2e74844a7ebb6ee68826fb7e57, updated_at=2025-12-02T10:10:17Z on network 39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07
Dec 02 10:10:20 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:20.292 2 INFO neutron.agent.securitygroups_rpc [None req-0a3dbc5e-28c6-4790-aae5-682551e66674 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:20 np0005541913.localdomain dnsmasq[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/addn_hosts - 1 addresses
Dec 02 10:10:20 np0005541913.localdomain dnsmasq-dhcp[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/host
Dec 02 10:10:20 np0005541913.localdomain dnsmasq-dhcp[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/opts
Dec 02 10:10:20 np0005541913.localdomain podman[325982]: 2025-12-02 10:10:20.473193932 +0000 UTC m=+0.070758726 container kill 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:10:20 np0005541913.localdomain systemd[1]: tmp-crun.HYftgK.mount: Deactivated successfully.
Dec 02 10:10:20 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:10:20 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "format": "json"}]: dispatch
Dec 02 10:10:20 np0005541913.localdomain ceph-mon[298296]: pgmap v373: 177 pgs: 177 active+clean; 435 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 29 KiB/s rd, 24 MiB/s wr, 54 op/s
Dec 02 10:10:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:20.700 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:21.039 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:21.087 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.201 263406 INFO neutron.agent.dhcp.agent [None req-a59c5026-9011-4028-a90d-553fcad66b1b - - - - - -] DHCP configuration for ports {'70c458f3-a908-4c47-aabc-6babc0f38a51'} is completed
Dec 02 10:10:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent [None req-7847b949-fdf3-4943-985f-b94a01502df1 - - - - - -] Unable to enable dhcp for c9b0342d-5faa-4588-b4a8-0132ddc49133.: oslo_messaging.rpc.client.RemoteError: Remote error: MechanismDriverError 
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: ['Traceback (most recent call last):\n', '  File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming\n    res = self.dispatcher.dispatch(message)\n', '  File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch\n    return self._do_dispatch(endpoint, method, ctxt, args)\n', '  File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch\n    result = func(ctxt, **new_args)\n', '  File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 244, in inner\n    return func(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n    setattr(e, \'_RETRY_EXCEEDED\', True)\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n    ectxt.value = e.inner_exc\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n    context_reference.session.rollback()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in 
force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n    return f(*dup_args, **dup_kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron/quota/resource_registry.py", line 95, in wrapper\n    ret_val = f(_self, context, *args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron/api/rpc/handlers/dhcp_rpc.py", line 292, in create_dhcp_port\n    return self._port_action(plugin, context, port, \'create_port\')\n', '  File "/usr/lib/python3.9/site-packages/neutron/api/rpc/handlers/dhcp_rpc.py", line 118, in _port_action\n    return p_utils.create_port(plugin, context, port)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/plugins/utils.py", line 338, in create_port\n    return core_plugin.create_port(\n', '  File "/usr/lib/python3.9/site-packages/neutron/common/utils.py", line 728, in inner\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 226, in wrapped\n    return f_with_retry(*args, **kwargs,\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n    setattr(e, \'_RETRY_EXCEEDED\', True)\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n    ectxt.value = e.inner_exc\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n  
  return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n    context_reference.session.rollback()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n    return f(*dup_args, **dup_kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1583, in create_port\n    return self._after_create_port(context, result, mech_context)\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1607, in _after_create_port\n    self.delete_port(context, result[\'id\'], l3_port_check=False)\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1602, in _after_create_port\n    bound_context = self._bind_port_if_needed(mech_context)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n    setattr(e, \'_RETRY_EXCEEDED\', True)\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n    ectxt.value = e.inner_exc\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    
self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n    context_reference.session.rollback()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n    return f(*dup_args, **dup_kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 607, in _bind_port_if_needed\n    self._commit_port_binding(context, bind_context,\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 861, in _commit_port_binding\n    self.mechanism_manager.update_port_postcommit(cur_context)\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/managers.py", line 764, in update_port_postcommit\n    self._call_on_drivers("update_port_postcommit", context,\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/managers.py", line 513, in _call_on_drivers\n    raise ml2_exc.MechanismDriverError(\n', 'neutron.plugins.ml2.common.exceptions.MechanismDriverError\n'].
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 324, in enable
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     common_utils.wait_until_true(self._enable, timeout=300)
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/common/utils.py", line 744, in wait_until_true
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     while not predicate():
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 336, in _enable
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     interface_name = self.device_manager.setup(
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1825, in setup
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     self.cleanup_stale_devices(network, dhcp_port=None)
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     self.force_reraise()
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     raise self.value
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1820, in setup
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     port = self.setup_dhcp_port(network, segment)
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1755, in setup_dhcp_port
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     dhcp_port = setup_method(network, device_id, dhcp_subnets)
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1703, in _setup_new_dhcp_port
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     return self.plugin.create_dhcp_port({'port': port_dict})
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 893, in create_dhcp_port
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     port = cctxt.call(self.context, 'create_dhcp_port',
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron_lib/rpc.py", line 157, in call
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     return self._original_context.call(ctxt, method, **kwargs)
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 190, in call
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     result = self.transport._send(
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     return self._driver.send(target, ctxt, message,
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent     raise result
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent oslo_messaging.rpc.client.RemoteError: Remote error: MechanismDriverError 
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent ['Traceback (most recent call last):\n', '  File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming\n    res = self.dispatcher.dispatch(message)\n', '  File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch\n    return self._do_dispatch(endpoint, method, ctxt, args)\n', '  File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch\n    result = func(ctxt, **new_args)\n', '  File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 244, in inner\n    return func(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n    setattr(e, \'_RETRY_EXCEEDED\', True)\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n    ectxt.value = e.inner_exc\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n    context_reference.session.rollback()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File 
"/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n    return f(*dup_args, **dup_kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron/quota/resource_registry.py", line 95, in wrapper\n    ret_val = f(_self, context, *args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron/api/rpc/handlers/dhcp_rpc.py", line 292, in create_dhcp_port\n    return self._port_action(plugin, context, port, \'create_port\')\n', '  File "/usr/lib/python3.9/site-packages/neutron/api/rpc/handlers/dhcp_rpc.py", line 118, in _port_action\n    return p_utils.create_port(plugin, context, port)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/plugins/utils.py", line 338, in create_port\n    return core_plugin.create_port(\n', '  File "/usr/lib/python3.9/site-packages/neutron/common/utils.py", line 728, in inner\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 226, in wrapped\n    return f_with_retry(*args, **kwargs,\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n    setattr(e, \'_RETRY_EXCEEDED\', True)\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n    ectxt.value = e.inner_exc\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File 
"/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n    context_reference.session.rollback()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n    return f(*dup_args, **dup_kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1583, in create_port\n    return self._after_create_port(context, result, mech_context)\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1607, in _after_create_port\n    self.delete_port(context, result[\'id\'], l3_port_check=False)\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1602, in _after_create_port\n    bound_context = self._bind_port_if_needed(mech_context)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n    setattr(e, \'_RETRY_EXCEEDED\', True)\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n    ectxt.value = e.inner_exc\n', '  File 
"/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n    return f(*args, **kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n    context_reference.session.rollback()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n    return f(*dup_args, **dup_kwargs)\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 607, in _bind_port_if_needed\n    self._commit_port_binding(context, bind_context,\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 861, in _commit_port_binding\n    self.mechanism_manager.update_port_postcommit(cur_context)\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/managers.py", line 764, in update_port_postcommit\n    self._call_on_drivers("update_port_postcommit", context,\n', '  File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/managers.py", line 513, in _call_on_drivers\n    raise ml2_exc.MechanismDriverError(\n', 'neutron.plugins.ml2.common.exceptions.MechanismDriverError\n'].
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.230 263406 ERROR neutron.agent.dhcp.agent 
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.232 263406 INFO neutron.agent.dhcp.agent [None req-df0a516e-2ba9-46d1-bdd1-2505dc3dca33 - - - - - -] Synchronizing state
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.447 263406 INFO neutron.agent.dhcp.agent [None req-3717f4b1-77ba-425c-b108-011d16bbdeb6 - - - - - -] All active networks have been fetched through RPC.
Dec 02 10:10:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:21.448 263406 INFO neutron.agent.dhcp.agent [-] Starting network c9b0342d-5faa-4588-b4a8-0132ddc49133 dhcp configuration
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.332 263406 INFO neutron.agent.dhcp.agent [None req-b71ad753-eb9b-4d15-afb8-18fb045a258a - - - - - -] Finished network c9b0342d-5faa-4588-b4a8-0132ddc49133 dhcp configuration
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.333 263406 INFO neutron.agent.dhcp.agent [None req-3717f4b1-77ba-425c-b108-011d16bbdeb6 - - - - - -] Synchronizing state complete
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.334 263406 INFO neutron.agent.dhcp.agent [None req-3717f4b1-77ba-425c-b108-011d16bbdeb6 - - - - - -] Synchronizing state
Dec 02 10:10:22 np0005541913.localdomain ceph-mon[298296]: pgmap v374: 177 pgs: 177 active+clean; 435 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 20 KiB/s rd, 23 MiB/s wr, 37 op/s
Dec 02 10:10:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:10:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:10:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:10:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:10:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 02 10:10:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:10:22 np0005541913.localdomain podman[326003]: 2025-12-02 10:10:22.462090545 +0000 UTC m=+0.087933484 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:10:22 np0005541913.localdomain podman[326003]: 2025-12-02 10:10:22.472347899 +0000 UTC m=+0.098190838 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:10:22 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.678 263406 INFO neutron.agent.dhcp.agent [None req-2ad2cb68-84bf-4287-b213-7d22078cddbd - - - - - -] All active networks have been fetched through RPC.
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.679 263406 INFO neutron.agent.dhcp.agent [-] Starting network c9b0342d-5faa-4588-b4a8-0132ddc49133 dhcp configuration
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.679 263406 INFO neutron.agent.dhcp.agent [-] Finished network c9b0342d-5faa-4588-b4a8-0132ddc49133 dhcp configuration
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.680 263406 INFO neutron.agent.dhcp.agent [None req-2ad2cb68-84bf-4287-b213-7d22078cddbd - - - - - -] Synchronizing state complete
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.680 263406 INFO neutron.agent.dhcp.agent [None req-7847b949-fdf3-4943-985f-b94a01502df1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.681 263406 INFO neutron.agent.dhcp.agent [None req-7847b949-fdf3-4943-985f-b94a01502df1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:22.916 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:10:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "aacf1e5d-1b53-42f1-b3a7-45f0acb43c13", "format": "json"}]: dispatch
Dec 02 10:10:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:24.038 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:10:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:10:24 np0005541913.localdomain ceph-mon[298296]: pgmap v375: 177 pgs: 177 active+clean; 563 MiB data, 2.0 GiB used, 40 GiB / 42 GiB avail; 27 KiB/s rd, 33 MiB/s wr, 54 op/s
Dec 02 10:10:24 np0005541913.localdomain podman[326021]: 2025-12-02 10:10:24.455100617 +0000 UTC m=+0.094890449 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, container_name=openstack_network_exporter)
Dec 02 10:10:24 np0005541913.localdomain podman[326022]: 2025-12-02 10:10:24.544289623 +0000 UTC m=+0.177519461 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:10:24 np0005541913.localdomain podman[326021]: 2025-12-02 10:10:24.549193884 +0000 UTC m=+0.188983726 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec 02 10:10:24 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:10:24 np0005541913.localdomain podman[326022]: 2025-12-02 10:10:24.603496702 +0000 UTC m=+0.236726500 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:10:24 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:10:24 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:24.933 2 INFO neutron.agent.securitygroups_rpc [None req-3ae6da31-b902-4566-8baa-11e094d2ee12 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:10:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:10:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:10:25 np0005541913.localdomain sudo[326062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:10:25 np0005541913.localdomain sudo[326062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:10:25 np0005541913.localdomain sudo[326062]: pam_unix(sudo:session): session closed for user root
Dec 02 10:10:25 np0005541913.localdomain sudo[326080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 10:10:25 np0005541913.localdomain sudo[326080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:10:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:26.042 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:26.089 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:26 np0005541913.localdomain ceph-mon[298296]: pgmap v376: 177 pgs: 177 active+clean; 563 MiB data, 2.0 GiB used, 40 GiB / 42 GiB avail; 16 KiB/s rd, 23 MiB/s wr, 33 op/s
Dec 02 10:10:26 np0005541913.localdomain podman[326171]: 2025-12-02 10:10:26.628438443 +0000 UTC m=+0.102064720 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, vcs-type=git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Dec 02 10:10:26 np0005541913.localdomain podman[326171]: 2025-12-02 10:10:26.813398603 +0000 UTC m=+0.287024870 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, release=1763362218, RELEASE=main, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 10:10:27 np0005541913.localdomain sudo[326080]: pam_unix(sudo:session): session closed for user root
Dec 02 10:10:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "99d59f6b-cf2c-47ae-b465-7c2965afd103", "format": "json"}]: dispatch
Dec 02 10:10:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:10:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:10:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:10:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:10:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:10:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:27.539 263406 INFO neutron.agent.linux.ip_lib [None req-c37d69a8-486c-4e50-8e89-8e4f125d8676 - - - - - -] Device tap6305f6b8-f6 cannot be used as it has no MAC address
Dec 02 10:10:27 np0005541913.localdomain sudo[326292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:10:27 np0005541913.localdomain sudo[326292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:10:27 np0005541913.localdomain sudo[326292]: pam_unix(sudo:session): session closed for user root
Dec 02 10:10:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:27.573 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:27 np0005541913.localdomain kernel: device tap6305f6b8-f6 entered promiscuous mode
Dec 02 10:10:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:27.585 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:27 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670227.5854] manager: (tap6305f6b8-f6): new Generic device (/org/freedesktop/NetworkManager/Devices/73)
Dec 02 10:10:27 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:27Z|00452|binding|INFO|Claiming lport 6305f6b8-f6d1-42c8-8da0-74c67d8b4998 for this chassis.
Dec 02 10:10:27 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:27Z|00453|binding|INFO|6305f6b8-f6d1-42c8-8da0-74c67d8b4998: Claiming unknown
Dec 02 10:10:27 np0005541913.localdomain systemd-udevd[326323]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:10:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:27.597 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-ba9b74ca-c826-47d9-9b2c-806aa0652611', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba9b74ca-c826-47d9-9b2c-806aa0652611', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae8275bd-608b-4d44-bec9-32778c15dfb9, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=6305f6b8-f6d1-42c8-8da0-74c67d8b4998) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:27.598 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 6305f6b8-f6d1-42c8-8da0-74c67d8b4998 in datapath ba9b74ca-c826-47d9-9b2c-806aa0652611 bound to our chassis
Dec 02 10:10:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:27.598 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ba9b74ca-c826-47d9-9b2c-806aa0652611 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:10:27 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:27.600 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[4335b6b4-9a2d-4f5a-bd08-72b05566de47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:27.618 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:27 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap6305f6b8-f6: No such device
Dec 02 10:10:27 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:27Z|00454|binding|INFO|Setting lport 6305f6b8-f6d1-42c8-8da0-74c67d8b4998 ovn-installed in OVS
Dec 02 10:10:27 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:27Z|00455|binding|INFO|Setting lport 6305f6b8-f6d1-42c8-8da0-74c67d8b4998 up in Southbound
Dec 02 10:10:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:27.624 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:27 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap6305f6b8-f6: No such device
Dec 02 10:10:27 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap6305f6b8-f6: No such device
Dec 02 10:10:27 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap6305f6b8-f6: No such device
Dec 02 10:10:27 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap6305f6b8-f6: No such device
Dec 02 10:10:27 np0005541913.localdomain sudo[326313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:10:27 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap6305f6b8-f6: No such device
Dec 02 10:10:27 np0005541913.localdomain sudo[326313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:10:27 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap6305f6b8-f6: No such device
Dec 02 10:10:27 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap6305f6b8-f6: No such device
Dec 02 10:10:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:27.675 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:27.724 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:28 np0005541913.localdomain sudo[326313]: pam_unix(sudo:session): session closed for user root
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: pgmap v377: 177 pgs: 177 active+clean; 563 MiB data, 2.0 GiB used, 40 GiB / 42 GiB avail; 16 KiB/s rd, 23 MiB/s wr, 33 op/s
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:10:28 np0005541913.localdomain podman[326435]: 
Dec 02 10:10:28 np0005541913.localdomain podman[326435]: 2025-12-02 10:10:28.703157363 +0000 UTC m=+0.125010472 container create 5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ba9b74ca-c826-47d9-9b2c-806aa0652611, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:28 np0005541913.localdomain podman[326435]: 2025-12-02 10:10:28.628320359 +0000 UTC m=+0.050173488 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:28 np0005541913.localdomain sudo[326448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:10:28 np0005541913.localdomain systemd[1]: Started libpod-conmon-5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf.scope.
Dec 02 10:10:28 np0005541913.localdomain sudo[326448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:10:28 np0005541913.localdomain sudo[326448]: pam_unix(sudo:session): session closed for user root
Dec 02 10:10:28 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:28 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/846af1f535196c2b1f3fa28965eaa27fb6ef8e77dfc94da176dafd06a1387634/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:28 np0005541913.localdomain podman[326435]: 2025-12-02 10:10:28.793974303 +0000 UTC m=+0.215827422 container init 5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ba9b74ca-c826-47d9-9b2c-806aa0652611, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 02 10:10:28 np0005541913.localdomain podman[326435]: 2025-12-02 10:10:28.803435126 +0000 UTC m=+0.225288235 container start 5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ba9b74ca-c826-47d9-9b2c-806aa0652611, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:10:28 np0005541913.localdomain dnsmasq[326471]: started, version 2.85 cachesize 150
Dec 02 10:10:28 np0005541913.localdomain dnsmasq[326471]: DNS service limited to local subnets
Dec 02 10:10:28 np0005541913.localdomain dnsmasq[326471]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:28 np0005541913.localdomain dnsmasq[326471]: warning: no upstream servers configured
Dec 02 10:10:28 np0005541913.localdomain dnsmasq-dhcp[326471]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:10:28 np0005541913.localdomain dnsmasq[326471]: read /var/lib/neutron/dhcp/ba9b74ca-c826-47d9-9b2c-806aa0652611/addn_hosts - 0 addresses
Dec 02 10:10:28 np0005541913.localdomain dnsmasq-dhcp[326471]: read /var/lib/neutron/dhcp/ba9b74ca-c826-47d9-9b2c-806aa0652611/host
Dec 02 10:10:28 np0005541913.localdomain dnsmasq-dhcp[326471]: read /var/lib/neutron/dhcp/ba9b74ca-c826-47d9-9b2c-806aa0652611/opts
Dec 02 10:10:29 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:29.026 263406 INFO neutron.agent.dhcp.agent [None req-07cc9c98-c19c-48d9-aa3f-f26314ee2cd6 - - - - - -] DHCP configuration for ports {'b38cddf9-e4fe-47e0-9ffd-5e33d69bbc25'} is completed
Dec 02 10:10:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:29.059 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:10:29 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:10:30 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:30.066 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:30 np0005541913.localdomain ceph-mon[298296]: pgmap v378: 177 pgs: 177 active+clean; 675 MiB data, 2.4 GiB used, 40 GiB / 42 GiB avail; 23 KiB/s rd, 32 MiB/s wr, 48 op/s
Dec 02 10:10:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:31.046 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:31.128 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "99d59f6b-cf2c-47ae-b465-7c2965afd103_827df7ca-4b70-438d-9e17-f06019bfe5e4", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "99d59f6b-cf2c-47ae-b465-7c2965afd103", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:31 np0005541913.localdomain ceph-mon[298296]: pgmap v379: 177 pgs: 177 active+clean; 675 MiB data, 2.4 GiB used, 40 GiB / 42 GiB avail; 15 KiB/s rd, 20 MiB/s wr, 31 op/s
Dec 02 10:10:32 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:10:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:10:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:10:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:10:32 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:32.835 2 INFO neutron.agent.securitygroups_rpc [None req-b8b78442-432d-4c51-90c0-6b0763587b8d 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['1794fecb-60a8-41cc-838d-a48dc5474875']
Dec 02 10:10:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e156 e156: 6 total, 6 up, 6 in
Dec 02 10:10:33 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:10:33 np0005541913.localdomain podman[326472]: 2025-12-02 10:10:33.451783988 +0000 UTC m=+0.082470708 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Dec 02 10:10:33 np0005541913.localdomain podman[326472]: 2025-12-02 10:10:33.490031648 +0000 UTC m=+0.120718368 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:33 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:10:33 np0005541913.localdomain ceph-mon[298296]: osdmap e156: 6 total, 6 up, 6 in
Dec 02 10:10:33 np0005541913.localdomain ceph-mon[298296]: pgmap v381: 177 pgs: 8 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 167 active+clean; 787 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 16 KiB/s rd, 22 MiB/s wr, 36 op/s
Dec 02 10:10:33 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "7a4c64a8-f75e-4eb8-8104-489d2e71f23a", "format": "json"}]: dispatch
Dec 02 10:10:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:10:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:10:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:10:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:10:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:10:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:10:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:10:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:10:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:34.609 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:34.611 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:10:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:34.610 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:35 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:35.371 2 INFO neutron.agent.securitygroups_rpc [None req-d462b9e0-1fd6-4bb8-aa5c-65cdc1c34ce0 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['1bd96bc4-2204-473c-8b88-08bb385e4850', '1794fecb-60a8-41cc-838d-a48dc5474875']
Dec 02 10:10:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:35.997 263406 INFO neutron.agent.linux.ip_lib [None req-30ed4756-1c6a-4a9b-8207-822095a9eb63 - - - - - -] Device tap998458b3-d6 cannot be used as it has no MAC address
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.020 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain kernel: device tap998458b3-d6 entered promiscuous mode
Dec 02 10:10:36 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670236.0297] manager: (tap998458b3-d6): new Generic device (/org/freedesktop/NetworkManager/Devices/74)
Dec 02 10:10:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:36Z|00456|binding|INFO|Claiming lport 998458b3-d6cd-4ecd-850f-289ca92b1da7 for this chassis.
Dec 02 10:10:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:36Z|00457|binding|INFO|998458b3-d6cd-4ecd-850f-289ca92b1da7: Claiming unknown
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.031 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain systemd-udevd[326501]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:10:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:36.038 2 INFO neutron.agent.securitygroups_rpc [None req-d3e51cbf-aa8e-4108-aa2d-8a899b386ca8 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['1bd96bc4-2204-473c-8b88-08bb385e4850']
Dec 02 10:10:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:10:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:10:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap998458b3-d6: No such device
Dec 02 10:10:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:36Z|00458|binding|INFO|Setting lport 998458b3-d6cd-4ecd-850f-289ca92b1da7 ovn-installed in OVS
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.066 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap998458b3-d6: No such device
Dec 02 10:10:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap998458b3-d6: No such device
Dec 02 10:10:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:10:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159748 "" "Go-http-client/1.1"
Dec 02 10:10:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap998458b3-d6: No such device
Dec 02 10:10:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap998458b3-d6: No such device
Dec 02 10:10:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap998458b3-d6: No such device
Dec 02 10:10:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap998458b3-d6: No such device
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.104 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap998458b3-d6: No such device
Dec 02 10:10:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:10:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20212 "" "Go-http-client/1.1"
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.129 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.135 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain ceph-mon[298296]: pgmap v382: 177 pgs: 8 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 167 active+clean; 787 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 16 KiB/s rd, 22 MiB/s wr, 36 op/s
Dec 02 10:10:36 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:10:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:10:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:10:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:10:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 02 10:10:36 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:10:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:36.265 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-2880e5f6-e139-4f3f-a855-f230a91f9ae2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2880e5f6-e139-4f3f-a855-f230a91f9ae2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4caf983d-dc6c-4268-9c5b-d4a14993b754, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=998458b3-d6cd-4ecd-850f-289ca92b1da7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:36Z|00459|binding|INFO|Setting lport 998458b3-d6cd-4ecd-850f-289ca92b1da7 up in Southbound
Dec 02 10:10:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:36.270 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 998458b3-d6cd-4ecd-850f-289ca92b1da7 in datapath 2880e5f6-e139-4f3f-a855-f230a91f9ae2 bound to our chassis
Dec 02 10:10:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:36.272 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2880e5f6-e139-4f3f-a855-f230a91f9ae2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:10:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:36.273 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ddeede58-ce2c-469c-a38e-c5bdbd28db48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:36 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:36.559 263406 INFO neutron.agent.linux.ip_lib [None req-e3d44190-c3fe-477f-b41e-ca7c586a4206 - - - - - -] Device tap79d1b462-4e cannot be used as it has no MAC address
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.640 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain kernel: device tap79d1b462-4e entered promiscuous mode
Dec 02 10:10:36 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670236.6443] manager: (tap79d1b462-4e): new Generic device (/org/freedesktop/NetworkManager/Devices/75)
Dec 02 10:10:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:36Z|00460|binding|INFO|Claiming lport 79d1b462-4e0f-4b98-8dd4-56658187af29 for this chassis.
Dec 02 10:10:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:36Z|00461|binding|INFO|79d1b462-4e0f-4b98-8dd4-56658187af29: Claiming unknown
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.646 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.658 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:36Z|00462|binding|INFO|Setting lport 79d1b462-4e0f-4b98-8dd4-56658187af29 ovn-installed in OVS
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.660 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.678 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.701 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:36.718 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:36Z|00463|binding|INFO|Setting lport 79d1b462-4e0f-4b98-8dd4-56658187af29 up in Southbound
Dec 02 10:10:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:36.824 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-0563b4b4-439a-4655-9225-28a24ad09db2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0563b4b4-439a-4655-9225-28a24ad09db2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f39b5ca1adf344dd9239d3d0131792d4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0e1d319-f053-4ffe-b337-109ee69f3933, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=79d1b462-4e0f-4b98-8dd4-56658187af29) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:36.826 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 79d1b462-4e0f-4b98-8dd4-56658187af29 in datapath 0563b4b4-439a-4655-9225-28a24ad09db2 bound to our chassis
Dec 02 10:10:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:36.827 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0563b4b4-439a-4655-9225-28a24ad09db2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:10:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:36.828 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[17120cb9-902c-4ef5-be67-9188b275ca3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:37 np0005541913.localdomain podman[326598]: 
Dec 02 10:10:37 np0005541913.localdomain podman[326598]: 2025-12-02 10:10:37.022560975 +0000 UTC m=+0.087975616 container create c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2880e5f6-e139-4f3f-a855-f230a91f9ae2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: Started libpod-conmon-c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa.scope.
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:37 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e3c6b4d7a0b177b297b4a6dc780781660327bdcf3b0f523d9debbe50858c72e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:37 np0005541913.localdomain podman[326598]: 2025-12-02 10:10:36.991018014 +0000 UTC m=+0.056432695 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:37 np0005541913.localdomain podman[326598]: 2025-12-02 10:10:37.09253281 +0000 UTC m=+0.157947451 container init c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2880e5f6-e139-4f3f-a855-f230a91f9ae2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:10:37 np0005541913.localdomain podman[326598]: 2025-12-02 10:10:37.099170536 +0000 UTC m=+0.164585177 container start c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2880e5f6-e139-4f3f-a855-f230a91f9ae2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326623]: started, version 2.85 cachesize 150
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326623]: DNS service limited to local subnets
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326623]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326623]: warning: no upstream servers configured
Dec 02 10:10:37 np0005541913.localdomain dnsmasq-dhcp[326623]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326623]: read /var/lib/neutron/dhcp/2880e5f6-e139-4f3f-a855-f230a91f9ae2/addn_hosts - 0 addresses
Dec 02 10:10:37 np0005541913.localdomain dnsmasq-dhcp[326623]: read /var/lib/neutron/dhcp/2880e5f6-e139-4f3f-a855-f230a91f9ae2/host
Dec 02 10:10:37 np0005541913.localdomain dnsmasq-dhcp[326623]: read /var/lib/neutron/dhcp/2880e5f6-e139-4f3f-a855-f230a91f9ae2/opts
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:10:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:37.445 263406 INFO neutron.agent.dhcp.agent [None req-8f45a370-4c5a-48bd-a761-104fd5b51ec1 - - - - - -] DHCP configuration for ports {'619e786c-c344-438b-b31b-289b481ea916'} is completed
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:10:37 np0005541913.localdomain podman[326637]: 2025-12-02 10:10:37.505631878 +0000 UTC m=+0.145734054 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:10:37 np0005541913.localdomain podman[326637]: 2025-12-02 10:10:37.518244145 +0000 UTC m=+0.158346321 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:10:37 np0005541913.localdomain podman[326660]: 
Dec 02 10:10:37 np0005541913.localdomain podman[326677]: 2025-12-02 10:10:37.591937778 +0000 UTC m=+0.081100662 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 10:10:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:37.612 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:10:37 np0005541913.localdomain podman[326660]: 2025-12-02 10:10:37.523595967 +0000 UTC m=+0.088418387 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:37 np0005541913.localdomain podman[326677]: 2025-12-02 10:10:37.641115478 +0000 UTC m=+0.130278352 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:10:37 np0005541913.localdomain podman[326660]: 2025-12-02 10:10:37.67080975 +0000 UTC m=+0.235632100 container create 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326623]: exiting on receipt of SIGTERM
Dec 02 10:10:37 np0005541913.localdomain podman[326722]: 2025-12-02 10:10:37.702288939 +0000 UTC m=+0.055604273 container kill c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2880e5f6-e139-4f3f-a855-f230a91f9ae2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: Started libpod-conmon-0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223.scope.
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: libpod-c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa.scope: Deactivated successfully.
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:37 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78a8e6dcaae55003fe5ded5cdde675da092aada4d48b494b157ce714a6ab4dbb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:10:37 np0005541913.localdomain podman[326743]: 2025-12-02 10:10:37.766094119 +0000 UTC m=+0.044339912 container died c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2880e5f6-e139-4f3f-a855-f230a91f9ae2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:10:37 np0005541913.localdomain podman[326660]: 2025-12-02 10:10:37.775052608 +0000 UTC m=+0.339875228 container init 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:10:37 np0005541913.localdomain podman[326660]: 2025-12-02 10:10:37.784200272 +0000 UTC m=+0.349022602 container start 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326766]: started, version 2.85 cachesize 150
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326766]: DNS service limited to local subnets
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326766]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326766]: warning: no upstream servers configured
Dec 02 10:10:37 np0005541913.localdomain dnsmasq-dhcp[326766]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:10:37 np0005541913.localdomain dnsmasq[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/addn_hosts - 0 addresses
Dec 02 10:10:37 np0005541913.localdomain dnsmasq-dhcp[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/host
Dec 02 10:10:37 np0005541913.localdomain dnsmasq-dhcp[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/opts
Dec 02 10:10:37 np0005541913.localdomain podman[326743]: 2025-12-02 10:10:37.854057604 +0000 UTC m=+0.132303377 container remove c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2880e5f6-e139-4f3f-a855-f230a91f9ae2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:37 np0005541913.localdomain systemd[1]: libpod-conmon-c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa.scope: Deactivated successfully.
Dec 02 10:10:37 np0005541913.localdomain kernel: device tap998458b3-d6 left promiscuous mode
Dec 02 10:10:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:37Z|00464|binding|INFO|Releasing lport 998458b3-d6cd-4ecd-850f-289ca92b1da7 from this chassis (sb_readonly=0)
Dec 02 10:10:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:37Z|00465|binding|INFO|Setting lport 998458b3-d6cd-4ecd-850f-289ca92b1da7 down in Southbound
Dec 02 10:10:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:37.894 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:37.915 263406 INFO neutron.agent.dhcp.agent [None req-42995013-df4c-4a2c-8059-f99844bc4895 - - - - - -] DHCP configuration for ports {'1257c5f2-830b-4520-89c5-ef86a571e196'} is completed
Dec 02 10:10:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:37.921 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:37.923 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0e3c6b4d7a0b177b297b4a6dc780781660327bdcf3b0f523d9debbe50858c72e-merged.mount: Deactivated successfully.
Dec 02 10:10:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c28087d52b7261a89e3f0fa0146db9a3070cf621a5308f9abf66bc716b3078fa-userdata-shm.mount: Deactivated successfully.
Dec 02 10:10:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:38.070 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-2880e5f6-e139-4f3f-a855-f230a91f9ae2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2880e5f6-e139-4f3f-a855-f230a91f9ae2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4caf983d-dc6c-4268-9c5b-d4a14993b754, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=998458b3-d6cd-4ecd-850f-289ca92b1da7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:38.072 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 998458b3-d6cd-4ecd-850f-289ca92b1da7 in datapath 2880e5f6-e139-4f3f-a855-f230a91f9ae2 unbound from our chassis
Dec 02 10:10:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:38.074 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2880e5f6-e139-4f3f-a855-f230a91f9ae2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:10:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:38.074 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[1318bcc3-7b23-4545-b43c-c917dad167ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:38 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d2880e5f6\x2de139\x2d4f3f\x2da855\x2df230a91f9ae2.mount: Deactivated successfully.
Dec 02 10:10:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:38.091 263406 INFO neutron.agent.dhcp.agent [None req-bfd7d0b5-41be-47d8-9379-fcbf85a32212 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:38.133 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:38 np0005541913.localdomain ceph-mon[298296]: pgmap v383: 177 pgs: 8 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 167 active+clean; 787 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 16 KiB/s rd, 22 MiB/s wr, 36 op/s
Dec 02 10:10:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "7a4c64a8-f75e-4eb8-8104-489d2e71f23a_bcca2fdb-3954-4db7-bf24-5e38c29448c7", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "7a4c64a8-f75e-4eb8-8104-489d2e71f23a", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:38Z|00466|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:10:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:39.058 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:39 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:39.563 2 INFO neutron.agent.securitygroups_rpc [None req-b887e192-fbe8-4997-9fd8-8fe0e62f2ad3 ffc28dac62f4495c9452fce17050d09a 16ae7f5f159c4b10a1539c2d9b52fce5 - - default default] Security group rule updated ['2409236f-431b-4039-840f-bb40e7858355']
Dec 02 10:10:39 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:10:39 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:39 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:39 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:10:40 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:10:40 np0005541913.localdomain ceph-mon[298296]: pgmap v384: 177 pgs: 177 active+clean; 907 MiB data, 3.1 GiB used, 39 GiB / 42 GiB avail; 15 KiB/s rd, 23 MiB/s wr, 34 op/s
Dec 02 10:10:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:41.075 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:41.130 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e157 e157: 6 total, 6 up, 6 in
Dec 02 10:10:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:42.194 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "5a8eab8a-1e6c-4298-b827-66849539d417", "format": "json"}]: dispatch
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: pgmap v385: 177 pgs: 177 active+clean; 907 MiB data, 3.1 GiB used, 39 GiB / 42 GiB avail; 15 KiB/s rd, 23 MiB/s wr, 34 op/s
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: osdmap e157: 6 total, 6 up, 6 in
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.555160) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670242555241, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2428, "num_deletes": 253, "total_data_size": 4173874, "memory_usage": 4352720, "flush_reason": "Manual Compaction"}
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e158 e158: 6 total, 6 up, 6 in
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670242571360, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2731233, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25980, "largest_seqno": 28403, "table_properties": {"data_size": 2721883, "index_size": 5663, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 22659, "raw_average_key_size": 21, "raw_value_size": 2701971, "raw_average_value_size": 2610, "num_data_blocks": 241, "num_entries": 1035, "num_filter_entries": 1035, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670111, "oldest_key_time": 1764670111, "file_creation_time": 1764670242, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 16247 microseconds, and 7234 cpu microseconds.
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.571415) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2731233 bytes OK
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.571441) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.573691) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.573711) EVENT_LOG_v1 {"time_micros": 1764670242573705, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.573736) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 4162457, prev total WAL file size 4162498, number of live WAL files 2.
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.574779) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2667KB)], [45(14MB)]
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670242574823, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 18099848, "oldest_snapshot_seqno": -1}
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12950 keys, 16965663 bytes, temperature: kUnknown
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670242674088, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 16965663, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16893176, "index_size": 39042, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32389, "raw_key_size": 347789, "raw_average_key_size": 26, "raw_value_size": 16673922, "raw_average_value_size": 1287, "num_data_blocks": 1470, "num_entries": 12950, "num_filter_entries": 12950, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670242, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.674393) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 16965663 bytes
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.676025) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.2 rd, 170.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 14.7 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(12.8) write-amplify(6.2) OK, records in: 13490, records dropped: 540 output_compression: NoCompression
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.676055) EVENT_LOG_v1 {"time_micros": 1764670242676042, "job": 26, "event": "compaction_finished", "compaction_time_micros": 99354, "compaction_time_cpu_micros": 47518, "output_level": 6, "num_output_files": 1, "total_output_size": 16965663, "num_input_records": 13490, "num_output_records": 12950, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670242676512, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670242678999, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.574680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.679096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.679102) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.679105) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.679108) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:10:42 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:10:42.679111) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:10:43 np0005541913.localdomain ceph-mon[298296]: osdmap e158: 6 total, 6 up, 6 in
Dec 02 10:10:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:10:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:10:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:10:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 02 10:10:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:10:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:10:44 np0005541913.localdomain ceph-mon[298296]: pgmap v388: 177 pgs: 177 active+clean; 1.0 GiB data, 3.4 GiB used, 39 GiB / 42 GiB avail; 20 KiB/s rd, 31 MiB/s wr, 44 op/s
Dec 02 10:10:44 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:44.894 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:10:42Z, description=, device_id=d7b55be3-df8b-4a7a-a053-0a870505d24f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b91fa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a2d070>], id=965bd720-26a0-4a5b-8f5d-737e33ccfa28, ip_allocation=immediate, mac_address=fa:16:3e:5f:f6:d1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:10:32Z, description=, dns_domain=, id=0563b4b4-439a-4655-9225-28a24ad09db2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-2025865746-network, port_security_enabled=True, project_id=f39b5ca1adf344dd9239d3d0131792d4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22033, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2548, status=ACTIVE, subnets=['17eb0bb5-c5da-450e-9a0e-764bbc851d5d'], tags=[], tenant_id=f39b5ca1adf344dd9239d3d0131792d4, updated_at=2025-12-02T10:10:34Z, vlan_transparent=None, network_id=0563b4b4-439a-4655-9225-28a24ad09db2, port_security_enabled=False, project_id=f39b5ca1adf344dd9239d3d0131792d4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2583, status=DOWN, tags=[], tenant_id=f39b5ca1adf344dd9239d3d0131792d4, updated_at=2025-12-02T10:10:42Z on network 0563b4b4-439a-4655-9225-28a24ad09db2
Dec 02 10:10:45 np0005541913.localdomain podman[326784]: 2025-12-02 10:10:45.733030506 +0000 UTC m=+0.060976956 container kill 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 10:10:45 np0005541913.localdomain dnsmasq[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/addn_hosts - 1 addresses
Dec 02 10:10:45 np0005541913.localdomain dnsmasq-dhcp[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/host
Dec 02 10:10:45 np0005541913.localdomain dnsmasq-dhcp[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/opts
Dec 02 10:10:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:46.074 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:46.132 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:46.310 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:46.347 263406 INFO neutron.agent.dhcp.agent [None req-93478d3f-813b-4385-8c0e-1773ffb1ec40 - - - - - -] DHCP configuration for ports {'965bd720-26a0-4a5b-8f5d-737e33ccfa28'} is completed
Dec 02 10:10:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e159 e159: 6 total, 6 up, 6 in
Dec 02 10:10:46 np0005541913.localdomain ceph-mon[298296]: pgmap v389: 177 pgs: 177 active+clean; 1.0 GiB data, 3.4 GiB used, 39 GiB / 42 GiB avail; 20 KiB/s rd, 31 MiB/s wr, 44 op/s
Dec 02 10:10:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:10:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:10:46 np0005541913.localdomain ceph-mon[298296]: osdmap e159: 6 total, 6 up, 6 in
Dec 02 10:10:47 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e160 e160: 6 total, 6 up, 6 in
Dec 02 10:10:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "5a8eab8a-1e6c-4298-b827-66849539d417_de2c1857-d908-4600-af3d-2ff1f2d9e3dc", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "5a8eab8a-1e6c-4298-b827-66849539d417", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:10:47 np0005541913.localdomain ceph-mon[298296]: osdmap e160: 6 total, 6 up, 6 in
Dec 02 10:10:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:47.829 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:10:42Z, description=, device_id=d7b55be3-df8b-4a7a-a053-0a870505d24f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990884aa60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990884a2b0>], id=965bd720-26a0-4a5b-8f5d-737e33ccfa28, ip_allocation=immediate, mac_address=fa:16:3e:5f:f6:d1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:10:32Z, description=, dns_domain=, id=0563b4b4-439a-4655-9225-28a24ad09db2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-2025865746-network, port_security_enabled=True, project_id=f39b5ca1adf344dd9239d3d0131792d4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22033, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2548, status=ACTIVE, subnets=['17eb0bb5-c5da-450e-9a0e-764bbc851d5d'], tags=[], tenant_id=f39b5ca1adf344dd9239d3d0131792d4, updated_at=2025-12-02T10:10:34Z, vlan_transparent=None, network_id=0563b4b4-439a-4655-9225-28a24ad09db2, port_security_enabled=False, project_id=f39b5ca1adf344dd9239d3d0131792d4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2583, status=DOWN, tags=[], tenant_id=f39b5ca1adf344dd9239d3d0131792d4, updated_at=2025-12-02T10:10:42Z on network 0563b4b4-439a-4655-9225-28a24ad09db2
Dec 02 10:10:48 np0005541913.localdomain podman[326820]: 2025-12-02 10:10:48.373741897 +0000 UTC m=+0.058966892 container kill 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:48 np0005541913.localdomain dnsmasq[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/addn_hosts - 1 addresses
Dec 02 10:10:48 np0005541913.localdomain dnsmasq-dhcp[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/host
Dec 02 10:10:48 np0005541913.localdomain dnsmasq-dhcp[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/opts
Dec 02 10:10:48 np0005541913.localdomain ceph-mon[298296]: pgmap v391: 177 pgs: 177 active+clean; 1.0 GiB data, 3.4 GiB used, 39 GiB / 42 GiB avail; 14 KiB/s rd, 21 MiB/s wr, 31 op/s
Dec 02 10:10:48 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e161 e161: 6 total, 6 up, 6 in
Dec 02 10:10:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:48.764 263406 INFO neutron.agent.dhcp.agent [None req-ce3a0d39-3ba8-47e1-a6b6-7758e74847cd - - - - - -] DHCP configuration for ports {'965bd720-26a0-4a5b-8f5d-737e33ccfa28'} is completed
Dec 02 10:10:48 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:48.917 2 INFO neutron.agent.securitygroups_rpc [None req-c5737a18-0087-499e-be42-7eb006bdc7a0 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['d54da663-bbdd-4967-b64b-8a9f95f589dd']
Dec 02 10:10:48 np0005541913.localdomain snmpd[69635]: empty variable list in _query
Dec 02 10:10:49 np0005541913.localdomain ceph-mon[298296]: osdmap e161: 6 total, 6 up, 6 in
Dec 02 10:10:49 np0005541913.localdomain ceph-mon[298296]: pgmap v394: 177 pgs: 177 active+clean; 1.1 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 16 KiB/s rd, 23 MiB/s wr, 33 op/s
Dec 02 10:10:49 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "0971658b-39fb-4b1f-bbaf-63a2efed16bf", "format": "json"}]: dispatch
Dec 02 10:10:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:10:50 np0005541913.localdomain systemd[1]: tmp-crun.VFUnNN.mount: Deactivated successfully.
Dec 02 10:10:50 np0005541913.localdomain podman[326841]: 2025-12-02 10:10:50.455877694 +0000 UTC m=+0.095182257 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Dec 02 10:10:50 np0005541913.localdomain podman[326841]: 2025-12-02 10:10:50.468314835 +0000 UTC m=+0.107619368 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:10:50 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:10:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e162 e162: 6 total, 6 up, 6 in
Dec 02 10:10:50 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:10:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:10:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:10:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:10:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 02 10:10:50 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:10:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:51.078 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:51.134 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e163 e163: 6 total, 6 up, 6 in
Dec 02 10:10:51 np0005541913.localdomain ceph-mon[298296]: osdmap e162: 6 total, 6 up, 6 in
Dec 02 10:10:51 np0005541913.localdomain ceph-mon[298296]: pgmap v396: 177 pgs: 177 active+clean; 1.1 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 21 KiB/s rd, 30 MiB/s wr, 44 op/s
Dec 02 10:10:51 np0005541913.localdomain ceph-mon[298296]: osdmap e163: 6 total, 6 up, 6 in
Dec 02 10:10:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1465263749' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:10:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1465263749' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:10:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:10:53 np0005541913.localdomain systemd[1]: tmp-crun.cckbJE.mount: Deactivated successfully.
Dec 02 10:10:53 np0005541913.localdomain podman[326860]: 2025-12-02 10:10:53.450434887 +0000 UTC m=+0.093348299 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 02 10:10:53 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:53.457 2 INFO neutron.agent.securitygroups_rpc [None req-f5ab8c31-1c32-4d96-896a-5e2487d3e658 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['d54da663-bbdd-4967-b64b-8a9f95f589dd', '475d5c6b-fba4-44ef-b012-03f922f307d8', 'c4cadb1e-8d38-4a3c-b1f8-f6d93fbe5968']
Dec 02 10:10:53 np0005541913.localdomain podman[326860]: 2025-12-02 10:10:53.458989905 +0000 UTC m=+0.101903327 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:10:53 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:10:53 np0005541913.localdomain ceph-mon[298296]: pgmap v398: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 82 KiB/s rd, 24 MiB/s wr, 152 op/s
Dec 02 10:10:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:10:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:10:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:10:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:10:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:53.830 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:53 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:53.845 2 INFO neutron.agent.securitygroups_rpc [None req-054f4ed0-bf56-489e-9f2e-7d08aad333fe 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['475d5c6b-fba4-44ef-b012-03f922f307d8', 'c4cadb1e-8d38-4a3c-b1f8-f6d93fbe5968']
Dec 02 10:10:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:10:54 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1771883914' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:10:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:10:54 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1771883914' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:10:54 np0005541913.localdomain podman[326893]: 2025-12-02 10:10:54.602665022 +0000 UTC m=+0.064108739 container kill ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:10:54 np0005541913.localdomain dnsmasq[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/addn_hosts - 0 addresses
Dec 02 10:10:54 np0005541913.localdomain dnsmasq-dhcp[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/host
Dec 02 10:10:54 np0005541913.localdomain dnsmasq-dhcp[324773]: read /var/lib/neutron/dhcp/5f48cce7-247c-4b5d-8287-ac14f7453254/opts
Dec 02 10:10:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:10:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:10:54 np0005541913.localdomain podman[326906]: 2025-12-02 10:10:54.727113689 +0000 UTC m=+0.092021733 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Dec 02 10:10:54 np0005541913.localdomain podman[326906]: 2025-12-02 10:10:54.743152926 +0000 UTC m=+0.108061030 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 10:10:54 np0005541913.localdomain podman[326907]: 2025-12-02 10:10:54.7916898 +0000 UTC m=+0.149017843 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:10:54 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:10:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "0971658b-39fb-4b1f-bbaf-63a2efed16bf_856cf5e4-324a-4059-b9e0-23839724525d", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "0971658b-39fb-4b1f-bbaf-63a2efed16bf", "force": true, "format": "json"}]: dispatch
Dec 02 10:10:54 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1771883914' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:10:54 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1771883914' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:10:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:54Z|00467|binding|INFO|Releasing lport b35a7019-cd13-49ba-ae0b-aa70d3ce3b27 from this chassis (sb_readonly=0)
Dec 02 10:10:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:54.816 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:54Z|00468|binding|INFO|Setting lport b35a7019-cd13-49ba-ae0b-aa70d3ce3b27 down in Southbound
Dec 02 10:10:54 np0005541913.localdomain kernel: device tapb35a7019-cd left promiscuous mode
Dec 02 10:10:54 np0005541913.localdomain podman[326907]: 2025-12-02 10:10:54.823031195 +0000 UTC m=+0.180359198 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:10:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:54.826 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-5f48cce7-247c-4b5d-8287-ac14f7453254', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f48cce7-247c-4b5d-8287-ac14f7453254', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bad680c763640dba71a7865b355817c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=beadeea7-0616-4ea7-b4f9-7f4239a4c055, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=b35a7019-cd13-49ba-ae0b-aa70d3ce3b27) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:54.828 160221 INFO neutron.agent.ovn.metadata.agent [-] Port b35a7019-cd13-49ba-ae0b-aa70d3ce3b27 in datapath 5f48cce7-247c-4b5d-8287-ac14f7453254 unbound from our chassis
Dec 02 10:10:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:54.832 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5f48cce7-247c-4b5d-8287-ac14f7453254, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:10:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:54.834 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5556c035-5726-4174-a7c5-a53f76a7d8d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:54 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:10:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:54.841 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:54.842 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:55.400 263406 INFO neutron.agent.linux.ip_lib [None req-b4c5dd66-fb5c-4d1d-819f-f64986343de9 - - - - - -] Device tapf3b02d29-75 cannot be used as it has no MAC address
Dec 02 10:10:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:55.461 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:55 np0005541913.localdomain kernel: device tapf3b02d29-75 entered promiscuous mode
Dec 02 10:10:55 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670255.4715] manager: (tapf3b02d29-75): new Generic device (/org/freedesktop/NetworkManager/Devices/76)
Dec 02 10:10:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:55Z|00469|binding|INFO|Claiming lport f3b02d29-7542-44d0-a991-a01ec607868c for this chassis.
Dec 02 10:10:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:55Z|00470|binding|INFO|f3b02d29-7542-44d0-a991-a01ec607868c: Claiming unknown
Dec 02 10:10:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:55.473 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:55 np0005541913.localdomain systemd-udevd[326968]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:10:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:55.481 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-d46489f9-a47b-465f-b68c-fdf4256b1786', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d46489f9-a47b-465f-b68c-fdf4256b1786', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c09a5d01-8e4d-42c4-b32a-41401f5c5328, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=f3b02d29-7542-44d0-a991-a01ec607868c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:10:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:55.483 160221 INFO neutron.agent.ovn.metadata.agent [-] Port f3b02d29-7542-44d0-a991-a01ec607868c in datapath d46489f9-a47b-465f-b68c-fdf4256b1786 bound to our chassis
Dec 02 10:10:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:55.484 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d46489f9-a47b-465f-b68c-fdf4256b1786 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:10:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:10:55.484 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a5f33d-b37a-4949-a3f8-6970bda5fdb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:10:55 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf3b02d29-75: No such device
Dec 02 10:10:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:55Z|00471|binding|INFO|Setting lport f3b02d29-7542-44d0-a991-a01ec607868c ovn-installed in OVS
Dec 02 10:10:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:55Z|00472|binding|INFO|Setting lport f3b02d29-7542-44d0-a991-a01ec607868c up in Southbound
Dec 02 10:10:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:55.515 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:55 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf3b02d29-75: No such device
Dec 02 10:10:55 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf3b02d29-75: No such device
Dec 02 10:10:55 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf3b02d29-75: No such device
Dec 02 10:10:55 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf3b02d29-75: No such device
Dec 02 10:10:55 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf3b02d29-75: No such device
Dec 02 10:10:55 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf3b02d29-75: No such device
Dec 02 10:10:55 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf3b02d29-75: No such device
Dec 02 10:10:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:55.553 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:55.580 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:55 np0005541913.localdomain ceph-mon[298296]: pgmap v399: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 26 KiB/s wr, 99 op/s
Dec 02 10:10:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:56.081 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:56.136 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:10:56 np0005541913.localdomain podman[327039]: 
Dec 02 10:10:56 np0005541913.localdomain podman[327039]: 2025-12-02 10:10:56.433442331 +0000 UTC m=+0.090232496 container create 804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:56 np0005541913.localdomain systemd[1]: Started libpod-conmon-804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01.scope.
Dec 02 10:10:56 np0005541913.localdomain podman[327039]: 2025-12-02 10:10:56.387832345 +0000 UTC m=+0.044622510 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:56 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e164 e164: 6 total, 6 up, 6 in
Dec 02 10:10:56 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79ccbfaaa8ec4c595ccd6601936f664b8c09718d9046a5213589c5746d0d8e14/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:56 np0005541913.localdomain podman[327039]: 2025-12-02 10:10:56.532795678 +0000 UTC m=+0.189585873 container init 804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:10:56 np0005541913.localdomain podman[327039]: 2025-12-02 10:10:56.548881147 +0000 UTC m=+0.205671322 container start 804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:10:56 np0005541913.localdomain dnsmasq[327058]: started, version 2.85 cachesize 150
Dec 02 10:10:56 np0005541913.localdomain dnsmasq[327058]: DNS service limited to local subnets
Dec 02 10:10:56 np0005541913.localdomain dnsmasq[327058]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:56 np0005541913.localdomain dnsmasq[327058]: warning: no upstream servers configured
Dec 02 10:10:56 np0005541913.localdomain dnsmasq-dhcp[327058]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:10:56 np0005541913.localdomain dnsmasq[327058]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/addn_hosts - 0 addresses
Dec 02 10:10:56 np0005541913.localdomain dnsmasq-dhcp[327058]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/host
Dec 02 10:10:56 np0005541913.localdomain dnsmasq-dhcp[327058]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/opts
Dec 02 10:10:56 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:56.725 263406 INFO neutron.agent.dhcp.agent [None req-f35455d8-d91f-4960-bdad-e090c82bb305 - - - - - -] DHCP configuration for ports {'7df0cc5b-0d4a-48fd-ae58-4a75ac1c28cb'} is completed
Dec 02 10:10:56 np0005541913.localdomain dnsmasq[327058]: exiting on receipt of SIGTERM
Dec 02 10:10:56 np0005541913.localdomain podman[327076]: 2025-12-02 10:10:56.932371837 +0000 UTC m=+0.064211223 container kill 804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:10:56 np0005541913.localdomain systemd[1]: libpod-804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01.scope: Deactivated successfully.
Dec 02 10:10:57 np0005541913.localdomain podman[327090]: 2025-12-02 10:10:57.011810863 +0000 UTC m=+0.059665820 container died 804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01-userdata-shm.mount: Deactivated successfully.
Dec 02 10:10:57 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-79ccbfaaa8ec4c595ccd6601936f664b8c09718d9046a5213589c5746d0d8e14-merged.mount: Deactivated successfully.
Dec 02 10:10:57 np0005541913.localdomain podman[327090]: 2025-12-02 10:10:57.066018489 +0000 UTC m=+0.113873416 container remove 804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:10:57 np0005541913.localdomain systemd[1]: libpod-conmon-804b260e4a7ae9950c52ed2ac004e40a0edd83e1847320032f14759403604a01.scope: Deactivated successfully.
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: osdmap e164: 6 total, 6 up, 6 in
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1722968383' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1722968383' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:10:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e165 e165: 6 total, 6 up, 6 in
Dec 02 10:10:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:57Z|00473|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:10:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:57.948 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:10:58Z|00474|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:10:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:58.538 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:10:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:10:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:10:58 np0005541913.localdomain ceph-mon[298296]: pgmap v401: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 26 KiB/s wr, 100 op/s
Dec 02 10:10:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1722968383' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:10:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1722968383' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:10:58 np0005541913.localdomain ceph-mon[298296]: osdmap e165: 6 total, 6 up, 6 in
Dec 02 10:10:58 np0005541913.localdomain podman[327167]: 
Dec 02 10:10:58 np0005541913.localdomain podman[327167]: 2025-12-02 10:10:58.788814249 +0000 UTC m=+0.078663048 container create 429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:10:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:58.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:58.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:10:58 np0005541913.localdomain systemd[1]: Started libpod-conmon-429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b.scope.
Dec 02 10:10:58 np0005541913.localdomain systemd[1]: tmp-crun.8bmdDT.mount: Deactivated successfully.
Dec 02 10:10:58 np0005541913.localdomain podman[327167]: 2025-12-02 10:10:58.746828401 +0000 UTC m=+0.036677240 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:10:58 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:10:58 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b03b76d06f065d064f053ee52b9bc33a267b704dd60e482f157855766fd952e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:10:58 np0005541913.localdomain podman[327167]: 2025-12-02 10:10:58.869764366 +0000 UTC m=+0.159613165 container init 429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:58 np0005541913.localdomain podman[327167]: 2025-12-02 10:10:58.894074464 +0000 UTC m=+0.183923273 container start 429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:10:58 np0005541913.localdomain dnsmasq[327200]: started, version 2.85 cachesize 150
Dec 02 10:10:58 np0005541913.localdomain dnsmasq[327200]: DNS service limited to local subnets
Dec 02 10:10:58 np0005541913.localdomain dnsmasq[327200]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:10:58 np0005541913.localdomain dnsmasq[327200]: warning: no upstream servers configured
Dec 02 10:10:58 np0005541913.localdomain dnsmasq-dhcp[327200]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:10:58 np0005541913.localdomain dnsmasq-dhcp[327200]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:10:58 np0005541913.localdomain dnsmasq[327200]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/addn_hosts - 0 addresses
Dec 02 10:10:58 np0005541913.localdomain dnsmasq-dhcp[327200]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/host
Dec 02 10:10:58 np0005541913.localdomain dnsmasq-dhcp[327200]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/opts
Dec 02 10:10:58 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:10:58.928 2 INFO neutron.agent.securitygroups_rpc [None req-84a55dcc-5035-483d-9948-fd4c09f198da 57832728fce14260b03b0f06122d5897 aae5e2dae10d49c38d5d63835c7677e3 - - default default] Security group member updated ['e8ea3695-3b79-4d4a-ada7-8279c4be34cf']
Dec 02 10:10:59 np0005541913.localdomain dnsmasq[324773]: exiting on receipt of SIGTERM
Dec 02 10:10:59 np0005541913.localdomain podman[327202]: 2025-12-02 10:10:59.018363696 +0000 UTC m=+0.049906550 container kill ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:10:59 np0005541913.localdomain systemd[1]: libpod-ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829.scope: Deactivated successfully.
Dec 02 10:10:59 np0005541913.localdomain podman[327216]: 2025-12-02 10:10:59.094939477 +0000 UTC m=+0.059845386 container died ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:10:59 np0005541913.localdomain podman[327216]: 2025-12-02 10:10:59.127145035 +0000 UTC m=+0.092050894 container cleanup ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:10:59 np0005541913.localdomain systemd[1]: libpod-conmon-ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829.scope: Deactivated successfully.
Dec 02 10:10:59 np0005541913.localdomain podman[327217]: 2025-12-02 10:10:59.180933448 +0000 UTC m=+0.141741758 container remove ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5f48cce7-247c-4b5d-8287-ac14f7453254, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:10:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:59.195 263406 INFO neutron.agent.dhcp.agent [None req-5b768297-a8ca-412b-8751-d245a5ad872e - - - - - -] DHCP configuration for ports {'7df0cc5b-0d4a-48fd-ae58-4a75ac1c28cb', 'f3b02d29-7542-44d0-a991-a01ec607868c'} is completed
Dec 02 10:10:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:59.224 263406 INFO neutron.agent.dhcp.agent [None req-9f4ce02d-8b26-44d8-86a0-6a469b69a03e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:10:59.225 263406 INFO neutron.agent.dhcp.agent [None req-9f4ce02d-8b26-44d8-86a0-6a469b69a03e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:10:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "299defe1-3a6d-4652-876c-cda1688f998a", "format": "json"}]: dispatch
Dec 02 10:10:59 np0005541913.localdomain systemd[1]: tmp-crun.Zz9J8Y.mount: Deactivated successfully.
Dec 02 10:10:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-615057096d3c152d4d1f671a5bd383cdd6ec5d4ac33268b4ace59e5fc7761d1b-merged.mount: Deactivated successfully.
Dec 02 10:10:59 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ef222e4c070887caa7adf3962619919fb8d0edaea9899146270f130a7f3ca829-userdata-shm.mount: Deactivated successfully.
Dec 02 10:10:59 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d5f48cce7\x2d247c\x2d4b5d\x2d8287\x2dac14f7453254.mount: Deactivated successfully.
Dec 02 10:10:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:59.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:10:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:59.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:10:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:10:59.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:11:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:00.010 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:11:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:00.011 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:11:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:00.011 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:11:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:00.011 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:11:00 np0005541913.localdomain ceph-mon[298296]: pgmap v403: 177 pgs: 177 active+clean; 148 MiB data, 910 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 61 KiB/s wr, 136 op/s
Dec 02 10:11:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1955612142' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1955612142' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:11:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:00 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.084 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.137 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.773 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.812 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.812 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.813 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.835 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.835 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.836 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.836 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:11:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:01.836 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:11:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:11:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3426892413' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.304 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.373 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.374 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:11:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:02Z|00475|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.602 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.625 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.627 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11199MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.627 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.628 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:11:02 np0005541913.localdomain ceph-mon[298296]: pgmap v404: 177 pgs: 177 active+clean; 148 MiB data, 910 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 38 KiB/s wr, 49 op/s
Dec 02 10:11:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3426892413' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.716 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.716 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.716 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:11:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:11:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/383165306' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:11:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/383165306' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:02.790 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:11:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:03.054 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:11:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:03.054 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:11:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:03.055 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:11:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:11:03 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2275295108' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:11:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:03.221 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:11:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:03.228 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:11:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:03.251 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:11:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:03.254 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:11:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:03.254 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:11:03 np0005541913.localdomain dnsmasq[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/addn_hosts - 0 addresses
Dec 02 10:11:03 np0005541913.localdomain dnsmasq-dhcp[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/host
Dec 02 10:11:03 np0005541913.localdomain dnsmasq-dhcp[325907]: read /var/lib/neutron/dhcp/39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07/opts
Dec 02 10:11:03 np0005541913.localdomain podman[327306]: 2025-12-02 10:11:03.306341047 +0000 UTC m=+0.054776990 container kill 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:03.491 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:03Z|00476|binding|INFO|Releasing lport 32986807-4a62-4af8-ad03-9336f56fbec0 from this chassis (sb_readonly=0)
Dec 02 10:11:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:03Z|00477|binding|INFO|Setting lport 32986807-4a62-4af8-ad03-9336f56fbec0 down in Southbound
Dec 02 10:11:03 np0005541913.localdomain kernel: device tap32986807-4a left promiscuous mode
Dec 02 10:11:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:03.511 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:03.634 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dffef2e74844a7ebb6ee68826fb7e57', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d25b6f5f-b086-4558-a0fb-fc54d0ecba34, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=32986807-4a62-4af8-ad03-9336f56fbec0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:03.636 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 32986807-4a62-4af8-ad03-9336f56fbec0 in datapath 39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07 unbound from our chassis
Dec 02 10:11:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:03.639 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:11:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:03.641 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e720ea-1510-4332-860f-3a5d13ecffa6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/383165306' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/383165306' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2275295108' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:11:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:11:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:11:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:11:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:11:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:11:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:11:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:11:04 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:11:04 np0005541913.localdomain podman[327332]: 2025-12-02 10:11:04.448087634 +0000 UTC m=+0.077617290 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:04 np0005541913.localdomain podman[327332]: 2025-12-02 10:11:04.461214073 +0000 UTC m=+0.090743759 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:04 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:11:04 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 02 10:11:04 np0005541913.localdomain dnsmasq[325907]: exiting on receipt of SIGTERM
Dec 02 10:11:04 np0005541913.localdomain podman[327361]: 2025-12-02 10:11:04.58002794 +0000 UTC m=+0.065025084 container kill 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:11:04 np0005541913.localdomain systemd[1]: libpod-9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c.scope: Deactivated successfully.
Dec 02 10:11:04 np0005541913.localdomain podman[327374]: 2025-12-02 10:11:04.662331963 +0000 UTC m=+0.063187724 container died 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "299defe1-3a6d-4652-876c-cda1688f998a_1df72d22-e70e-4f6b-877d-3a1cb60db11a", "force": true, "format": "json"}]: dispatch
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "299defe1-3a6d-4652-876c-cda1688f998a", "force": true, "format": "json"}]: dispatch
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: pgmap v405: 177 pgs: 177 active+clean; 149 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 78 KiB/s wr, 98 op/s
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/780445321' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/780445321' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:04 np0005541913.localdomain podman[327374]: 2025-12-02 10:11:04.703008008 +0000 UTC m=+0.103863699 container cleanup 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:11:04 np0005541913.localdomain systemd[1]: libpod-conmon-9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c.scope: Deactivated successfully.
Dec 02 10:11:04 np0005541913.localdomain podman[327375]: 2025-12-02 10:11:04.729989656 +0000 UTC m=+0.127443677 container remove 9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39b95b79-8fd6-45a1-b6ac-c6ee2cb0dc07, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:04.756 263406 INFO neutron.agent.dhcp.agent [None req-3890ab30-d1db-4803-bd8c-1221ebad6a83 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:04.988 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:05Z|00478|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:11:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:05.204 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:05.269 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:11:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-47b13ae10e11528f1ef963d551ebcf95dfcf3157f75d67b00cb1cbdcb5fd574f-merged.mount: Deactivated successfully.
Dec 02 10:11:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e7cbac2b42ae8161add606bed32cbd0f4b194a5a786c9dccdd0e6a6796ef82c-userdata-shm.mount: Deactivated successfully.
Dec 02 10:11:05 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d39b95b79\x2d8fd6\x2d45a1\x2db6ac\x2dc6ee2cb0dc07.mount: Deactivated successfully.
Dec 02 10:11:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:05.822 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:11:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:05.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:11:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:11:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:11:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:06.090 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:11:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159845 "" "Go-http-client/1.1"
Dec 02 10:11:06 np0005541913.localdomain ceph-mon[298296]: pgmap v406: 177 pgs: 177 active+clean; 149 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 73 KiB/s wr, 90 op/s
Dec 02 10:11:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:06.138 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:11:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20214 "" "Go-http-client/1.1"
Dec 02 10:11:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e166 e166: 6 total, 6 up, 6 in
Dec 02 10:11:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:06.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:11:07 np0005541913.localdomain ceph-mon[298296]: osdmap e166: 6 total, 6 up, 6 in
Dec 02 10:11:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/690375908' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/690375908' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3661005679' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:11:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e167 e167: 6 total, 6 up, 6 in
Dec 02 10:11:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:07.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:11:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:11:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:11:08 np0005541913.localdomain podman[327403]: 2025-12-02 10:11:08.473253711 +0000 UTC m=+0.083892587 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:11:08 np0005541913.localdomain podman[327403]: 2025-12-02 10:11:08.481324775 +0000 UTC m=+0.091963641 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:11:08 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4217679944' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:11:08 np0005541913.localdomain podman[327404]: 2025-12-02 10:11:08.539349421 +0000 UTC m=+0.148210360 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: pgmap v408: 177 pgs: 177 active+clean; 149 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 65 KiB/s wr, 81 op/s
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: osdmap e167: 6 total, 6 up, 6 in
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1405474493' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:11:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4217679944' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:11:08 np0005541913.localdomain podman[327404]: 2025-12-02 10:11:08.588576374 +0000 UTC m=+0.197437263 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:08 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:11:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e168 e168: 6 total, 6 up, 6 in
Dec 02 10:11:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:11:10 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:10.483 263406 INFO neutron.agent.linux.ip_lib [None req-cd73f1c7-b3e6-4c3a-83ee-0455e30c70d9 - - - - - -] Device tapa910a553-85 cannot be used as it has no MAC address
Dec 02 10:11:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:10.552 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:10 np0005541913.localdomain kernel: device tapa910a553-85 entered promiscuous mode
Dec 02 10:11:10 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670270.5605] manager: (tapa910a553-85): new Generic device (/org/freedesktop/NetworkManager/Devices/77)
Dec 02 10:11:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:10.562 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:10Z|00479|binding|INFO|Claiming lport a910a553-85b1-4284-b87b-d67a0455f7a3 for this chassis.
Dec 02 10:11:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:10Z|00480|binding|INFO|a910a553-85b1-4284-b87b-d67a0455f7a3: Claiming unknown
Dec 02 10:11:10 np0005541913.localdomain systemd-udevd[327461]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:11:10 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa910a553-85: No such device
Dec 02 10:11:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:10Z|00481|binding|INFO|Setting lport a910a553-85b1-4284-b87b-d67a0455f7a3 ovn-installed in OVS
Dec 02 10:11:10 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa910a553-85: No such device
Dec 02 10:11:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:10.602 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:10 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa910a553-85: No such device
Dec 02 10:11:10 np0005541913.localdomain ceph-mon[298296]: pgmap v410: 177 pgs: 177 active+clean; 149 MiB data, 947 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 59 KiB/s wr, 75 op/s
Dec 02 10:11:10 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "aacf1e5d-1b53-42f1-b3a7-45f0acb43c13_a93da211-cd1e-4fb4-ab83-001318ab16bf", "force": true, "format": "json"}]: dispatch
Dec 02 10:11:10 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "snap_name": "aacf1e5d-1b53-42f1-b3a7-45f0acb43c13", "force": true, "format": "json"}]: dispatch
Dec 02 10:11:10 np0005541913.localdomain ceph-mon[298296]: osdmap e168: 6 total, 6 up, 6 in
Dec 02 10:11:10 np0005541913.localdomain ceph-mon[298296]: mgrmap e49: np0005541914.lljzmk(active, since 11m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:11:10 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa910a553-85: No such device
Dec 02 10:11:10 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa910a553-85: No such device
Dec 02 10:11:10 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa910a553-85: No such device
Dec 02 10:11:10 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa910a553-85: No such device
Dec 02 10:11:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e169 e169: 6 total, 6 up, 6 in
Dec 02 10:11:10 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapa910a553-85: No such device
Dec 02 10:11:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:10.641 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:10.678 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:10Z|00482|binding|INFO|Setting lport a910a553-85b1-4284-b87b-d67a0455f7a3 up in Southbound
Dec 02 10:11:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:10.959 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8eea084241c14c5d9a6cc0d912041a21', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30cf73fc-9798-4d84-a408-4d3ceadffb42, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=a910a553-85b1-4284-b87b-d67a0455f7a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:10.961 160221 INFO neutron.agent.ovn.metadata.agent [-] Port a910a553-85b1-4284-b87b-d67a0455f7a3 in datapath b2aacd19-6fe6-44f4-8d3d-5e657d287b5b bound to our chassis
Dec 02 10:11:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:10.963 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b2aacd19-6fe6-44f4-8d3d-5e657d287b5b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:11:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:10.964 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[490a8496-1c48-4f63-9f9f-549e5afe87a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:11.090 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:11.140 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:11 np0005541913.localdomain podman[327532]: 
Dec 02 10:11:11 np0005541913.localdomain podman[327532]: 2025-12-02 10:11:11.566244296 +0000 UTC m=+0.101286201 container create ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:11:11 np0005541913.localdomain systemd[1]: Started libpod-conmon-ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647.scope.
Dec 02 10:11:11 np0005541913.localdomain podman[327532]: 2025-12-02 10:11:11.516458869 +0000 UTC m=+0.051500824 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:11:11 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:11:11 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1165e8d7b2376bb929e0428192dfe49a187f4a8ad512dcdd1d564a09da96bcf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:11:11 np0005541913.localdomain podman[327532]: 2025-12-02 10:11:11.641787269 +0000 UTC m=+0.176829184 container init ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e170 e170: 6 total, 6 up, 6 in
Dec 02 10:11:11 np0005541913.localdomain podman[327532]: 2025-12-02 10:11:11.654644341 +0000 UTC m=+0.189686256 container start ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:11:11 np0005541913.localdomain ceph-mon[298296]: osdmap e169: 6 total, 6 up, 6 in
Dec 02 10:11:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2736016088' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2736016088' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:11 np0005541913.localdomain dnsmasq[327550]: started, version 2.85 cachesize 150
Dec 02 10:11:11 np0005541913.localdomain dnsmasq[327550]: DNS service limited to local subnets
Dec 02 10:11:11 np0005541913.localdomain dnsmasq[327550]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:11:11 np0005541913.localdomain dnsmasq[327550]: warning: no upstream servers configured
Dec 02 10:11:11 np0005541913.localdomain dnsmasq-dhcp[327550]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:11:11 np0005541913.localdomain dnsmasq[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/addn_hosts - 0 addresses
Dec 02 10:11:11 np0005541913.localdomain dnsmasq-dhcp[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/host
Dec 02 10:11:11 np0005541913.localdomain dnsmasq-dhcp[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/opts
Dec 02 10:11:11 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:11.937 263406 INFO neutron.agent.dhcp.agent [None req-2c291abe-bd54-4429-9cd1-9696b4ea9169 - - - - - -] DHCP configuration for ports {'8fbe0409-bc61-4d2a-8af9-ace603962489'} is completed
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: pgmap v413: 177 pgs: 177 active+clean; 149 MiB data, 947 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 32 KiB/s wr, 46 op/s
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: osdmap e170: 6 total, 6 up, 6 in
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3619384774' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2917294181' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:11:12 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e171 e171: 6 total, 6 up, 6 in
Dec 02 10:11:13 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:11:13 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:11:13 np0005541913.localdomain ceph-mon[298296]: osdmap e171: 6 total, 6 up, 6 in
Dec 02 10:11:13 np0005541913.localdomain ceph-mon[298296]: pgmap v416: 177 pgs: 177 active+clean; 149 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 108 KiB/s wr, 99 op/s
Dec 02 10:11:13 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "format": "json"}]: dispatch
Dec 02 10:11:13 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "45fda55f-a67b-4a03-8e83-17717dd47f28", "force": true, "format": "json"}]: dispatch
Dec 02 10:11:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:14.138 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:11:13Z, description=, device_id=be2bd9ee-1025-4bde-b6f9-05c48824f4be, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b36670>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908813f70>], id=5f55fbe7-c9d6-4e59-88a0-e8f30657e90b, ip_allocation=immediate, mac_address=fa:16:3e:f2:41:9c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:11:04Z, description=, dns_domain=, id=b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1013659582, port_security_enabled=True, project_id=8eea084241c14c5d9a6cc0d912041a21, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=24948, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2632, status=ACTIVE, subnets=['82575f4f-e89e-4800-8beb-d49b38e1570e'], tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:11:08Z, vlan_transparent=None, network_id=b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, port_security_enabled=False, project_id=8eea084241c14c5d9a6cc0d912041a21, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2649, status=DOWN, tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:11:13Z on network b2aacd19-6fe6-44f4-8d3d-5e657d287b5b
Dec 02 10:11:14 np0005541913.localdomain dnsmasq[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/addn_hosts - 1 addresses
Dec 02 10:11:14 np0005541913.localdomain dnsmasq-dhcp[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/host
Dec 02 10:11:14 np0005541913.localdomain podman[327569]: 2025-12-02 10:11:14.361814625 +0000 UTC m=+0.065541178 container kill ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:11:14 np0005541913.localdomain dnsmasq-dhcp[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/opts
Dec 02 10:11:14 np0005541913.localdomain systemd[1]: tmp-crun.36Gsyd.mount: Deactivated successfully.
Dec 02 10:11:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:14.615 263406 INFO neutron.agent.dhcp.agent [None req-0c49c027-4db8-413b-9998-88181ac76b53 - - - - - -] DHCP configuration for ports {'5f55fbe7-c9d6-4e59-88a0-e8f30657e90b'} is completed
Dec 02 10:11:14 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e172 e172: 6 total, 6 up, 6 in
Dec 02 10:11:15 np0005541913.localdomain ceph-mon[298296]: osdmap e172: 6 total, 6 up, 6 in
Dec 02 10:11:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:11:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:11:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:11:15 np0005541913.localdomain ceph-mon[298296]: pgmap v418: 177 pgs: 177 active+clean; 149 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 95 KiB/s wr, 87 op/s
Dec 02 10:11:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:16.105 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:16.142 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e173 e173: 6 total, 6 up, 6 in
Dec 02 10:11:16 np0005541913.localdomain ceph-mon[298296]: mgrmap e50: np0005541914.lljzmk(active, since 11m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:11:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3191412397' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3191412397' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:16 np0005541913.localdomain ceph-mon[298296]: osdmap e173: 6 total, 6 up, 6 in
Dec 02 10:11:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e174 e174: 6 total, 6 up, 6 in
Dec 02 10:11:17 np0005541913.localdomain ceph-mon[298296]: pgmap v420: 177 pgs: 177 active+clean; 149 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 78 KiB/s wr, 72 op/s
Dec 02 10:11:18 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:18Z|00483|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:11:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:18.049 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:18.537 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:11:13Z, description=, device_id=be2bd9ee-1025-4bde-b6f9-05c48824f4be, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908858490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908858100>], id=5f55fbe7-c9d6-4e59-88a0-e8f30657e90b, ip_allocation=immediate, mac_address=fa:16:3e:f2:41:9c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:11:04Z, description=, dns_domain=, id=b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1013659582, port_security_enabled=True, project_id=8eea084241c14c5d9a6cc0d912041a21, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=24948, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2632, status=ACTIVE, subnets=['82575f4f-e89e-4800-8beb-d49b38e1570e'], tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:11:08Z, vlan_transparent=None, network_id=b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, port_security_enabled=False, project_id=8eea084241c14c5d9a6cc0d912041a21, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2649, status=DOWN, tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:11:13Z on network b2aacd19-6fe6-44f4-8d3d-5e657d287b5b
Dec 02 10:11:18 np0005541913.localdomain dnsmasq[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/addn_hosts - 1 addresses
Dec 02 10:11:18 np0005541913.localdomain podman[327608]: 2025-12-02 10:11:18.730051896 +0000 UTC m=+0.056118126 container kill ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:18 np0005541913.localdomain dnsmasq-dhcp[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/host
Dec 02 10:11:18 np0005541913.localdomain dnsmasq-dhcp[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/opts
Dec 02 10:11:18 np0005541913.localdomain ceph-mon[298296]: osdmap e174: 6 total, 6 up, 6 in
Dec 02 10:11:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:19.880 263406 INFO neutron.agent.dhcp.agent [None req-8ed8ad51-32e2-4ae0-ac77-d9dc9cd94d64 - - - - - -] DHCP configuration for ports {'5f55fbe7-c9d6-4e59-88a0-e8f30657e90b'} is completed
Dec 02 10:11:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2016593782' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2016593782' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:19 np0005541913.localdomain ceph-mon[298296]: pgmap v422: 177 pgs: 177 active+clean; 149 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 40 KiB/s wr, 88 op/s
Dec 02 10:11:20 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:11:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:11:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:11:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:11:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 02 10:11:20 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:11:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:21.141 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:21.145 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:11:21 np0005541913.localdomain podman[327630]: 2025-12-02 10:11:21.451904499 +0000 UTC m=+0.092001782 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:11:21 np0005541913.localdomain podman[327630]: 2025-12-02 10:11:21.488703201 +0000 UTC m=+0.128800484 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 02 10:11:21 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:11:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e175 e175: 6 total, 6 up, 6 in
Dec 02 10:11:22 np0005541913.localdomain ceph-mon[298296]: pgmap v423: 177 pgs: 177 active+clean; 149 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 38 KiB/s wr, 83 op/s
Dec 02 10:11:22 np0005541913.localdomain ceph-mon[298296]: osdmap e175: 6 total, 6 up, 6 in
Dec 02 10:11:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:11:24 np0005541913.localdomain systemd[1]: tmp-crun.xkkSdH.mount: Deactivated successfully.
Dec 02 10:11:24 np0005541913.localdomain podman[327649]: 2025-12-02 10:11:24.448910165 +0000 UTC m=+0.088772406 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 10:11:24 np0005541913.localdomain podman[327649]: 2025-12-02 10:11:24.452764588 +0000 UTC m=+0.092626849 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 02 10:11:24 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:11:24 np0005541913.localdomain ceph-mon[298296]: pgmap v425: 177 pgs: 177 active+clean; 149 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 72 KiB/s rd, 66 KiB/s wr, 104 op/s
Dec 02 10:11:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:11:25 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:11:25 np0005541913.localdomain systemd[1]: tmp-crun.vfLQy5.mount: Deactivated successfully.
Dec 02 10:11:25 np0005541913.localdomain podman[327668]: 2025-12-02 10:11:25.445835162 +0000 UTC m=+0.080407854 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, version=9.6, io.openshift.expose-services=)
Dec 02 10:11:25 np0005541913.localdomain podman[327668]: 2025-12-02 10:11:25.458034668 +0000 UTC m=+0.092607400 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Dec 02 10:11:25 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:11:25 np0005541913.localdomain podman[327669]: 2025-12-02 10:11:25.552447533 +0000 UTC m=+0.183699045 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:11:25 np0005541913.localdomain podman[327669]: 2025-12-02 10:11:25.607187363 +0000 UTC m=+0.238438955 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:11:25 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:11:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:11:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:11:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:26.145 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e176 e176: 6 total, 6 up, 6 in
Dec 02 10:11:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:11:26 np0005541913.localdomain ceph-mon[298296]: pgmap v426: 177 pgs: 177 active+clean; 149 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 54 KiB/s wr, 86 op/s
Dec 02 10:11:26 np0005541913.localdomain ceph-mon[298296]: osdmap e176: 6 total, 6 up, 6 in
Dec 02 10:11:28 np0005541913.localdomain dnsmasq[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/addn_hosts - 0 addresses
Dec 02 10:11:28 np0005541913.localdomain dnsmasq-dhcp[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/host
Dec 02 10:11:28 np0005541913.localdomain podman[327728]: 2025-12-02 10:11:28.228630912 +0000 UTC m=+0.067169611 container kill 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:11:28 np0005541913.localdomain dnsmasq-dhcp[326766]: read /var/lib/neutron/dhcp/0563b4b4-439a-4655-9225-28a24ad09db2/opts
Dec 02 10:11:28 np0005541913.localdomain ceph-mon[298296]: pgmap v428: 177 pgs: 177 active+clean; 149 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 24 KiB/s wr, 19 op/s
Dec 02 10:11:28 np0005541913.localdomain sudo[327748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:11:28 np0005541913.localdomain sudo[327748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:11:28 np0005541913.localdomain sudo[327748]: pam_unix(sudo:session): session closed for user root
Dec 02 10:11:29 np0005541913.localdomain sudo[327766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 02 10:11:29 np0005541913.localdomain sudo[327766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:11:29 np0005541913.localdomain sudo[327766]: pam_unix(sudo:session): session closed for user root
Dec 02 10:11:29 np0005541913.localdomain sudo[327805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:11:29 np0005541913.localdomain sudo[327805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:11:29 np0005541913.localdomain sudo[327805]: pam_unix(sudo:session): session closed for user root
Dec 02 10:11:29 np0005541913.localdomain sudo[327823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:11:29 np0005541913.localdomain sudo[327823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:11:30 np0005541913.localdomain sudo[327823]: pam_unix(sudo:session): session closed for user root
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: pgmap v429: 177 pgs: 177 active+clean; 149 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 41 KiB/s wr, 22 op/s
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:11:30 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:11:30 np0005541913.localdomain sudo[327873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:11:30 np0005541913.localdomain sudo[327873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:11:30 np0005541913.localdomain sudo[327873]: pam_unix(sudo:session): session closed for user root
Dec 02 10:11:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:31.148 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:31 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:31Z|00484|binding|INFO|Releasing lport 79d1b462-4e0f-4b98-8dd4-56658187af29 from this chassis (sb_readonly=0)
Dec 02 10:11:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:31.265 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:31 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:31Z|00485|binding|INFO|Setting lport 79d1b462-4e0f-4b98-8dd4-56658187af29 down in Southbound
Dec 02 10:11:31 np0005541913.localdomain kernel: device tap79d1b462-4e left promiscuous mode
Dec 02 10:11:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:31.288 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:11:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:11:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:11:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:11:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:11:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 02 10:11:31 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:31.713 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-0563b4b4-439a-4655-9225-28a24ad09db2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0563b4b4-439a-4655-9225-28a24ad09db2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f39b5ca1adf344dd9239d3d0131792d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0e1d319-f053-4ffe-b337-109ee69f3933, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=79d1b462-4e0f-4b98-8dd4-56658187af29) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:31 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:31.715 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 79d1b462-4e0f-4b98-8dd4-56658187af29 in datapath 0563b4b4-439a-4655-9225-28a24ad09db2 unbound from our chassis
Dec 02 10:11:31 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:31.718 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0563b4b4-439a-4655-9225-28a24ad09db2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:11:31 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:31.719 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[30bff567-3fc5-4832-a3ce-646901b9ee41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:32 np0005541913.localdomain ceph-mon[298296]: pgmap v430: 177 pgs: 177 active+clean; 149 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 34 KiB/s wr, 18 op/s
Dec 02 10:11:32 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:11:32 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:11:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:11:33 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:33.000 263406 INFO neutron.agent.linux.ip_lib [None req-8c8c37d5-6503-49fe-bbd7-4b9d93fab532 - - - - - -] Device tapf7cbdc65-9f cannot be used as it has no MAC address
Dec 02 10:11:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:33.018 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:33 np0005541913.localdomain kernel: device tapf7cbdc65-9f entered promiscuous mode
Dec 02 10:11:33 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670293.0243] manager: (tapf7cbdc65-9f): new Generic device (/org/freedesktop/NetworkManager/Devices/78)
Dec 02 10:11:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:33.025 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:33 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:33Z|00486|binding|INFO|Claiming lport f7cbdc65-9f74-447e-81d0-f5b9eb66518d for this chassis.
Dec 02 10:11:33 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:33Z|00487|binding|INFO|f7cbdc65-9f74-447e-81d0-f5b9eb66518d: Claiming unknown
Dec 02 10:11:33 np0005541913.localdomain systemd-udevd[327903]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:11:33 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7cbdc65-9f: No such device
Dec 02 10:11:33 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7cbdc65-9f: No such device
Dec 02 10:11:33 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7cbdc65-9f: No such device
Dec 02 10:11:33 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:33Z|00488|binding|INFO|Setting lport f7cbdc65-9f74-447e-81d0-f5b9eb66518d ovn-installed in OVS
Dec 02 10:11:33 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:33Z|00489|binding|INFO|Setting lport f7cbdc65-9f74-447e-81d0-f5b9eb66518d up in Southbound
Dec 02 10:11:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:33.064 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-90303572-56ca-4145-bbbf-5a391d217194', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90303572-56ca-4145-bbbf-5a391d217194', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e49e0ab2-43b0-4f40-9f61-82b8dfd7e5a9, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=f7cbdc65-9f74-447e-81d0-f5b9eb66518d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:33.064 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:33.067 160221 INFO neutron.agent.ovn.metadata.agent [-] Port f7cbdc65-9f74-447e-81d0-f5b9eb66518d in datapath 90303572-56ca-4145-bbbf-5a391d217194 bound to our chassis
Dec 02 10:11:33 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7cbdc65-9f: No such device
Dec 02 10:11:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:33.069 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 90303572-56ca-4145-bbbf-5a391d217194 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:11:33 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7cbdc65-9f: No such device
Dec 02 10:11:33 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:33.070 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[b2025a03-1813-4dd6-baae-0ae98086b88e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:33 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7cbdc65-9f: No such device
Dec 02 10:11:33 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7cbdc65-9f: No such device
Dec 02 10:11:33 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf7cbdc65-9f: No such device
Dec 02 10:11:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:33.108 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:33 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:33.131 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:33 np0005541913.localdomain podman[327974]: 2025-12-02 10:11:33.970315982 +0000 UTC m=+0.090783140 container create 1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:34 np0005541913.localdomain systemd[1]: Started libpod-conmon-1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702.scope.
Dec 02 10:11:34 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:11:34 np0005541913.localdomain podman[327974]: 2025-12-02 10:11:33.927284135 +0000 UTC m=+0.047751313 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:11:34 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d2f932238cf32e54e614fd98285fb06ac8759ba677ce563d29c2fc806ecc63b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:11:34 np0005541913.localdomain podman[327974]: 2025-12-02 10:11:34.043339088 +0000 UTC m=+0.163806236 container init 1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:11:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:11:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:11:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:11:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:11:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:11:34 np0005541913.localdomain dnsmasq[327992]: started, version 2.85 cachesize 150
Dec 02 10:11:34 np0005541913.localdomain dnsmasq[327992]: DNS service limited to local subnets
Dec 02 10:11:34 np0005541913.localdomain dnsmasq[327992]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:11:34 np0005541913.localdomain dnsmasq[327992]: warning: no upstream servers configured
Dec 02 10:11:34 np0005541913.localdomain dnsmasq-dhcp[327992]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:11:34 np0005541913.localdomain podman[327974]: 2025-12-02 10:11:34.06369673 +0000 UTC m=+0.184163888 container start 1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:34 np0005541913.localdomain dnsmasq[327992]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/addn_hosts - 0 addresses
Dec 02 10:11:34 np0005541913.localdomain dnsmasq-dhcp[327992]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/host
Dec 02 10:11:34 np0005541913.localdomain dnsmasq-dhcp[327992]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/opts
Dec 02 10:11:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:34.583 263406 INFO neutron.agent.dhcp.agent [None req-6e9e48ae-6429-421e-a3d4-49ee2f59c72c - - - - - -] DHCP configuration for ports {'f4ccb325-8298-4165-bdc6-b71985047423'} is completed
Dec 02 10:11:34 np0005541913.localdomain ceph-mon[298296]: pgmap v431: 177 pgs: 177 active+clean; 150 MiB data, 911 MiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 32 KiB/s wr, 4 op/s
Dec 02 10:11:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:11:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:11:34 np0005541913.localdomain dnsmasq[327992]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/addn_hosts - 0 addresses
Dec 02 10:11:34 np0005541913.localdomain dnsmasq-dhcp[327992]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/host
Dec 02 10:11:34 np0005541913.localdomain dnsmasq-dhcp[327992]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/opts
Dec 02 10:11:34 np0005541913.localdomain podman[328010]: 2025-12-02 10:11:34.787101589 +0000 UTC m=+0.058599633 container kill 1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:11:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:11:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:34.873 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:34 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:34.875 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:11:34 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:34.908 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:34 np0005541913.localdomain podman[328026]: 2025-12-02 10:11:34.985407603 +0000 UTC m=+0.116027213 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:11:34 np0005541913.localdomain podman[328026]: 2025-12-02 10:11:34.999825758 +0000 UTC m=+0.130445398 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:35 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:11:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:35.114 263406 INFO neutron.agent.dhcp.agent [None req-3844f544-b253-4fa1-b91b-f1e84edc2253 - - - - - -] DHCP configuration for ports {'f4ccb325-8298-4165-bdc6-b71985047423', 'f7cbdc65-9f74-447e-81d0-f5b9eb66518d'} is completed
Dec 02 10:11:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:11:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:11:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:11:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:11:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 163479 "" "Go-http-client/1.1"
Dec 02 10:11:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:11:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 21159 "" "Go-http-client/1.1"
Dec 02 10:11:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:36.151 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:36.155 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:36 np0005541913.localdomain ceph-mon[298296]: pgmap v432: 177 pgs: 177 active+clean; 150 MiB data, 911 MiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 32 KiB/s wr, 4 op/s
Dec 02 10:11:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:36.691 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port b8488798-9f51-452b-aaff-d402048f4009 with type ""
Dec 02 10:11:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:36.693 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-90303572-56ca-4145-bbbf-5a391d217194', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90303572-56ca-4145-bbbf-5a391d217194', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e49e0ab2-43b0-4f40-9f61-82b8dfd7e5a9, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=f7cbdc65-9f74-447e-81d0-f5b9eb66518d) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:36Z|00490|binding|INFO|Removing iface tapf7cbdc65-9f ovn-installed in OVS
Dec 02 10:11:36 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:36Z|00491|binding|INFO|Removing lport f7cbdc65-9f74-447e-81d0-f5b9eb66518d ovn-installed in OVS
Dec 02 10:11:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:36.694 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:36.701 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:36.701 160221 INFO neutron.agent.ovn.metadata.agent [-] Port f7cbdc65-9f74-447e-81d0-f5b9eb66518d in datapath 90303572-56ca-4145-bbbf-5a391d217194 unbound from our chassis
Dec 02 10:11:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:36.704 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90303572-56ca-4145-bbbf-5a391d217194, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:11:36 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:36.705 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[c54158fd-2bd6-496b-9121-86970b51f208]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:36 np0005541913.localdomain podman[328064]: 2025-12-02 10:11:36.835122737 +0000 UTC m=+0.063887174 container kill 1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:11:36 np0005541913.localdomain dnsmasq[327992]: exiting on receipt of SIGTERM
Dec 02 10:11:36 np0005541913.localdomain systemd[1]: libpod-1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702.scope: Deactivated successfully.
Dec 02 10:11:36 np0005541913.localdomain podman[328077]: 2025-12-02 10:11:36.883171607 +0000 UTC m=+0.035493637 container died 1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:11:36 np0005541913.localdomain podman[328077]: 2025-12-02 10:11:36.972156179 +0000 UTC m=+0.124478169 container cleanup 1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:11:36 np0005541913.localdomain systemd[1]: libpod-conmon-1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702.scope: Deactivated successfully.
Dec 02 10:11:36 np0005541913.localdomain podman[328079]: 2025-12-02 10:11:36.992969643 +0000 UTC m=+0.135175283 container remove 1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:11:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:37.009 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:37 np0005541913.localdomain kernel: device tapf7cbdc65-9f left promiscuous mode
Dec 02 10:11:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:37.022 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:37Z|00492|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:11:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:37.045 263406 INFO neutron.agent.dhcp.agent [None req-2ad2cb68-84bf-4287-b213-7d22078cddbd - - - - - -] Synchronizing state
Dec 02 10:11:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:37.110 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:37.302 263406 INFO neutron.agent.dhcp.agent [None req-ac6c4a74-9018-4989-9341-cd6278919b9d - - - - - -] All active networks have been fetched through RPC.
Dec 02 10:11:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:37.304 263406 INFO neutron.agent.dhcp.agent [-] Starting network 90303572-56ca-4145-bbbf-5a391d217194 dhcp configuration
Dec 02 10:11:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:11:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:11:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:11:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 02 10:11:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6d2f932238cf32e54e614fd98285fb06ac8759ba677ce563d29c2fc806ecc63b-merged.mount: Deactivated successfully.
Dec 02 10:11:37 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e20edf189cb064e5f2c290de7e1debaf866f456a33a65f179955b67eb930702-userdata-shm.mount: Deactivated successfully.
Dec 02 10:11:37 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d90303572\x2d56ca\x2d4145\x2dbbbf\x2d5a391d217194.mount: Deactivated successfully.
Dec 02 10:11:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:37.876 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:11:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:38.061 263406 INFO neutron.agent.linux.ip_lib [None req-cd09deba-86fb-4071-8f27-829712929f3f - - - - - -] Device tap5ea828b9-62 cannot be used as it has no MAC address
Dec 02 10:11:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:38.080 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:38 np0005541913.localdomain kernel: device tap5ea828b9-62 entered promiscuous mode
Dec 02 10:11:38 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670298.0900] manager: (tap5ea828b9-62): new Generic device (/org/freedesktop/NetworkManager/Devices/79)
Dec 02 10:11:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:38Z|00493|binding|INFO|Claiming lport 5ea828b9-628f-47b3-a4f4-720d9ef822f9 for this chassis.
Dec 02 10:11:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:38Z|00494|binding|INFO|5ea828b9-628f-47b3-a4f4-720d9ef822f9: Claiming unknown
Dec 02 10:11:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:38.090 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:38 np0005541913.localdomain systemd-udevd[328118]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:11:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:38.101 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-90303572-56ca-4145-bbbf-5a391d217194', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90303572-56ca-4145-bbbf-5a391d217194', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e49e0ab2-43b0-4f40-9f61-82b8dfd7e5a9, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=1, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=5ea828b9-628f-47b3-a4f4-720d9ef822f9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:38.102 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 5ea828b9-628f-47b3-a4f4-720d9ef822f9 in datapath 90303572-56ca-4145-bbbf-5a391d217194 bound to our chassis
Dec 02 10:11:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:38.103 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 90303572-56ca-4145-bbbf-5a391d217194 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:11:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:38.106 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[eb942bc5-5fca-41a4-8bf2-1968850d97ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5ea828b9-62: No such device
Dec 02 10:11:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:38Z|00495|binding|INFO|Setting lport 5ea828b9-628f-47b3-a4f4-720d9ef822f9 ovn-installed in OVS
Dec 02 10:11:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:38Z|00496|binding|INFO|Setting lport 5ea828b9-628f-47b3-a4f4-720d9ef822f9 up in Southbound
Dec 02 10:11:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5ea828b9-62: No such device
Dec 02 10:11:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5ea828b9-62: No such device
Dec 02 10:11:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5ea828b9-62: No such device
Dec 02 10:11:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5ea828b9-62: No such device
Dec 02 10:11:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5ea828b9-62: No such device
Dec 02 10:11:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5ea828b9-62: No such device
Dec 02 10:11:38 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5ea828b9-62: No such device
Dec 02 10:11:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:38.169 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:38.191 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:11:38 np0005541913.localdomain ceph-mon[298296]: pgmap v433: 177 pgs: 177 active+clean; 150 MiB data, 911 MiB used, 41 GiB / 42 GiB avail; 96 B/s rd, 30 KiB/s wr, 4 op/s
Dec 02 10:11:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:11:38 np0005541913.localdomain podman[328189]: 2025-12-02 10:11:38.995198211 +0000 UTC m=+0.092139007 container create c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:39 np0005541913.localdomain systemd[1]: Started libpod-conmon-c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136.scope.
Dec 02 10:11:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:11:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:11:39 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:11:39 np0005541913.localdomain podman[328189]: 2025-12-02 10:11:38.953251573 +0000 UTC m=+0.050192409 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:11:39 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b08ea60918ddc50749a8d9387348c0bc4f40c098c37c29fe4981d5d7b2eca14f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:11:39 np0005541913.localdomain podman[328189]: 2025-12-02 10:11:39.062504055 +0000 UTC m=+0.159444811 container init c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 10:11:39 np0005541913.localdomain podman[328189]: 2025-12-02 10:11:39.069878751 +0000 UTC m=+0.166819507 container start c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:11:39 np0005541913.localdomain dnsmasq[328223]: started, version 2.85 cachesize 150
Dec 02 10:11:39 np0005541913.localdomain dnsmasq[328223]: DNS service limited to local subnets
Dec 02 10:11:39 np0005541913.localdomain dnsmasq[328223]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:11:39 np0005541913.localdomain dnsmasq[328223]: warning: no upstream servers configured
Dec 02 10:11:39 np0005541913.localdomain dnsmasq-dhcp[328223]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:11:39 np0005541913.localdomain dnsmasq[328223]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/addn_hosts - 0 addresses
Dec 02 10:11:39 np0005541913.localdomain dnsmasq-dhcp[328223]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/host
Dec 02 10:11:39 np0005541913.localdomain dnsmasq-dhcp[328223]: read /var/lib/neutron/dhcp/90303572-56ca-4145-bbbf-5a391d217194/opts
Dec 02 10:11:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:39.110 263406 INFO neutron.agent.dhcp.agent [None req-cd09deba-86fb-4071-8f27-829712929f3f - - - - - -] Finished network 90303572-56ca-4145-bbbf-5a391d217194 dhcp configuration
Dec 02 10:11:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:39.110 263406 INFO neutron.agent.dhcp.agent [None req-ac6c4a74-9018-4989-9341-cd6278919b9d - - - - - -] Synchronizing state complete
Dec 02 10:11:39 np0005541913.localdomain podman[328206]: 2025-12-02 10:11:39.131702529 +0000 UTC m=+0.077961799 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:11:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:39.139 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:39 np0005541913.localdomain kernel: device tap5ea828b9-62 left promiscuous mode
Dec 02 10:11:39 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:39Z|00497|binding|INFO|Releasing lport 5ea828b9-628f-47b3-a4f4-720d9ef822f9 from this chassis (sb_readonly=0)
Dec 02 10:11:39 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:39Z|00498|binding|INFO|Setting lport 5ea828b9-628f-47b3-a4f4-720d9ef822f9 down in Southbound
Dec 02 10:11:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:39.157 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:39 np0005541913.localdomain podman[328205]: 2025-12-02 10:11:39.107124014 +0000 UTC m=+0.058845390 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:11:39 np0005541913.localdomain podman[328206]: 2025-12-02 10:11:39.162929911 +0000 UTC m=+0.109189201 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 02 10:11:39 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:11:39 np0005541913.localdomain podman[328205]: 2025-12-02 10:11:39.192995951 +0000 UTC m=+0.144717377 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:11:39 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:11:39 np0005541913.localdomain dnsmasq[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/addn_hosts - 0 addresses
Dec 02 10:11:39 np0005541913.localdomain dnsmasq-dhcp[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/host
Dec 02 10:11:39 np0005541913.localdomain podman[328274]: 2025-12-02 10:11:39.23681888 +0000 UTC m=+0.036980247 container kill ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:11:39 np0005541913.localdomain dnsmasq-dhcp[327550]: read /var/lib/neutron/dhcp/b2aacd19-6fe6-44f4-8d3d-5e657d287b5b/opts
Dec 02 10:11:39 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:39Z|00499|binding|INFO|Releasing lport a910a553-85b1-4284-b87b-d67a0455f7a3 from this chassis (sb_readonly=1)
Dec 02 10:11:39 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:39Z|00500|if_status|INFO|Not setting lport a910a553-85b1-4284-b87b-d67a0455f7a3 down as sb is readonly
Dec 02 10:11:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:39.408 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:39 np0005541913.localdomain kernel: device tapa910a553-85 left promiscuous mode
Dec 02 10:11:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:39.436 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:40 np0005541913.localdomain systemd[1]: tmp-crun.zK1s6Y.mount: Deactivated successfully.
Dec 02 10:11:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:40Z|00501|binding|INFO|Setting lport a910a553-85b1-4284-b87b-d67a0455f7a3 down in Southbound
Dec 02 10:11:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:40.252 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-90303572-56ca-4145-bbbf-5a391d217194', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90303572-56ca-4145-bbbf-5a391d217194', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e49e0ab2-43b0-4f40-9f61-82b8dfd7e5a9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=5ea828b9-628f-47b3-a4f4-720d9ef822f9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:40.254 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 5ea828b9-628f-47b3-a4f4-720d9ef822f9 in datapath 90303572-56ca-4145-bbbf-5a391d217194 unbound from our chassis
Dec 02 10:11:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:40.255 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 90303572-56ca-4145-bbbf-5a391d217194 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:11:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:40.257 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[482a6bbb-9e98-42f7-b7ba-8ba009701658]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:40.262 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8eea084241c14c5d9a6cc0d912041a21', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30cf73fc-9798-4d84-a408-4d3ceadffb42, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=a910a553-85b1-4284-b87b-d67a0455f7a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:40.264 160221 INFO neutron.agent.ovn.metadata.agent [-] Port a910a553-85b1-4284-b87b-d67a0455f7a3 in datapath b2aacd19-6fe6-44f4-8d3d-5e657d287b5b unbound from our chassis
Dec 02 10:11:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:40.266 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b2aacd19-6fe6-44f4-8d3d-5e657d287b5b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:11:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:40.266 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[6a37eb67-06dd-42e8-9ba7-673a445404e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:40 np0005541913.localdomain podman[328325]: 2025-12-02 10:11:40.428833686 +0000 UTC m=+0.048375531 container kill 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:40 np0005541913.localdomain systemd[1]: tmp-crun.6tAxkG.mount: Deactivated successfully.
Dec 02 10:11:40 np0005541913.localdomain dnsmasq[326766]: exiting on receipt of SIGTERM
Dec 02 10:11:40 np0005541913.localdomain systemd[1]: libpod-0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223.scope: Deactivated successfully.
Dec 02 10:11:40 np0005541913.localdomain dnsmasq[328223]: exiting on receipt of SIGTERM
Dec 02 10:11:40 np0005541913.localdomain podman[328327]: 2025-12-02 10:11:40.487293433 +0000 UTC m=+0.102965564 container kill c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:11:40 np0005541913.localdomain systemd[1]: libpod-c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136.scope: Deactivated successfully.
Dec 02 10:11:40 np0005541913.localdomain podman[328351]: 2025-12-02 10:11:40.495201304 +0000 UTC m=+0.046406127 container died 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:11:40 np0005541913.localdomain podman[328351]: 2025-12-02 10:11:40.537643545 +0000 UTC m=+0.088848338 container remove 0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0563b4b4-439a-4655-9225-28a24ad09db2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:11:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:40.586 263406 INFO neutron.agent.dhcp.agent [None req-fc3da859-f30f-424d-bef1-e46124cbf810 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:40.586 263406 INFO neutron.agent.dhcp.agent [None req-fc3da859-f30f-424d-bef1-e46124cbf810 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:40 np0005541913.localdomain podman[328377]: 2025-12-02 10:11:40.602796682 +0000 UTC m=+0.103201641 container died c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:11:40 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 02 10:11:40 np0005541913.localdomain dnsmasq[327550]: exiting on receipt of SIGTERM
Dec 02 10:11:40 np0005541913.localdomain podman[328413]: 2025-12-02 10:11:40.629832602 +0000 UTC m=+0.059421055 container kill ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:40 np0005541913.localdomain systemd[1]: libpod-ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647.scope: Deactivated successfully.
Dec 02 10:11:40 np0005541913.localdomain podman[328377]: 2025-12-02 10:11:40.644085662 +0000 UTC m=+0.144490581 container cleanup c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:11:40 np0005541913.localdomain systemd[1]: libpod-conmon-c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136.scope: Deactivated successfully.
Dec 02 10:11:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:40Z|00502|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:11:40 np0005541913.localdomain ceph-mon[298296]: pgmap v434: 177 pgs: 177 active+clean; 150 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 48 KiB/s wr, 30 op/s
Dec 02 10:11:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:11:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:11:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:40.677 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:40 np0005541913.localdomain podman[328381]: 2025-12-02 10:11:40.684380076 +0000 UTC m=+0.177824490 container remove c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-90303572-56ca-4145-bbbf-5a391d217194, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:11:40 np0005541913.localdomain podman[328430]: 2025-12-02 10:11:40.707684527 +0000 UTC m=+0.060221916 container died ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:11:40 np0005541913.localdomain podman[328430]: 2025-12-02 10:11:40.734515692 +0000 UTC m=+0.087053021 container cleanup ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:11:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:40.738 263406 INFO neutron.agent.dhcp.agent [None req-521d5c3a-4337-42a7-9ed9-dc9cd6428df5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:40 np0005541913.localdomain systemd[1]: libpod-conmon-ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647.scope: Deactivated successfully.
Dec 02 10:11:40 np0005541913.localdomain systemd[1]: libpod-conmon-0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223.scope: Deactivated successfully.
Dec 02 10:11:40 np0005541913.localdomain podman[328434]: 2025-12-02 10:11:40.791534151 +0000 UTC m=+0.123942013 container remove ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2aacd19-6fe6-44f4-8d3d-5e657d287b5b, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: tmp-crun.3UFmj3.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b08ea60918ddc50749a8d9387348c0bc4f40c098c37c29fe4981d5d7b2eca14f-merged.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c84887807090946374c8c196bfea5897079f0dee69d8cef578b2842bfe596136-userdata-shm.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d90303572\x2d56ca\x2d4145\x2dbbbf\x2d5a391d217194.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-c1165e8d7b2376bb929e0428192dfe49a187f4a8ad512dcdd1d564a09da96bcf-merged.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab9ff839cf1c09d4901c4f131e170b0874aba048a2e4983b6a6f180e2dc8c647-userdata-shm.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-78a8e6dcaae55003fe5ded5cdde675da092aada4d48b494b157ce714a6ab4dbb-merged.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0dce264616a8006789140bd8975bcae7ca02c0845b177dcf80b742142273a223-userdata-shm.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d0563b4b4\x2d439a\x2d4655\x2d9225\x2d28a24ad09db2.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:41.155 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:41.219 263406 INFO neutron.agent.dhcp.agent [None req-4b8070dc-60eb-4c17-b6b4-b747df66d33a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:41 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2db2aacd19\x2d6fe6\x2d44f4\x2d8d3d\x2d5e657d287b5b.mount: Deactivated successfully.
Dec 02 10:11:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:41 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:11:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e177 e177: 6 total, 6 up, 6 in
Dec 02 10:11:42 np0005541913.localdomain ceph-mon[298296]: pgmap v435: 177 pgs: 177 active+clean; 150 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 36 KiB/s wr, 27 op/s
Dec 02 10:11:42 np0005541913.localdomain ceph-mon[298296]: osdmap e177: 6 total, 6 up, 6 in
Dec 02 10:11:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:42.771 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:43 np0005541913.localdomain ceph-mon[298296]: pgmap v437: 177 pgs: 177 active+clean; 196 MiB data, 982 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.2 MiB/s wr, 89 op/s
Dec 02 10:11:44 np0005541913.localdomain podman[328484]: 2025-12-02 10:11:44.702234967 +0000 UTC m=+0.069372061 container kill 429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:11:44 np0005541913.localdomain dnsmasq[327200]: exiting on receipt of SIGTERM
Dec 02 10:11:44 np0005541913.localdomain systemd[1]: libpod-429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b.scope: Deactivated successfully.
Dec 02 10:11:44 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1144789831' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:44 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1144789831' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:44 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3521941269' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:11:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:11:44 np0005541913.localdomain podman[328498]: 2025-12-02 10:11:44.782369923 +0000 UTC m=+0.062528508 container died 429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:11:44 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b-userdata-shm.mount: Deactivated successfully.
Dec 02 10:11:44 np0005541913.localdomain podman[328498]: 2025-12-02 10:11:44.812260299 +0000 UTC m=+0.092418854 container cleanup 429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:44 np0005541913.localdomain systemd[1]: libpod-conmon-429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b.scope: Deactivated successfully.
Dec 02 10:11:44 np0005541913.localdomain podman[328500]: 2025-12-02 10:11:44.862324154 +0000 UTC m=+0.133839758 container remove 429fb6547ec1b5fd9ceffa4ef49e3d7aa860a5db27e9d9241621a6967c28233b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:45 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0b03b76d06f065d064f053ee52b9bc33a267b704dd60e482f157855766fd952e-merged.mount: Deactivated successfully.
Dec 02 10:11:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:11:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:11:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:11:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 02 10:11:45 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:11:45 np0005541913.localdomain ceph-mon[298296]: pgmap v438: 177 pgs: 177 active+clean; 196 MiB data, 982 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.2 MiB/s wr, 89 op/s
Dec 02 10:11:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e178 e178: 6 total, 6 up, 6 in
Dec 02 10:11:45 np0005541913.localdomain podman[328576]: 
Dec 02 10:11:45 np0005541913.localdomain podman[328576]: 2025-12-02 10:11:45.834032909 +0000 UTC m=+0.110999930 container create 9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:11:45 np0005541913.localdomain podman[328576]: 2025-12-02 10:11:45.781819507 +0000 UTC m=+0.058786568 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:11:45 np0005541913.localdomain systemd[1]: Started libpod-conmon-9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b.scope.
Dec 02 10:11:45 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:11:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9c1c638ce6a7c0f3bc7617e0c55d1a035f83afacf2a942fd033f689340d0b8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:11:45 np0005541913.localdomain podman[328576]: 2025-12-02 10:11:45.933788558 +0000 UTC m=+0.210755569 container init 9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:11:45 np0005541913.localdomain podman[328576]: 2025-12-02 10:11:45.943982269 +0000 UTC m=+0.220949280 container start 9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:11:45 np0005541913.localdomain dnsmasq[328594]: started, version 2.85 cachesize 150
Dec 02 10:11:45 np0005541913.localdomain dnsmasq[328594]: DNS service limited to local subnets
Dec 02 10:11:45 np0005541913.localdomain dnsmasq[328594]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:11:45 np0005541913.localdomain dnsmasq[328594]: warning: no upstream servers configured
Dec 02 10:11:45 np0005541913.localdomain dnsmasq-dhcp[328594]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:11:45 np0005541913.localdomain dnsmasq[328594]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/addn_hosts - 0 addresses
Dec 02 10:11:45 np0005541913.localdomain dnsmasq-dhcp[328594]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/host
Dec 02 10:11:45 np0005541913.localdomain dnsmasq-dhcp[328594]: read /var/lib/neutron/dhcp/d46489f9-a47b-465f-b68c-fdf4256b1786/opts
Dec 02 10:11:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:46.159 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:46 np0005541913.localdomain dnsmasq[328594]: exiting on receipt of SIGTERM
Dec 02 10:11:46 np0005541913.localdomain podman[328612]: 2025-12-02 10:11:46.550525973 +0000 UTC m=+0.059424605 container kill 9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:11:46 np0005541913.localdomain systemd[1]: libpod-9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b.scope: Deactivated successfully.
Dec 02 10:11:46 np0005541913.localdomain podman[328625]: 2025-12-02 10:11:46.628783528 +0000 UTC m=+0.063778191 container died 9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:11:46 np0005541913.localdomain podman[328625]: 2025-12-02 10:11:46.659712692 +0000 UTC m=+0.094707315 container cleanup 9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:46 np0005541913.localdomain systemd[1]: libpod-conmon-9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b.scope: Deactivated successfully.
Dec 02 10:11:46 np0005541913.localdomain systemd[1]: tmp-crun.NNbe5L.mount: Deactivated successfully.
Dec 02 10:11:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-b9c1c638ce6a7c0f3bc7617e0c55d1a035f83afacf2a942fd033f689340d0b8e-merged.mount: Deactivated successfully.
Dec 02 10:11:46 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b-userdata-shm.mount: Deactivated successfully.
Dec 02 10:11:46 np0005541913.localdomain podman[328627]: 2025-12-02 10:11:46.712285904 +0000 UTC m=+0.137279470 container remove 9313fdd7c5777dd2e825836787a1a7760bab48abec3fcde00833cedffe975c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d46489f9-a47b-465f-b68c-fdf4256b1786, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:11:46 np0005541913.localdomain kernel: device tapf3b02d29-75 left promiscuous mode
Dec 02 10:11:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:46.765 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:46 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:46Z|00503|binding|INFO|Releasing lport f3b02d29-7542-44d0-a991-a01ec607868c from this chassis (sb_readonly=0)
Dec 02 10:11:46 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:46Z|00504|binding|INFO|Setting lport f3b02d29-7542-44d0-a991-a01ec607868c down in Southbound
Dec 02 10:11:46 np0005541913.localdomain ceph-mon[298296]: osdmap e178: 6 total, 6 up, 6 in
Dec 02 10:11:46 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1526106310' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:46 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1526106310' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:46.788 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-d46489f9-a47b-465f-b68c-fdf4256b1786', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d46489f9-a47b-465f-b68c-fdf4256b1786', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c09a5d01-8e4d-42c4-b32a-41401f5c5328, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=f3b02d29-7542-44d0-a991-a01ec607868c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:46.790 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:46.791 160221 INFO neutron.agent.ovn.metadata.agent [-] Port f3b02d29-7542-44d0-a991-a01ec607868c in datapath d46489f9-a47b-465f-b68c-fdf4256b1786 unbound from our chassis
Dec 02 10:11:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e179 e179: 6 total, 6 up, 6 in
Dec 02 10:11:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:46.795 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d46489f9-a47b-465f-b68c-fdf4256b1786, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:11:46 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:46.796 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[350b00bf-8bfb-4a68-a032-ba274e817456]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:47.008 263406 INFO neutron.agent.dhcp.agent [None req-ce89e2a6-018b-4a98-97ec-2e364b80374b - - - - - -] DHCP configuration for ports {'7df0cc5b-0d4a-48fd-ae58-4a75ac1c28cb', 'f3b02d29-7542-44d0-a991-a01ec607868c'} is completed
Dec 02 10:11:47 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dd46489f9\x2da47b\x2d465f\x2db68c\x2dfdf4256b1786.mount: Deactivated successfully.
Dec 02 10:11:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:47.013 263406 INFO neutron.agent.dhcp.agent [None req-9bf69d65-f59c-4af9-9592-f185d6d7ae61 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:47 np0005541913.localdomain ceph-mon[298296]: osdmap e179: 6 total, 6 up, 6 in
Dec 02 10:11:47 np0005541913.localdomain ceph-mon[298296]: pgmap v441: 177 pgs: 177 active+clean; 196 MiB data, 982 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 3.6 MiB/s wr, 97 op/s
Dec 02 10:11:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:48.499 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:48 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:11:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:11:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:11:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:49Z|00505|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:11:49 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:49.089 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:49 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e180 e180: 6 total, 6 up, 6 in
Dec 02 10:11:50 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:11:50.334 2 INFO neutron.agent.securitygroups_rpc [None req-c7a943f3-fab0-4b36-9210-2f6cba57e1de defcf0debbf84a5c9ec6342ae3d02928 8eea084241c14c5d9a6cc0d912041a21 - - default default] Security group member updated ['712bb249-1109-4289-a9cf-1e3d3f6e301e']
Dec 02 10:11:50 np0005541913.localdomain ceph-mon[298296]: pgmap v442: 177 pgs: 177 active+clean; 196 MiB data, 982 MiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 2.9 MiB/s wr, 131 op/s
Dec 02 10:11:50 np0005541913.localdomain ceph-mon[298296]: osdmap e180: 6 total, 6 up, 6 in
Dec 02 10:11:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:51.161 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:51.164 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:11:52 np0005541913.localdomain podman[328657]: 2025-12-02 10:11:52.46477202 +0000 UTC m=+0.105949955 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:11:52 np0005541913.localdomain dnsmasq[326471]: exiting on receipt of SIGTERM
Dec 02 10:11:52 np0005541913.localdomain podman[328689]: 2025-12-02 10:11:52.562126364 +0000 UTC m=+0.065770544 container kill 5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ba9b74ca-c826-47d9-9b2c-806aa0652611, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:11:52 np0005541913.localdomain systemd[1]: libpod-5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf.scope: Deactivated successfully.
Dec 02 10:11:52 np0005541913.localdomain ceph-mon[298296]: pgmap v444: 177 pgs: 177 active+clean; 196 MiB data, 982 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 24 KiB/s wr, 67 op/s
Dec 02 10:11:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4069706698' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4069706698' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:52 np0005541913.localdomain podman[328657]: 2025-12-02 10:11:52.59952454 +0000 UTC m=+0.240702475 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Dec 02 10:11:52 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:11:52 np0005541913.localdomain podman[328701]: 2025-12-02 10:11:52.66330644 +0000 UTC m=+0.079821558 container died 5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ba9b74ca-c826-47d9-9b2c-806aa0652611, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:11:52 np0005541913.localdomain systemd[1]: tmp-crun.wLVBru.mount: Deactivated successfully.
Dec 02 10:11:52 np0005541913.localdomain podman[328701]: 2025-12-02 10:11:52.698751745 +0000 UTC m=+0.115266763 container cleanup 5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ba9b74ca-c826-47d9-9b2c-806aa0652611, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:11:52 np0005541913.localdomain systemd[1]: libpod-conmon-5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf.scope: Deactivated successfully.
Dec 02 10:11:52 np0005541913.localdomain podman[328703]: 2025-12-02 10:11:52.738839823 +0000 UTC m=+0.146727131 container remove 5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ba9b74ca-c826-47d9-9b2c-806aa0652611, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:11:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:52Z|00506|binding|INFO|Releasing lport 6305f6b8-f6d1-42c8-8da0-74c67d8b4998 from this chassis (sb_readonly=0)
Dec 02 10:11:52 np0005541913.localdomain kernel: device tap6305f6b8-f6 left promiscuous mode
Dec 02 10:11:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:52.752 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:52Z|00507|binding|INFO|Setting lport 6305f6b8-f6d1-42c8-8da0-74c67d8b4998 down in Southbound
Dec 02 10:11:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:52.777 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-846af1f535196c2b1f3fa28965eaa27fb6ef8e77dfc94da176dafd06a1387634-merged.mount: Deactivated successfully.
Dec 02 10:11:53 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b8d876561dfa8754024d0e9b3073d0fb7a95f16e176162883c7000375290bcf-userdata-shm.mount: Deactivated successfully.
Dec 02 10:11:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:53.529 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-ba9b74ca-c826-47d9-9b2c-806aa0652611', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba9b74ca-c826-47d9-9b2c-806aa0652611', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f4ef6ddb6546fbb800184721e43e93', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae8275bd-608b-4d44-bec9-32778c15dfb9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=6305f6b8-f6d1-42c8-8da0-74c67d8b4998) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:53.531 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 6305f6b8-f6d1-42c8-8da0-74c67d8b4998 in datapath ba9b74ca-c826-47d9-9b2c-806aa0652611 unbound from our chassis
Dec 02 10:11:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:53.532 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ba9b74ca-c826-47d9-9b2c-806aa0652611 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:11:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:53.533 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8a3a47-6193-4ffb-a95c-5946d5589382]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:53 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dba9b74ca\x2dc826\x2d47d9\x2d9b2c\x2d806aa0652611.mount: Deactivated successfully.
Dec 02 10:11:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:53.595 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:11:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 02 10:11:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:11:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 02 10:11:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 02 10:11:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice", "format": "json"}]: dispatch
Dec 02 10:11:53 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/283666617' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:11:53 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:53.600 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:54.564 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:11:54 np0005541913.localdomain ceph-mon[298296]: pgmap v445: 177 pgs: 177 active+clean; 197 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 89 KiB/s rd, 55 KiB/s wr, 129 op/s
Dec 02 10:11:54 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2457589825' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:54 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2457589825' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e181 e181: 6 total, 6 up, 6 in
Dec 02 10:11:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:54.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:11:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:11:55 np0005541913.localdomain podman[328731]: 2025-12-02 10:11:55.435098655 +0000 UTC m=+0.073400767 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 02 10:11:55 np0005541913.localdomain podman[328731]: 2025-12-02 10:11:55.439979395 +0000 UTC m=+0.078281437 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 02 10:11:55 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:11:55 np0005541913.localdomain ceph-mon[298296]: osdmap e181: 6 total, 6 up, 6 in
Dec 02 10:11:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:56.164 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 02 10:11:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:11:56 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:11:56 np0005541913.localdomain podman[328750]: 2025-12-02 10:11:56.445972143 +0000 UTC m=+0.082665953 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:11:56 np0005541913.localdomain podman[328750]: 2025-12-02 10:11:56.465008301 +0000 UTC m=+0.101702051 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, architecture=x86_64)
Dec 02 10:11:56 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:11:56 np0005541913.localdomain podman[328751]: 2025-12-02 10:11:56.552644096 +0000 UTC m=+0.184899648 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:11:56 np0005541913.localdomain podman[328751]: 2025-12-02 10:11:56.591092261 +0000 UTC m=+0.223347813 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:11:56 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:11:56 np0005541913.localdomain ceph-mon[298296]: pgmap v447: 177 pgs: 177 active+clean; 197 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 82 KiB/s rd, 51 KiB/s wr, 119 op/s
Dec 02 10:11:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:57Z|00508|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:11:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:57.098 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:57 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:57.434 263406 INFO neutron.agent.linux.ip_lib [None req-35d80d04-9627-4ced-9df7-fd9c39c88bcb - - - - - -] Device tap806e97f6-df cannot be used as it has no MAC address
Dec 02 10:11:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:57.464 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:57 np0005541913.localdomain kernel: device tap806e97f6-df entered promiscuous mode
Dec 02 10:11:57 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670317.4753] manager: (tap806e97f6-df): new Generic device (/org/freedesktop/NetworkManager/Devices/80)
Dec 02 10:11:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:57.477 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:57Z|00509|binding|INFO|Claiming lport 806e97f6-df70-4fff-953f-74b4c641ea7b for this chassis.
Dec 02 10:11:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:57Z|00510|binding|INFO|806e97f6-df70-4fff-953f-74b4c641ea7b: Claiming unknown
Dec 02 10:11:57 np0005541913.localdomain systemd-udevd[328802]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:11:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:57.499 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-e6b81515-1a91-47bb-810b-f820ca0caeff', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6b81515-1a91-47bb-810b-f820ca0caeff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043cc6f66b444d00959c7dcdb078fbe8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f863ae6-a602-4080-9383-f6b709828279, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=806e97f6-df70-4fff-953f-74b4c641ea7b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:11:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:57.501 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 806e97f6-df70-4fff-953f-74b4c641ea7b in datapath e6b81515-1a91-47bb-810b-f820ca0caeff bound to our chassis
Dec 02 10:11:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:57.504 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8af3668a-fa20-4b45-b40c-82550ac031c7 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:11:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:57.504 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e6b81515-1a91-47bb-810b-f820ca0caeff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:11:57 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:11:57.505 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[83ee6d7a-d8a1-470e-b025-0d8ff1a5eefa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:11:57 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap806e97f6-df: No such device
Dec 02 10:11:57 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap806e97f6-df: No such device
Dec 02 10:11:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:57Z|00511|binding|INFO|Setting lport 806e97f6-df70-4fff-953f-74b4c641ea7b ovn-installed in OVS
Dec 02 10:11:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:11:57Z|00512|binding|INFO|Setting lport 806e97f6-df70-4fff-953f-74b4c641ea7b up in Southbound
Dec 02 10:11:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:57.527 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:57 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap806e97f6-df: No such device
Dec 02 10:11:57 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap806e97f6-df: No such device
Dec 02 10:11:57 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap806e97f6-df: No such device
Dec 02 10:11:57 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap806e97f6-df: No such device
Dec 02 10:11:57 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap806e97f6-df: No such device
Dec 02 10:11:57 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap806e97f6-df: No such device
Dec 02 10:11:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:57.572 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:57.613 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:11:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:11:57 np0005541913.localdomain ceph-mon[298296]: pgmap v448: 177 pgs: 177 active+clean; 197 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 33 KiB/s wr, 68 op/s
Dec 02 10:11:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:11:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:11:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:11:58 np0005541913.localdomain podman[328873]: 
Dec 02 10:11:58 np0005541913.localdomain podman[328873]: 2025-12-02 10:11:58.576734886 +0000 UTC m=+0.083102146 container create b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:11:58 np0005541913.localdomain podman[328873]: 2025-12-02 10:11:58.526426025 +0000 UTC m=+0.032793305 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:11:58 np0005541913.localdomain systemd[1]: Started libpod-conmon-b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a.scope.
Dec 02 10:11:58 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:11:58 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbf59d4ae25454728b26aa2ee5b303a57d6e9d975e82bd83d233416387dbda34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:11:58 np0005541913.localdomain podman[328873]: 2025-12-02 10:11:58.715525834 +0000 UTC m=+0.221893084 container init b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:11:58 np0005541913.localdomain systemd[1]: tmp-crun.D0jU4k.mount: Deactivated successfully.
Dec 02 10:11:58 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e182 e182: 6 total, 6 up, 6 in
Dec 02 10:11:58 np0005541913.localdomain podman[328873]: 2025-12-02 10:11:58.732479976 +0000 UTC m=+0.238847226 container start b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:11:58 np0005541913.localdomain dnsmasq[328891]: started, version 2.85 cachesize 150
Dec 02 10:11:58 np0005541913.localdomain dnsmasq[328891]: DNS service limited to local subnets
Dec 02 10:11:58 np0005541913.localdomain dnsmasq[328891]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:11:58 np0005541913.localdomain dnsmasq[328891]: warning: no upstream servers configured
Dec 02 10:11:58 np0005541913.localdomain dnsmasq-dhcp[328891]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:11:58 np0005541913.localdomain dnsmasq[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/addn_hosts - 0 addresses
Dec 02 10:11:58 np0005541913.localdomain dnsmasq-dhcp[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/host
Dec 02 10:11:58 np0005541913.localdomain dnsmasq-dhcp[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/opts
Dec 02 10:11:58 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:11:58.952 263406 INFO neutron.agent.dhcp.agent [None req-6c230f37-30a9-46b5-b999-1e0f714361e6 - - - - - -] DHCP configuration for ports {'8c152302-afea-42ec-bf7f-b738e9bcaab0'} is completed
Dec 02 10:11:59 np0005541913.localdomain ceph-mon[298296]: osdmap e182: 6 total, 6 up, 6 in
Dec 02 10:11:59 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2303854732' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:59 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2303854732' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:59 np0005541913.localdomain ceph-mon[298296]: pgmap v450: 177 pgs: 177 active+clean; 197 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 53 KiB/s wr, 127 op/s
Dec 02 10:11:59 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2930831674' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:11:59 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2930831674' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:11:59 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e183 e183: 6 total, 6 up, 6 in
Dec 02 10:11:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:59.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:11:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:11:59.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:12:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:00.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:00.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:12:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:00.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:12:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:00.913 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:12:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:00.914 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:12:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:00.914 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:12:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:00.915 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.003 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:01 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:01.062 2 INFO neutron.agent.securitygroups_rpc [None req-bd3bdc93-0ba6-42a3-9063-ee94eddd1f8f defcf0debbf84a5c9ec6342ae3d02928 8eea084241c14c5d9a6cc0d912041a21 - - default default] Security group member updated ['712bb249-1109-4289-a9cf-1e3d3f6e301e']
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.169 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:01 np0005541913.localdomain ceph-mon[298296]: osdmap e183: 6 total, 6 up, 6 in
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.426 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.455 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.456 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.855 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.855 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.856 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.856 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:12:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:01.857 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:12:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:01.885 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:01Z, description=, device_id=ff168046-1219-4329-be5a-02b35c99fef5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990878ad00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990878a940>], id=bcc45c26-5f02-46a1-869b-9af695c3ec53, ip_allocation=immediate, mac_address=fa:16:3e:5f:8e:4d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:11:46Z, description=, dns_domain=, id=e6b81515-1a91-47bb-810b-f820ca0caeff, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--286242228, port_security_enabled=True, project_id=043cc6f66b444d00959c7dcdb078fbe8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25349, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2719, status=ACTIVE, subnets=['0c28764e-253b-48a2-be4d-b3892f027641'], tags=[], tenant_id=043cc6f66b444d00959c7dcdb078fbe8, updated_at=2025-12-02T10:11:53Z, vlan_transparent=None, network_id=e6b81515-1a91-47bb-810b-f820ca0caeff, port_security_enabled=False, project_id=043cc6f66b444d00959c7dcdb078fbe8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2741, status=DOWN, tags=[], tenant_id=043cc6f66b444d00959c7dcdb078fbe8, updated_at=2025-12-02T10:12:01Z on network e6b81515-1a91-47bb-810b-f820ca0caeff
Dec 02 10:12:02 np0005541913.localdomain dnsmasq[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/addn_hosts - 1 addresses
Dec 02 10:12:02 np0005541913.localdomain dnsmasq-dhcp[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/host
Dec 02 10:12:02 np0005541913.localdomain dnsmasq-dhcp[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/opts
Dec 02 10:12:02 np0005541913.localdomain podman[328929]: 2025-12-02 10:12:02.138669116 +0000 UTC m=+0.067256793 container kill b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: pgmap v452: 177 pgs: 177 active+clean; 197 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 25 KiB/s wr, 72 op/s
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/965134514' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/965134514' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2146629484' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.307 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:12:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e184 e184: 6 total, 6 up, 6 in
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.410 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.413 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:12:02 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:02.427 263406 INFO neutron.agent.dhcp.agent [None req-d45223e6-9c6d-4d82-bf42-af50435ba502 - - - - - -] DHCP configuration for ports {'bcc45c26-5f02-46a1-869b-9af695c3ec53'} is completed
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.649 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.650 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11177MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.651 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.651 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.737 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.737 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.738 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:12:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:02.791 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:12:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:03.055 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:12:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:03.056 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:12:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:03.056 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:12:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:12:03 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1371812197' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:12:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:03.217 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:12:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:03.224 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:12:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:03.243 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:12:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:03.245 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:12:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:03.246 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:12:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2146629484' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:12:03 np0005541913.localdomain ceph-mon[298296]: osdmap e184: 6 total, 6 up, 6 in
Dec 02 10:12:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3335471670' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3335471670' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1371812197' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:12:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:03.319 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:01Z, description=, device_id=ff168046-1219-4329-be5a-02b35c99fef5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087aa640>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087aaac0>], id=bcc45c26-5f02-46a1-869b-9af695c3ec53, ip_allocation=immediate, mac_address=fa:16:3e:5f:8e:4d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:11:46Z, description=, dns_domain=, id=e6b81515-1a91-47bb-810b-f820ca0caeff, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--286242228, port_security_enabled=True, project_id=043cc6f66b444d00959c7dcdb078fbe8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25349, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2719, status=ACTIVE, subnets=['0c28764e-253b-48a2-be4d-b3892f027641'], tags=[], tenant_id=043cc6f66b444d00959c7dcdb078fbe8, updated_at=2025-12-02T10:11:53Z, vlan_transparent=None, network_id=e6b81515-1a91-47bb-810b-f820ca0caeff, port_security_enabled=False, project_id=043cc6f66b444d00959c7dcdb078fbe8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2741, status=DOWN, tags=[], tenant_id=043cc6f66b444d00959c7dcdb078fbe8, updated_at=2025-12-02T10:12:01Z on network e6b81515-1a91-47bb-810b-f820ca0caeff
Dec 02 10:12:03 np0005541913.localdomain systemd[1]: tmp-crun.ke4vUd.mount: Deactivated successfully.
Dec 02 10:12:03 np0005541913.localdomain dnsmasq[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/addn_hosts - 1 addresses
Dec 02 10:12:03 np0005541913.localdomain dnsmasq-dhcp[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/host
Dec 02 10:12:03 np0005541913.localdomain dnsmasq-dhcp[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/opts
Dec 02 10:12:03 np0005541913.localdomain podman[328992]: 2025-12-02 10:12:03.554674292 +0000 UTC m=+0.076773778 container kill b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:03.803 263406 INFO neutron.agent.dhcp.agent [None req-97a713e9-cb8b-40ff-930e-41b9f4bb6013 - - - - - -] DHCP configuration for ports {'bcc45c26-5f02-46a1-869b-9af695c3ec53'} is completed
Dec 02 10:12:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:12:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:12:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:12:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:12:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:12:04 np0005541913.localdomain ceph-mon[298296]: pgmap v454: 177 pgs: 177 active+clean; 197 MiB data, 988 MiB used, 41 GiB / 42 GiB avail; 174 KiB/s rd, 55 KiB/s wr, 239 op/s
Dec 02 10:12:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:12:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:12:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:12:04 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:12:05 np0005541913.localdomain dnsmasq[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/addn_hosts - 0 addresses
Dec 02 10:12:05 np0005541913.localdomain podman[329031]: 2025-12-02 10:12:05.018869701 +0000 UTC m=+0.058097009 container kill b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:12:05 np0005541913.localdomain dnsmasq-dhcp[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/host
Dec 02 10:12:05 np0005541913.localdomain dnsmasq-dhcp[328891]: read /var/lib/neutron/dhcp/e6b81515-1a91-47bb-810b-f820ca0caeff/opts
Dec 02 10:12:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:12:05 np0005541913.localdomain podman[329044]: 2025-12-02 10:12:05.143058111 +0000 UTC m=+0.091526461 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:12:05 np0005541913.localdomain podman[329044]: 2025-12-02 10:12:05.159179641 +0000 UTC m=+0.107647991 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 10:12:05 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:12:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:05Z|00513|binding|INFO|Releasing lport 806e97f6-df70-4fff-953f-74b4c641ea7b from this chassis (sb_readonly=0)
Dec 02 10:12:05 np0005541913.localdomain kernel: device tap806e97f6-df left promiscuous mode
Dec 02 10:12:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:05.241 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:05Z|00514|binding|INFO|Setting lport 806e97f6-df70-4fff-953f-74b4c641ea7b down in Southbound
Dec 02 10:12:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:05.253 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-e6b81515-1a91-47bb-810b-f820ca0caeff', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6b81515-1a91-47bb-810b-f820ca0caeff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043cc6f66b444d00959c7dcdb078fbe8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f863ae6-a602-4080-9383-f6b709828279, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=806e97f6-df70-4fff-953f-74b4c641ea7b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:05.255 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 806e97f6-df70-4fff-953f-74b4c641ea7b in datapath e6b81515-1a91-47bb-810b-f820ca0caeff unbound from our chassis
Dec 02 10:12:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:05.258 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e6b81515-1a91-47bb-810b-f820ca0caeff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:12:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:05.259 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5454b5cd-66a3-447b-b034-af958d37bddd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:05.267 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:12:05 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4271932895' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:12:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1431900446' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1431900446' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4271932895' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:05 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:05.702 263406 INFO neutron.agent.linux.ip_lib [None req-def73ee1-72be-4d46-8540-e13b5f9f097e - - - - - -] Device tapd4923978-01 cannot be used as it has no MAC address
Dec 02 10:12:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:05.757 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:05 np0005541913.localdomain kernel: device tapd4923978-01 entered promiscuous mode
Dec 02 10:12:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:05Z|00515|binding|INFO|Claiming lport d4923978-01a3-404a-b8a3-c2641f58992d for this chassis.
Dec 02 10:12:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:05Z|00516|binding|INFO|d4923978-01a3-404a-b8a3-c2641f58992d: Claiming unknown
Dec 02 10:12:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:05.765 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:05 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670325.7661] manager: (tapd4923978-01): new Generic device (/org/freedesktop/NetworkManager/Devices/81)
Dec 02 10:12:05 np0005541913.localdomain systemd-udevd[329081]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:05.774 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-c6d140fa-e90a-46db-a2ed-8d904415f1fc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6d140fa-e90a-46db-a2ed-8d904415f1fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8eea084241c14c5d9a6cc0d912041a21', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0ae5db5-d26f-4f88-83bb-815d569d8232, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=d4923978-01a3-404a-b8a3-c2641f58992d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:05.776 160221 INFO neutron.agent.ovn.metadata.agent [-] Port d4923978-01a3-404a-b8a3-c2641f58992d in datapath c6d140fa-e90a-46db-a2ed-8d904415f1fc bound to our chassis
Dec 02 10:12:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:05.777 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c6d140fa-e90a-46db-a2ed-8d904415f1fc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:05.778 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[2673e15d-5bb3-42d3-9ad4-71b82cb8d320]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:05 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapd4923978-01: No such device
Dec 02 10:12:05 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapd4923978-01: No such device
Dec 02 10:12:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:05Z|00517|binding|INFO|Setting lport d4923978-01a3-404a-b8a3-c2641f58992d ovn-installed in OVS
Dec 02 10:12:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:05Z|00518|binding|INFO|Setting lport d4923978-01a3-404a-b8a3-c2641f58992d up in Southbound
Dec 02 10:12:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:05.803 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:05 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapd4923978-01: No such device
Dec 02 10:12:05 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapd4923978-01: No such device
Dec 02 10:12:05 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapd4923978-01: No such device
Dec 02 10:12:05 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapd4923978-01: No such device
Dec 02 10:12:05 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapd4923978-01: No such device
Dec 02 10:12:05 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapd4923978-01: No such device
Dec 02 10:12:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:05.842 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:05.870 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:12:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:12:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:06.171 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:12:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156107 "" "Go-http-client/1.1"
Dec 02 10:12:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:12:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19255 "" "Go-http-client/1.1"
Dec 02 10:12:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:06.247 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:06.248 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:06 np0005541913.localdomain ceph-mon[298296]: pgmap v455: 177 pgs: 177 active+clean; 197 MiB data, 988 MiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 26 KiB/s wr, 149 op/s
Dec 02 10:12:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e185 e185: 6 total, 6 up, 6 in
Dec 02 10:12:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e186 e186: 6 total, 6 up, 6 in
Dec 02 10:12:06 np0005541913.localdomain podman[329153]: 2025-12-02 10:12:06.816890606 +0000 UTC m=+0.098446454 container create aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:06.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:06 np0005541913.localdomain systemd[1]: Started libpod-conmon-aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b.scope.
Dec 02 10:12:06 np0005541913.localdomain podman[329153]: 2025-12-02 10:12:06.765735533 +0000 UTC m=+0.047291381 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:06 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:06 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abc8399162bfbb6c24544e9605c7d0a83a32a4fccef1f87ba5e365b5de598b76/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:06 np0005541913.localdomain podman[329153]: 2025-12-02 10:12:06.892132051 +0000 UTC m=+0.173687899 container init aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:12:06 np0005541913.localdomain podman[329153]: 2025-12-02 10:12:06.899674543 +0000 UTC m=+0.181230391 container start aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:06 np0005541913.localdomain dnsmasq[329172]: started, version 2.85 cachesize 150
Dec 02 10:12:06 np0005541913.localdomain dnsmasq[329172]: DNS service limited to local subnets
Dec 02 10:12:06 np0005541913.localdomain dnsmasq[329172]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:06 np0005541913.localdomain dnsmasq[329172]: warning: no upstream servers configured
Dec 02 10:12:06 np0005541913.localdomain dnsmasq-dhcp[329172]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:12:06 np0005541913.localdomain dnsmasq[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/addn_hosts - 0 addresses
Dec 02 10:12:06 np0005541913.localdomain dnsmasq-dhcp[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/host
Dec 02 10:12:06 np0005541913.localdomain dnsmasq-dhcp[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/opts
Dec 02 10:12:06 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:06.957 263406 INFO neutron.agent.dhcp.agent [None req-def73ee1-72be-4d46-8540-e13b5f9f097e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:05Z, description=, device_id=5a90933a-19ff-489b-be1e-b7113aa8ee2e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908976e50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908976100>], id=79296e56-a39b-4325-9fe9-d6f456dbb5b5, ip_allocation=immediate, mac_address=fa:16:3e:d1:81:39, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:03Z, description=, dns_domain=, id=c6d140fa-e90a-46db-a2ed-8d904415f1fc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1076909150, port_security_enabled=True, project_id=8eea084241c14c5d9a6cc0d912041a21, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3091, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2747, status=ACTIVE, subnets=['8a2af494-3790-4481-98e0-c7ff125b2591'], tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:04Z, vlan_transparent=None, network_id=c6d140fa-e90a-46db-a2ed-8d904415f1fc, port_security_enabled=False, project_id=8eea084241c14c5d9a6cc0d912041a21, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2764, status=DOWN, tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:05Z on network c6d140fa-e90a-46db-a2ed-8d904415f1fc
Dec 02 10:12:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:07.109 263406 INFO neutron.agent.dhcp.agent [None req-e95f08ea-8d0f-4693-a61b-5bb1bd69195c - - - - - -] DHCP configuration for ports {'a503dddc-bc64-44f7-af1e-2ea5e1045d5a'} is completed
Dec 02 10:12:07 np0005541913.localdomain dnsmasq[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/addn_hosts - 1 addresses
Dec 02 10:12:07 np0005541913.localdomain dnsmasq-dhcp[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/host
Dec 02 10:12:07 np0005541913.localdomain dnsmasq-dhcp[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/opts
Dec 02 10:12:07 np0005541913.localdomain podman[329190]: 2025-12-02 10:12:07.158634883 +0000 UTC m=+0.070306804 container kill aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:12:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:07.360 263406 INFO neutron.agent.dhcp.agent [None req-def73ee1-72be-4d46-8540-e13b5f9f097e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:05Z, description=, device_id=5a90933a-19ff-489b-be1e-b7113aa8ee2e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990899bf40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990899b160>], id=79296e56-a39b-4325-9fe9-d6f456dbb5b5, ip_allocation=immediate, mac_address=fa:16:3e:d1:81:39, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:03Z, description=, dns_domain=, id=c6d140fa-e90a-46db-a2ed-8d904415f1fc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1076909150, port_security_enabled=True, project_id=8eea084241c14c5d9a6cc0d912041a21, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3091, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2747, status=ACTIVE, subnets=['8a2af494-3790-4481-98e0-c7ff125b2591'], tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:04Z, vlan_transparent=None, network_id=c6d140fa-e90a-46db-a2ed-8d904415f1fc, port_security_enabled=False, project_id=8eea084241c14c5d9a6cc0d912041a21, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2764, status=DOWN, tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:05Z on network c6d140fa-e90a-46db-a2ed-8d904415f1fc
Dec 02 10:12:07 np0005541913.localdomain ceph-mon[298296]: osdmap e185: 6 total, 6 up, 6 in
Dec 02 10:12:07 np0005541913.localdomain ceph-mon[298296]: osdmap e186: 6 total, 6 up, 6 in
Dec 02 10:12:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/200831030' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:12:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 02 10:12:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:12:07 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 02 10:12:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:07.486 263406 INFO neutron.agent.dhcp.agent [None req-869e7b65-daa9-41b1-ad3c-bc573c3a807c - - - - - -] DHCP configuration for ports {'79296e56-a39b-4325-9fe9-d6f456dbb5b5'} is completed
Dec 02 10:12:07 np0005541913.localdomain dnsmasq[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/addn_hosts - 1 addresses
Dec 02 10:12:07 np0005541913.localdomain dnsmasq-dhcp[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/host
Dec 02 10:12:07 np0005541913.localdomain podman[329228]: 2025-12-02 10:12:07.596587405 +0000 UTC m=+0.078094762 container kill aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:12:07 np0005541913.localdomain dnsmasq-dhcp[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/opts
Dec 02 10:12:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e187 e187: 6 total, 6 up, 6 in
Dec 02 10:12:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:07.824 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:07.928 263406 INFO neutron.agent.dhcp.agent [None req-cac44c48-0caf-4c93-82a6-55b58b4e83a1 - - - - - -] DHCP configuration for ports {'79296e56-a39b-4325-9fe9-d6f456dbb5b5'} is completed
Dec 02 10:12:08 np0005541913.localdomain dnsmasq[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/addn_hosts - 0 addresses
Dec 02 10:12:08 np0005541913.localdomain dnsmasq-dhcp[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/host
Dec 02 10:12:08 np0005541913.localdomain podman[329265]: 2025-12-02 10:12:08.431724321 +0000 UTC m=+0.058387527 container kill aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:12:08 np0005541913.localdomain dnsmasq-dhcp[329172]: read /var/lib/neutron/dhcp/c6d140fa-e90a-46db-a2ed-8d904415f1fc/opts
Dec 02 10:12:08 np0005541913.localdomain ceph-mon[298296]: pgmap v458: 177 pgs: 177 active+clean; 197 MiB data, 988 MiB used, 41 GiB / 42 GiB avail; 116 KiB/s rd, 28 KiB/s wr, 160 op/s
Dec 02 10:12:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:12:08 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 02 10:12:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 02 10:12:08 np0005541913.localdomain ceph-mon[298296]: osdmap e187: 6 total, 6 up, 6 in
Dec 02 10:12:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2956956327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:12:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:08Z|00519|binding|INFO|Releasing lport d4923978-01a3-404a-b8a3-c2641f58992d from this chassis (sb_readonly=0)
Dec 02 10:12:08 np0005541913.localdomain kernel: device tapd4923978-01 left promiscuous mode
Dec 02 10:12:08 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:08Z|00520|binding|INFO|Setting lport d4923978-01a3-404a-b8a3-c2641f58992d down in Southbound
Dec 02 10:12:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:08.672 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:08.682 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-c6d140fa-e90a-46db-a2ed-8d904415f1fc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6d140fa-e90a-46db-a2ed-8d904415f1fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8eea084241c14c5d9a6cc0d912041a21', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0ae5db5-d26f-4f88-83bb-815d569d8232, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=d4923978-01a3-404a-b8a3-c2641f58992d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:08.685 160221 INFO neutron.agent.ovn.metadata.agent [-] Port d4923978-01a3-404a-b8a3-c2641f58992d in datapath c6d140fa-e90a-46db-a2ed-8d904415f1fc unbound from our chassis
Dec 02 10:12:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:08.687 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c6d140fa-e90a-46db-a2ed-8d904415f1fc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:08 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:08.688 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[8f1aa5d3-8c1e-46c3-8644-1111ffe3cdd6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:08.694 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:08.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:12:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:12:09 np0005541913.localdomain systemd[1]: tmp-crun.EQvAtW.mount: Deactivated successfully.
Dec 02 10:12:09 np0005541913.localdomain podman[329287]: 2025-12-02 10:12:09.443112544 +0000 UTC m=+0.080542268 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:12:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e188 e188: 6 total, 6 up, 6 in
Dec 02 10:12:09 np0005541913.localdomain podman[329287]: 2025-12-02 10:12:09.484181767 +0000 UTC m=+0.121611471 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:12:09 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:12:09 np0005541913.localdomain podman[329288]: 2025-12-02 10:12:09.564217421 +0000 UTC m=+0.196528319 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:12:09 np0005541913.localdomain podman[329288]: 2025-12-02 10:12:09.604264938 +0000 UTC m=+0.236575846 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:09 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:12:09 np0005541913.localdomain dnsmasq[329172]: exiting on receipt of SIGTERM
Dec 02 10:12:09 np0005541913.localdomain podman[329352]: 2025-12-02 10:12:09.697433331 +0000 UTC m=+0.033877384 container kill aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:09 np0005541913.localdomain systemd[1]: libpod-aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b.scope: Deactivated successfully.
Dec 02 10:12:09 np0005541913.localdomain podman[329366]: 2025-12-02 10:12:09.737992482 +0000 UTC m=+0.032316933 container died aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:12:09 np0005541913.localdomain podman[329366]: 2025-12-02 10:12:09.760278736 +0000 UTC m=+0.054603167 container cleanup aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:12:09 np0005541913.localdomain systemd[1]: libpod-conmon-aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b.scope: Deactivated successfully.
Dec 02 10:12:09 np0005541913.localdomain podman[329368]: 2025-12-02 10:12:09.775284585 +0000 UTC m=+0.064297844 container remove aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6d140fa-e90a-46db-a2ed-8d904415f1fc, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:12:09 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:09.975 263406 INFO neutron.agent.dhcp.agent [None req-e455c356-8328-4ce1-a2e9-c46b0fd1d465 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:09 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:09.975 263406 INFO neutron.agent.dhcp.agent [None req-e455c356-8328-4ce1-a2e9-c46b0fd1d465 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:10 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:10.015 263406 INFO neutron.agent.linux.ip_lib [None req-b53b37c2-a206-4ab9-b0fc-93c49b697de4 - - - - - -] Device tap8b3a7663-ad cannot be used as it has no MAC address
Dec 02 10:12:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:10.039 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:10 np0005541913.localdomain kernel: device tap8b3a7663-ad entered promiscuous mode
Dec 02 10:12:10 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670330.0478] manager: (tap8b3a7663-ad): new Generic device (/org/freedesktop/NetworkManager/Devices/82)
Dec 02 10:12:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:10.048 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:10 np0005541913.localdomain systemd-udevd[329407]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:10Z|00521|binding|INFO|Claiming lport 8b3a7663-ad72-4099-a4de-0fa85d29cfd8 for this chassis.
Dec 02 10:12:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:10Z|00522|binding|INFO|8b3a7663-ad72-4099-a4de-0fa85d29cfd8: Claiming unknown
Dec 02 10:12:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:10.060 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-9a6d986e-0caf-4eff-b1d3-a10e7add5365', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a6d986e-0caf-4eff-b1d3-a10e7add5365', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '096ffa0a51b143039159efc232ec547a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fea98301-f19b-4654-8756-4655244bd809, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=8b3a7663-ad72-4099-a4de-0fa85d29cfd8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:10.065 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 8b3a7663-ad72-4099-a4de-0fa85d29cfd8 in datapath 9a6d986e-0caf-4eff-b1d3-a10e7add5365 bound to our chassis
Dec 02 10:12:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:10.067 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9a6d986e-0caf-4eff-b1d3-a10e7add5365 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:10 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:10.067 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a8edd05d-079f-4360-8501-38ffa850b4b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:10Z|00523|binding|INFO|Setting lport 8b3a7663-ad72-4099-a4de-0fa85d29cfd8 ovn-installed in OVS
Dec 02 10:12:10 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:10Z|00524|binding|INFO|Setting lport 8b3a7663-ad72-4099-a4de-0fa85d29cfd8 up in Southbound
Dec 02 10:12:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:10.095 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:10.126 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:10.158 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:10 np0005541913.localdomain systemd[1]: tmp-crun.sdFSmW.mount: Deactivated successfully.
Dec 02 10:12:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-abc8399162bfbb6c24544e9605c7d0a83a32a4fccef1f87ba5e365b5de598b76-merged.mount: Deactivated successfully.
Dec 02 10:12:10 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aaae0f07d69c92194e9087fcffa523dd13f97696a1a42cd16a31621218c7972b-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:10 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dc6d140fa\x2de90a\x2d46db\x2da2ed\x2d8d904415f1fc.mount: Deactivated successfully.
Dec 02 10:12:10 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:10.425 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:10 np0005541913.localdomain ceph-mon[298296]: pgmap v460: 177 pgs: 177 active+clean; 243 MiB data, 1002 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 3.6 MiB/s wr, 118 op/s
Dec 02 10:12:10 np0005541913.localdomain ceph-mon[298296]: osdmap e188: 6 total, 6 up, 6 in
Dec 02 10:12:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/658691468' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/658691468' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:10.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:11 np0005541913.localdomain podman[329462]: 
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/657463899' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/657463899' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:11 np0005541913.localdomain podman[329462]: 2025-12-02 10:12:11.067697347 +0000 UTC m=+0.097012656 container create e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a6d986e-0caf-4eff-b1d3-a10e7add5365, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:12:11 np0005541913.localdomain systemd[1]: Started libpod-conmon-e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02.scope.
Dec 02 10:12:11 np0005541913.localdomain podman[329462]: 2025-12-02 10:12:11.020114769 +0000 UTC m=+0.049430108 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:11 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:11 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b61af61b108caaf0c1a1d229e8160b6e6a2c2166342cae9dc858179db74819f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:11 np0005541913.localdomain podman[329462]: 2025-12-02 10:12:11.148759267 +0000 UTC m=+0.178074576 container init e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a6d986e-0caf-4eff-b1d3-a10e7add5365, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:11 np0005541913.localdomain podman[329462]: 2025-12-02 10:12:11.158451085 +0000 UTC m=+0.187766394 container start e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a6d986e-0caf-4eff-b1d3-a10e7add5365, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:11 np0005541913.localdomain dnsmasq[329480]: started, version 2.85 cachesize 150
Dec 02 10:12:11 np0005541913.localdomain dnsmasq[329480]: DNS service limited to local subnets
Dec 02 10:12:11 np0005541913.localdomain dnsmasq[329480]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:11 np0005541913.localdomain dnsmasq[329480]: warning: no upstream servers configured
Dec 02 10:12:11 np0005541913.localdomain dnsmasq-dhcp[329480]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Dec 02 10:12:11 np0005541913.localdomain dnsmasq[329480]: read /var/lib/neutron/dhcp/9a6d986e-0caf-4eff-b1d3-a10e7add5365/addn_hosts - 0 addresses
Dec 02 10:12:11 np0005541913.localdomain dnsmasq-dhcp[329480]: read /var/lib/neutron/dhcp/9a6d986e-0caf-4eff-b1d3-a10e7add5365/host
Dec 02 10:12:11 np0005541913.localdomain dnsmasq-dhcp[329480]: read /var/lib/neutron/dhcp/9a6d986e-0caf-4eff-b1d3-a10e7add5365/opts
Dec 02 10:12:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:11.175 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:11 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:11Z|00525|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:11.272 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:11 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:11.305 263406 INFO neutron.agent.dhcp.agent [None req-d581afbc-134d-463b-a55f-198042403923 - - - - - -] DHCP configuration for ports {'30df0712-b628-4d3a-8a20-1aa5e2e6b62f'} is completed
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/657463899' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/657463899' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e189 e189: 6 total, 6 up, 6 in
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/325227520' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/325227520' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: pgmap v462: 177 pgs: 177 active+clean; 243 MiB data, 1002 MiB used, 41 GiB / 42 GiB avail; 4.4 MiB/s rd, 4.5 MiB/s wr, 149 op/s
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: osdmap e189: 6 total, 6 up, 6 in
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/325227520' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/325227520' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1003594229' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:12:13 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:13.250 2 INFO neutron.agent.securitygroups_rpc [None req-1b9cf27d-4f71-42a8-aff0-a386ad5e469f 27e8ee5045c2430583000f8d62f6e4f1 096ffa0a51b143039159efc232ec547a - - default default] Security group member updated ['0a7d83ca-acbf-4932-884e-9eff3b0bc0ff']
Dec 02 10:12:13 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2361950633' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:12:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e190 e190: 6 total, 6 up, 6 in
Dec 02 10:12:14 np0005541913.localdomain ceph-mon[298296]: pgmap v464: 177 pgs: 177 active+clean; 197 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.6 MiB/s wr, 274 op/s
Dec 02 10:12:14 np0005541913.localdomain ceph-mon[298296]: osdmap e190: 6 total, 6 up, 6 in
Dec 02 10:12:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:12:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:12:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:12:14 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 02 10:12:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:12:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.108 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.114 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0941a6d1-f5f9-42be-888c-2cf3b53f844a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.110142', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60cd8668-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': 'c2edacff1acd44b2ca6bddbad40764893443ec374760929b917e9322ecd65951'}]}, 'timestamp': '2025-12-02 10:12:16.115206', '_unique_id': 'fa86c36372904335a22520bf70c0b7bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.117 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.118 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0dbaf221-4174-4b83-ad58-4afe3c794183', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.118733', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60ce2bfe-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': '77927cfbf380b4ae973b7a2700565f72d94640245765043f14c155bfa48e23d4'}]}, 'timestamp': '2025-12-02 10:12:16.119473', '_unique_id': 'c570d5b5c57e47488c87ddd444eb1dce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.120 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.136 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.137 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58c4df1e-edc3-4e58-8aa9-dc6e855fcac6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.123084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60d0e952-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.342187659, 'message_signature': '186bdc927890c25abd74209eea8d41d08a90279c267baf4f36d0a0472c2d5ade'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.123084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60d0fb36-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.342187659, 'message_signature': '7073cf6d260ce8a1a0bf8de9203525a2bb98a03afcf3bded3e4e6f8c2766e490'}]}, 'timestamp': '2025-12-02 10:12:16.137804', '_unique_id': 'c8137cb34ba947828746423a8cee67c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.140 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.140 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '453a95b6-d19b-4e4a-b976-6409d51cf3d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.140486', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60d17a02-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': 'c270180542612c6bb8c80c7d992580af41465583a4e2bc185be0e10230b95419'}]}, 'timestamp': '2025-12-02 10:12:16.140990', '_unique_id': '22d2408dd5b94bd785acc0345261b5d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.142 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.143 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd588d663-1336-4266-8967-6f77e165184a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.143449', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60d1ed84-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': '45481602be5629dc4cc38cd044ed2dc14ba97f19cc1d1b0b1a4fb70371fe9de5'}]}, 'timestamp': '2025-12-02 10:12:16.143940', '_unique_id': '77786c55cf264802b62d5aea096c993b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.144 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.145 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.167 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 18990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '746a8bd4-e123-4121-8381-246db5d7797d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18990000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:12:16.146021', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '60d598bc-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.386411188, 'message_signature': '1871605b5f87c260c49847c45a13eb95bab7c138bacec43149ab244e7763f36e'}]}, 'timestamp': '2025-12-02 10:12:16.167987', '_unique_id': 'ca9c5d01b2884878a7e8d08f34fdd4c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:16.179 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.210 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6ddcf75-0026-43a5-8a08-3e9d02ec981d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.170345', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60dc0dfa-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': '1b13798f951ef363fb6c88aaf80d9efc4f3d7a11998d026ed8b81b688073c0c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.170345', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60dc24e8-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': 'e6f414dc2c5ad59451a9e13d41ba34467b08acbbc64a83f51f7c773544c4b809'}]}, 'timestamp': '2025-12-02 10:12:16.210886', '_unique_id': 'ca8d39bad68c48f3ae423f3c4f619fb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.212 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.213 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c06cd3c8-63d6-4de4-b944-9c8f30986c90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.213934', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60dcaefe-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': '81c44ac76ccc95e7f5b969ee78b27e92667dab38f421cf5fcf254e3e225900d7'}]}, 'timestamp': '2025-12-02 10:12:16.214496', '_unique_id': '4066944921df48e3a06f20eea5649fb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.215 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.216 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e385f3b2-ceaa-47e5-93a8-131c3cc109d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.216989', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60dd24ce-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.342187659, 'message_signature': '3cb826deb531f3d6256ed0b253053890909151adacb17c37dcbbe0f50c62a17d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.216989', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60dd363a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.342187659, 'message_signature': 'db5a3e0fc720e99e69d867d7ce3ec8b06948d0b8a6aa7316070b9d1a69cb8b0c'}]}, 'timestamp': '2025-12-02 10:12:16.217863', '_unique_id': '3512a0e634cc47d2b9566181bf0de4a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.218 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d13f493-b0e0-4e74-844a-288122e3746b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.220194', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60dda200-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.342187659, 'message_signature': '8e478316ff31cc06294787774acc0dfecaed14691adbb1b4784e4851a4c6018c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.220194', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60ddb38a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.342187659, 'message_signature': 'c0712134f0a06f5d812d202ed2c3f78a04505bd9c0124419a32c9b4c763c07f8'}]}, 'timestamp': '2025-12-02 10:12:16.221068', '_unique_id': '04509c7f5f28496c97235ac782db3e1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.221 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4eced968-48b4-400b-b047-878ded048894', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.223207', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60de1794-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': '717c9823efb39a353aad090eb26385fabc8cf6d436a6b4a5819b7cd41c64fd69'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.223207', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60de2e5a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': '5b79e555648d6fdb98eec8082cf98024ce4da9f168b2d54cfa36ddf56caf87c5'}]}, 'timestamp': '2025-12-02 10:12:16.224285', '_unique_id': '1d8109f8a816493c9912a654250a72ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.228 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69b4d8b0-a90e-48c0-ad5e-8d4c8fa26a68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.227730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60ded6fc-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': '238f72d56f148f4c8e39cd774177657555ffe5f6aee63b744bab7fe96bba2f08'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.227730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60deed5e-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': 'df0c489eadee7207257945db50e27ae1ae1eb28b6315e92eb4bdf0c2334e9123'}]}, 'timestamp': '2025-12-02 10:12:16.229119', '_unique_id': 'e3528992f84c4844a5fcfbac54c70019'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.230 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.232 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '349f9f33-7c30-43e8-98fc-355f67090deb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.232153', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60df76f2-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': '16de74b41a73fa21afa56567a6f6ed65e09c01f7bb418e1b04a4867361c95d51'}]}, 'timestamp': '2025-12-02 10:12:16.232688', '_unique_id': '05c11c66f95e4bd4bbb90f178c13603d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.233 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.234 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.235 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd04bba80-c86b-43da-93f2-0f2ca3b1700c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.234837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60dfddc2-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': 'f59390e942b8890904f30956a592dbcdb6d6394fc7811fa81892676042ed7532'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.234837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60dff01e-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': '8dd092f8624b0e15e0dcc1267e5773ffa97347c44a0ae07b6e781a2f8796ba25'}]}, 'timestamp': '2025-12-02 10:12:16.235758', '_unique_id': 'f4fce673db1a4beda0c5de73b2afa31d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.236 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.238 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8db02dd1-1b8b-4275-87a7-4e2d59d5fee1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.238186', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60e060ee-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': '78661369bf9d9112e39e56b8b2f712e0731a29f6c81fd9ab394911bbbafdf347'}]}, 'timestamp': '2025-12-02 10:12:16.238674', '_unique_id': 'b465affe937246438ce0dc942e87fc1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.240 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a83b81d1-2114-414c-9258-83ac4262b0f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.240830', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60e0c7f0-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': '4610c6914605ccb3069d543c008894a75e904dc933568e55ebb4a1ec29cc0f0e'}]}, 'timestamp': '2025-12-02 10:12:16.241273', '_unique_id': 'b79e8cc118554c23bc002b88913624bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.242 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.243 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ab24288-746a-4d2e-92ca-2774e89f30e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.243909', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60e1402c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': '7167b08072fd2628085b848045dc8be46f394de84df0edfbb81350ecd6e23940'}]}, 'timestamp': '2025-12-02 10:12:16.244354', '_unique_id': '2be22dee6b2c44ea8d994b1a27ce4205'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.245 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.246 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.246 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c73fa7fe-2156-4b53-ba1a-f6dd0d7bdde4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:12:16.246438', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '60e1a4cc-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.386411188, 'message_signature': '78a79cc11c10ca07bfbaad1d96528bb134852c51dcc5d8d97df0aa08c5b06da2'}]}, 'timestamp': '2025-12-02 10:12:16.246916', '_unique_id': '72f4366c282d42eb9c6a30500c188cc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.247 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.248 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.248 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e77c4b8-0f97-4248-8e0d-5fd900428db9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.248715', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60e1f8b4-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': '42309a902940abc278e873eaac8b808198cd10b7ddef8d1ed5ab00a76984efa5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.248715', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60e2026e-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': '063f31faf2fbe70b72e1e652c17fc80139525d626dc45cade51a37cd09c56313'}]}, 'timestamp': '2025-12-02 10:12:16.249233', '_unique_id': '521d6b653c1c4ee4bc8778d751d74243'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.249 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.250 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3fff19e-81e4-48cd-9d0d-947f310edc2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:12:16.250871', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '60e25354-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.329248945, 'message_signature': '4fc0fbef4e4c05121b7af963844de40972d5b9f2ed06bd547ced3c0c69a34c95'}]}, 'timestamp': '2025-12-02 10:12:16.251318', '_unique_id': '464ec066547846e59b210bc239909dcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.251 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.252 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '980341e2-c538-43ff-b2d9-f51bf2c8c9e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:12:16.252729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '60e2956c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': 'fe044fe531084028a7c4ca7f803103ce1b46544c9b53320600bb964fae332546'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:12:16.252729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '60e2a408-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12498.389436619, 'message_signature': '2cea3e8ae5c55ba5b707ddac4072c544f9525a9b5a5c347e37a0cfc7c387c2b2'}]}, 'timestamp': '2025-12-02 10:12:16.253372', '_unique_id': '4e7e3ef72b8c4de3a43c7b60459abf0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:12:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:12:16.253 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:12:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e191 e191: 6 total, 6 up, 6 in
Dec 02 10:12:16 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:16Z|00526|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:12:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:16.651 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:16 np0005541913.localdomain ceph-mon[298296]: pgmap v466: 177 pgs: 177 active+clean; 197 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 110 KiB/s rd, 33 KiB/s wr, 156 op/s
Dec 02 10:12:16 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "884b3444-4a7a-4744-9a4b-7d6039625376", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:12:16 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "884b3444-4a7a-4744-9a4b-7d6039625376", "format": "json"}]: dispatch
Dec 02 10:12:16 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:16 np0005541913.localdomain ceph-mon[298296]: osdmap e191: 6 total, 6 up, 6 in
Dec 02 10:12:16 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:16.909 263406 INFO neutron.agent.linux.ip_lib [None req-f0cf0a3b-6b3e-4618-96f5-732e76579a9d - - - - - -] Device tap3fc6bc7c-d6 cannot be used as it has no MAC address
Dec 02 10:12:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:16.962 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:16 np0005541913.localdomain kernel: device tap3fc6bc7c-d6 entered promiscuous mode
Dec 02 10:12:16 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670336.9688] manager: (tap3fc6bc7c-d6): new Generic device (/org/freedesktop/NetworkManager/Devices/83)
Dec 02 10:12:16 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:16Z|00527|binding|INFO|Claiming lport 3fc6bc7c-d600-4263-b6b5-a826d1a899f3 for this chassis.
Dec 02 10:12:16 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:16Z|00528|binding|INFO|3fc6bc7c-d600-4263-b6b5-a826d1a899f3: Claiming unknown
Dec 02 10:12:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:16.970 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:16 np0005541913.localdomain systemd-udevd[329491]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:16.981 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:e21f/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7499f69a-21e4-43dd-8d90-9037f211beae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7499f69a-21e4-43dd-8d90-9037f211beae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '096ffa0a51b143039159efc232ec547a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de229515-8d8f-41bc-bfc9-16d179cdd33e, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=3fc6bc7c-d600-4263-b6b5-a826d1a899f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:16.983 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 3fc6bc7c-d600-4263-b6b5-a826d1a899f3 in datapath 7499f69a-21e4-43dd-8d90-9037f211beae bound to our chassis
Dec 02 10:12:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:16.987 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port eda5af0c-4f23-4256-97dd-161fdf57e787 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:12:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:16.988 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7499f69a-21e4-43dd-8d90-9037f211beae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:12:16 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:16.989 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[51df919c-330c-40d5-bc30-80dfbd74d4ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:17Z|00529|binding|INFO|Setting lport 3fc6bc7c-d600-4263-b6b5-a826d1a899f3 ovn-installed in OVS
Dec 02 10:12:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:17Z|00530|binding|INFO|Setting lport 3fc6bc7c-d600-4263-b6b5-a826d1a899f3 up in Southbound
Dec 02 10:12:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:17.015 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:17.048 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:17.075 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:17 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:17.471 2 INFO neutron.agent.securitygroups_rpc [None req-12cf221e-0940-4511-92ba-1f5763df32bf 27e8ee5045c2430583000f8d62f6e4f1 096ffa0a51b143039159efc232ec547a - - default default] Security group member updated ['0a7d83ca-acbf-4932-884e-9eff3b0bc0ff']
Dec 02 10:12:17 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "r", "format": "json"}]: dispatch
Dec 02 10:12:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:12:17 np0005541913.localdomain ceph-mon[298296]: pgmap v468: 177 pgs: 177 active+clean; 197 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 110 KiB/s rd, 33 KiB/s wr, 156 op/s
Dec 02 10:12:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:12:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:12:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow r pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:12:17 np0005541913.localdomain podman[329546]: 
Dec 02 10:12:17 np0005541913.localdomain podman[329546]: 2025-12-02 10:12:17.985919261 +0000 UTC m=+0.098176417 container create 98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7499f69a-21e4-43dd-8d90-9037f211beae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:12:18 np0005541913.localdomain systemd[1]: Started libpod-conmon-98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f.scope.
Dec 02 10:12:18 np0005541913.localdomain podman[329546]: 2025-12-02 10:12:17.939968537 +0000 UTC m=+0.052225723 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:18 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:18 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4d875eb3c1b3c9e2ceb62f7bab21f952c4354d929abfca283190eb2f995b7f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:18 np0005541913.localdomain podman[329546]: 2025-12-02 10:12:18.072466757 +0000 UTC m=+0.184723913 container init 98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7499f69a-21e4-43dd-8d90-9037f211beae, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:12:18 np0005541913.localdomain podman[329546]: 2025-12-02 10:12:18.087695064 +0000 UTC m=+0.199952210 container start 98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7499f69a-21e4-43dd-8d90-9037f211beae, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:12:18 np0005541913.localdomain dnsmasq[329564]: started, version 2.85 cachesize 150
Dec 02 10:12:18 np0005541913.localdomain dnsmasq[329564]: DNS service limited to local subnets
Dec 02 10:12:18 np0005541913.localdomain dnsmasq[329564]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:18 np0005541913.localdomain dnsmasq[329564]: warning: no upstream servers configured
Dec 02 10:12:18 np0005541913.localdomain dnsmasq-dhcp[329564]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:12:18 np0005541913.localdomain dnsmasq[329564]: read /var/lib/neutron/dhcp/7499f69a-21e4-43dd-8d90-9037f211beae/addn_hosts - 0 addresses
Dec 02 10:12:18 np0005541913.localdomain dnsmasq-dhcp[329564]: read /var/lib/neutron/dhcp/7499f69a-21e4-43dd-8d90-9037f211beae/host
Dec 02 10:12:18 np0005541913.localdomain dnsmasq-dhcp[329564]: read /var/lib/neutron/dhcp/7499f69a-21e4-43dd-8d90-9037f211beae/opts
Dec 02 10:12:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:18.164 263406 INFO neutron.agent.dhcp.agent [None req-f0cf0a3b-6b3e-4618-96f5-732e76579a9d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:16Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908aefb50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908850dc0>], id=6275e0d7-4643-4f6a-be6e-c6bc38b44570, ip_allocation=immediate, mac_address=fa:16:3e:ac:4c:e1, name=tempest-NetworksIpV6TestAttrs-1409601479, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:14Z, description=, dns_domain=, id=7499f69a-21e4-43dd-8d90-9037f211beae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksIpV6TestAttrs-test-network-943954912, port_security_enabled=True, project_id=096ffa0a51b143039159efc232ec547a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=60372, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2798, status=ACTIVE, subnets=['401ed2f0-cf82-4106-9eea-aee47f00d9f7'], tags=[], tenant_id=096ffa0a51b143039159efc232ec547a, updated_at=2025-12-02T10:12:15Z, vlan_transparent=None, network_id=7499f69a-21e4-43dd-8d90-9037f211beae, port_security_enabled=True, project_id=096ffa0a51b143039159efc232ec547a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['0a7d83ca-acbf-4932-884e-9eff3b0bc0ff'], standard_attr_id=2808, status=DOWN, tags=[], tenant_id=096ffa0a51b143039159efc232ec547a, updated_at=2025-12-02T10:12:16Z on network 7499f69a-21e4-43dd-8d90-9037f211beae
Dec 02 10:12:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:18.377 263406 INFO neutron.agent.dhcp.agent [None req-a5c9f2bc-2156-4555-a4ca-f799d6a4f989 - - - - - -] DHCP configuration for ports {'e6bafd7e-a7b7-4380-a96b-2e66c659fd1f'} is completed
Dec 02 10:12:18 np0005541913.localdomain dnsmasq[329564]: read /var/lib/neutron/dhcp/7499f69a-21e4-43dd-8d90-9037f211beae/addn_hosts - 1 addresses
Dec 02 10:12:18 np0005541913.localdomain dnsmasq-dhcp[329564]: read /var/lib/neutron/dhcp/7499f69a-21e4-43dd-8d90-9037f211beae/host
Dec 02 10:12:18 np0005541913.localdomain podman[329583]: 2025-12-02 10:12:18.388241383 +0000 UTC m=+0.069403251 container kill 98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7499f69a-21e4-43dd-8d90-9037f211beae, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:12:18 np0005541913.localdomain dnsmasq-dhcp[329564]: read /var/lib/neutron/dhcp/7499f69a-21e4-43dd-8d90-9037f211beae/opts
Dec 02 10:12:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1842775925' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1842775925' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:18.733 263406 INFO neutron.agent.dhcp.agent [None req-ff0439dd-f66c-46d5-8d8d-6da67932b1ad - - - - - -] DHCP configuration for ports {'6275e0d7-4643-4f6a-be6e-c6bc38b44570'} is completed
Dec 02 10:12:18 np0005541913.localdomain dnsmasq[328891]: exiting on receipt of SIGTERM
Dec 02 10:12:18 np0005541913.localdomain podman[329619]: 2025-12-02 10:12:18.814570494 +0000 UTC m=+0.062068565 container kill b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:12:18 np0005541913.localdomain systemd[1]: libpod-b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a.scope: Deactivated successfully.
Dec 02 10:12:18 np0005541913.localdomain podman[329639]: 2025-12-02 10:12:18.887233301 +0000 UTC m=+0.052480240 container died b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:12:18 np0005541913.localdomain podman[329639]: 2025-12-02 10:12:18.93112034 +0000 UTC m=+0.096367249 container remove b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e6b81515-1a91-47bb-810b-f820ca0caeff, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:18 np0005541913.localdomain systemd[1]: libpod-conmon-b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a.scope: Deactivated successfully.
Dec 02 10:12:18 np0005541913.localdomain systemd[1]: tmp-crun.v3Wtgc.mount: Deactivated successfully.
Dec 02 10:12:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf59d4ae25454728b26aa2ee5b303a57d6e9d975e82bd83d233416387dbda34-merged.mount: Deactivated successfully.
Dec 02 10:12:18 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1965ecd259394969d6902012099f161effc69497021569f8f7dfe0b65565b3a-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:19 np0005541913.localdomain dnsmasq[329564]: exiting on receipt of SIGTERM
Dec 02 10:12:19 np0005541913.localdomain podman[329672]: 2025-12-02 10:12:19.122993773 +0000 UTC m=+0.058833469 container kill 98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7499f69a-21e4-43dd-8d90-9037f211beae, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:12:19 np0005541913.localdomain systemd[1]: tmp-crun.NkYxaf.mount: Deactivated successfully.
Dec 02 10:12:19 np0005541913.localdomain systemd[1]: libpod-98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f.scope: Deactivated successfully.
Dec 02 10:12:19 np0005541913.localdomain podman[329686]: 2025-12-02 10:12:19.197965211 +0000 UTC m=+0.058641974 container died 98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7499f69a-21e4-43dd-8d90-9037f211beae, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:12:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:19.206 263406 INFO neutron.agent.dhcp.agent [None req-af69e25e-f22c-4bd9-aa8d-86621db1e93f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:19 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2de6b81515\x2d1a91\x2d47bb\x2d810b\x2df820ca0caeff.mount: Deactivated successfully.
Dec 02 10:12:19 np0005541913.localdomain podman[329686]: 2025-12-02 10:12:19.238525682 +0000 UTC m=+0.099202405 container cleanup 98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7499f69a-21e4-43dd-8d90-9037f211beae, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:12:19 np0005541913.localdomain systemd[1]: libpod-conmon-98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f.scope: Deactivated successfully.
Dec 02 10:12:19 np0005541913.localdomain podman[329687]: 2025-12-02 10:12:19.292754277 +0000 UTC m=+0.146318621 container remove 98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7499f69a-21e4-43dd-8d90-9037f211beae, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:12:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:19Z|00531|binding|INFO|Releasing lport 3fc6bc7c-d600-4263-b6b5-a826d1a899f3 from this chassis (sb_readonly=0)
Dec 02 10:12:19 np0005541913.localdomain kernel: device tap3fc6bc7c-d6 left promiscuous mode
Dec 02 10:12:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:19Z|00532|binding|INFO|Setting lport 3fc6bc7c-d600-4263-b6b5-a826d1a899f3 down in Southbound
Dec 02 10:12:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:19.308 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:19.319 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:e21f/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-7499f69a-21e4-43dd-8d90-9037f211beae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7499f69a-21e4-43dd-8d90-9037f211beae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '096ffa0a51b143039159efc232ec547a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de229515-8d8f-41bc-bfc9-16d179cdd33e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=3fc6bc7c-d600-4263-b6b5-a826d1a899f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:19.321 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 3fc6bc7c-d600-4263-b6b5-a826d1a899f3 in datapath 7499f69a-21e4-43dd-8d90-9037f211beae unbound from our chassis
Dec 02 10:12:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:19.323 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7499f69a-21e4-43dd-8d90-9037f211beae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:12:19 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:19.324 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[80742142-9a56-4112-abac-73f61204db8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:19.334 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:19.650 263406 INFO neutron.agent.dhcp.agent [None req-166fd323-dce2-4942-9c5c-b3ec4409e54b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:19.681 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "884b3444-4a7a-4744-9a4b-7d6039625376", "snap_name": "a0843ffe-d6ab-48e1-a5c8-33bfbdacd761", "format": "json"}]: dispatch
Dec 02 10:12:19 np0005541913.localdomain ceph-mon[298296]: pgmap v469: 177 pgs: 177 active+clean; 198 MiB data, 979 MiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 71 KiB/s wr, 180 op/s
Dec 02 10:12:19 np0005541913.localdomain systemd[1]: tmp-crun.6QgC2a.mount: Deactivated successfully.
Dec 02 10:12:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-d4d875eb3c1b3c9e2ceb62f7bab21f952c4354d929abfca283190eb2f995b7f2-merged.mount: Deactivated successfully.
Dec 02 10:12:19 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98fb077e84c96895dbfdbc31164d597ffc3b5e23ab83bd6453106b41ef744d9f-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:19 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d7499f69a\x2d21e4\x2d43dd\x2d8d90\x2d9037f211beae.mount: Deactivated successfully.
Dec 02 10:12:20 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:12:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 02 10:12:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:12:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 02 10:12:20 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 02 10:12:20 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 02 10:12:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:21.183 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e192 e192: 6 total, 6 up, 6 in
Dec 02 10:12:22 np0005541913.localdomain ceph-mon[298296]: pgmap v470: 177 pgs: 177 active+clean; 198 MiB data, 979 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 42 KiB/s wr, 52 op/s
Dec 02 10:12:22 np0005541913.localdomain ceph-mon[298296]: osdmap e192: 6 total, 6 up, 6 in
Dec 02 10:12:22 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:12:22 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:22.942 263406 INFO neutron.agent.linux.ip_lib [None req-e94ea10b-b576-462b-91dc-f7657f041dcc - - - - - -] Device tap7db38a9e-eb cannot be used as it has no MAC address
Dec 02 10:12:22 np0005541913.localdomain podman[329716]: 2025-12-02 10:12:22.967383882 +0000 UTC m=+0.092384313 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:22.972 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:22 np0005541913.localdomain kernel: device tap7db38a9e-eb entered promiscuous mode
Dec 02 10:12:22 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670342.9807] manager: (tap7db38a9e-eb): new Generic device (/org/freedesktop/NetworkManager/Devices/84)
Dec 02 10:12:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:22.981 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:22 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:22Z|00533|binding|INFO|Claiming lport 7db38a9e-ebcb-4d9d-918e-68ae809943d3 for this chassis.
Dec 02 10:12:22 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:22Z|00534|binding|INFO|7db38a9e-ebcb-4d9d-918e-68ae809943d3: Claiming unknown
Dec 02 10:12:22 np0005541913.localdomain systemd-udevd[329742]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:22.993 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe21:5f27/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-e5941a38-5b52-4ec5-8b26-a548856326e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5941a38-5b52-4ec5-8b26-a548856326e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '096ffa0a51b143039159efc232ec547a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ddfbbc4e-1c2d-489c-bd00-e334cdcd7a08, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=7db38a9e-ebcb-4d9d-918e-68ae809943d3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:22.995 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 7db38a9e-ebcb-4d9d-918e-68ae809943d3 in datapath e5941a38-5b52-4ec5-8b26-a548856326e1 bound to our chassis
Dec 02 10:12:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:22.996 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e5941a38-5b52-4ec5-8b26-a548856326e1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:22 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:22.997 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[c785fdbf-ebfb-4c23-861b-abe30e7318c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:23 np0005541913.localdomain podman[329716]: 2025-12-02 10:12:23.008075677 +0000 UTC m=+0.133076128 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:12:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap7db38a9e-eb: No such device
Dec 02 10:12:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:23.019 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:23 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:23Z|00535|binding|INFO|Setting lport 7db38a9e-ebcb-4d9d-918e-68ae809943d3 ovn-installed in OVS
Dec 02 10:12:23 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:23Z|00536|binding|INFO|Setting lport 7db38a9e-ebcb-4d9d-918e-68ae809943d3 up in Southbound
Dec 02 10:12:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap7db38a9e-eb: No such device
Dec 02 10:12:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:23.023 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap7db38a9e-eb: No such device
Dec 02 10:12:23 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:12:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap7db38a9e-eb: No such device
Dec 02 10:12:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap7db38a9e-eb: No such device
Dec 02 10:12:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap7db38a9e-eb: No such device
Dec 02 10:12:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap7db38a9e-eb: No such device
Dec 02 10:12:23 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap7db38a9e-eb: No such device
Dec 02 10:12:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:23.060 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:23.089 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:23 np0005541913.localdomain podman[329812]: 
Dec 02 10:12:24 np0005541913.localdomain podman[329812]: 2025-12-02 10:12:24.007234594 +0000 UTC m=+0.093599846 container create e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e5941a38-5b52-4ec5-8b26-a548856326e1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:12:24 np0005541913.localdomain systemd[1]: Started libpod-conmon-e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746.scope.
Dec 02 10:12:24 np0005541913.localdomain podman[329812]: 2025-12-02 10:12:23.961815812 +0000 UTC m=+0.048181084 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:24 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:24 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c9816e47c75cc8fff026b35d669da0f5a6ffd6039598106d9c9dfaf32ebd644/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:24 np0005541913.localdomain podman[329812]: 2025-12-02 10:12:24.09081635 +0000 UTC m=+0.177181602 container init e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e5941a38-5b52-4ec5-8b26-a548856326e1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:12:24 np0005541913.localdomain podman[329812]: 2025-12-02 10:12:24.100023446 +0000 UTC m=+0.186388708 container start e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e5941a38-5b52-4ec5-8b26-a548856326e1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:12:24 np0005541913.localdomain dnsmasq[329831]: started, version 2.85 cachesize 150
Dec 02 10:12:24 np0005541913.localdomain dnsmasq[329831]: DNS service limited to local subnets
Dec 02 10:12:24 np0005541913.localdomain dnsmasq[329831]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:24 np0005541913.localdomain dnsmasq[329831]: warning: no upstream servers configured
Dec 02 10:12:24 np0005541913.localdomain dnsmasq[329831]: read /var/lib/neutron/dhcp/e5941a38-5b52-4ec5-8b26-a548856326e1/addn_hosts - 0 addresses
Dec 02 10:12:24 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:24.212 263406 INFO neutron.agent.dhcp.agent [None req-58ec0c9c-c91c-454d-9396-097c0f3b11ce - - - - - -] DHCP configuration for ports {'81275455-dcc9-470b-999b-b60444ee78b6'} is completed
Dec 02 10:12:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:24.261 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 79e4d960-f1e9-41e9-8db0-4e28d904bc22 with type ""
Dec 02 10:12:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:24Z|00537|binding|INFO|Removing iface tap7db38a9e-eb ovn-installed in OVS
Dec 02 10:12:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:24Z|00538|binding|INFO|Removing lport 7db38a9e-ebcb-4d9d-918e-68ae809943d3 ovn-installed in OVS
Dec 02 10:12:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:24.264 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe21:5f27/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-e5941a38-5b52-4ec5-8b26-a548856326e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5941a38-5b52-4ec5-8b26-a548856326e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '096ffa0a51b143039159efc232ec547a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ddfbbc4e-1c2d-489c-bd00-e334cdcd7a08, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=7db38a9e-ebcb-4d9d-918e-68ae809943d3) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:24.263 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:24.267 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 7db38a9e-ebcb-4d9d-918e-68ae809943d3 in datapath e5941a38-5b52-4ec5-8b26-a548856326e1 unbound from our chassis
Dec 02 10:12:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:24.268 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e5941a38-5b52-4ec5-8b26-a548856326e1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:24 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:24.269 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8e6a76-243c-444e-b127-d248088f4af0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:24.272 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:24 np0005541913.localdomain dnsmasq[329831]: exiting on receipt of SIGTERM
Dec 02 10:12:24 np0005541913.localdomain podman[329848]: 2025-12-02 10:12:24.440314014 +0000 UTC m=+0.067813578 container kill e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e5941a38-5b52-4ec5-8b26-a548856326e1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:12:24 np0005541913.localdomain systemd[1]: libpod-e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746.scope: Deactivated successfully.
Dec 02 10:12:24 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:24Z|00539|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:12:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:24.524 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:24 np0005541913.localdomain podman[329863]: 2025-12-02 10:12:24.546515054 +0000 UTC m=+0.079517259 container died e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e5941a38-5b52-4ec5-8b26-a548856326e1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:12:24 np0005541913.localdomain podman[329863]: 2025-12-02 10:12:24.588073512 +0000 UTC m=+0.121075677 container remove e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e5941a38-5b52-4ec5-8b26-a548856326e1, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:24 np0005541913.localdomain systemd[1]: libpod-conmon-e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746.scope: Deactivated successfully.
Dec 02 10:12:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:24.599 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:24 np0005541913.localdomain kernel: device tap7db38a9e-eb left promiscuous mode
Dec 02 10:12:24 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:24.612 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:24 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:24.628 263406 INFO neutron.agent.dhcp.agent [None req-ceb3f944-56fd-4f94-92b5-ccaeec7af91f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:24 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:24.629 263406 INFO neutron.agent.dhcp.agent [None req-ceb3f944-56fd-4f94-92b5-ccaeec7af91f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "884b3444-4a7a-4744-9a4b-7d6039625376", "snap_name": "a0843ffe-d6ab-48e1-a5c8-33bfbdacd761_e41a686a-024b-44d9-a830-e637ced25120", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "884b3444-4a7a-4744-9a4b-7d6039625376", "snap_name": "a0843ffe-d6ab-48e1-a5c8-33bfbdacd761", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: pgmap v472: 177 pgs: 177 active+clean; 198 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 68 KiB/s wr, 55 op/s
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:12:24 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e193 e193: 6 total, 6 up, 6 in
Dec 02 10:12:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7c9816e47c75cc8fff026b35d669da0f5a6ffd6039598106d9c9dfaf32ebd644-merged.mount: Deactivated successfully.
Dec 02 10:12:25 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e23c89c91c2e65f9f29640f5483b0afd97ccd2241bab5d638b5c8041f0fe2746-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:25 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2de5941a38\x2d5b52\x2d4ec5\x2d8b26\x2da548856326e1.mount: Deactivated successfully.
Dec 02 10:12:25 np0005541913.localdomain ceph-mon[298296]: osdmap e193: 6 total, 6 up, 6 in
Dec 02 10:12:25 np0005541913.localdomain ceph-mon[298296]: pgmap v474: 177 pgs: 177 active+clean; 198 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 68 KiB/s wr, 55 op/s
Dec 02 10:12:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e194 e194: 6 total, 6 up, 6 in
Dec 02 10:12:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:26.187 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:12:26 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:26.382 263406 INFO neutron.agent.linux.ip_lib [None req-1a475abf-8b54-4c02-9ae1-9d6f9e605242 - - - - - -] Device tap5cd3a631-9d cannot be used as it has no MAC address
Dec 02 10:12:26 np0005541913.localdomain systemd[1]: tmp-crun.PHX4vd.mount: Deactivated successfully.
Dec 02 10:12:26 np0005541913.localdomain podman[329892]: 2025-12-02 10:12:26.404885478 +0000 UTC m=+0.096676708 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:26.417 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:26 np0005541913.localdomain kernel: device tap5cd3a631-9d entered promiscuous mode
Dec 02 10:12:26 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670346.4254] manager: (tap5cd3a631-9d): new Generic device (/org/freedesktop/NetworkManager/Devices/85)
Dec 02 10:12:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:26Z|00540|binding|INFO|Claiming lport 5cd3a631-9d08-4568-9b09-16bfca349289 for this chassis.
Dec 02 10:12:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:26Z|00541|binding|INFO|5cd3a631-9d08-4568-9b09-16bfca349289: Claiming unknown
Dec 02 10:12:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:26.428 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:26 np0005541913.localdomain systemd-udevd[329917]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.436 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-64648a3d-835b-4d29-8cd8-3f34abaacc37', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-64648a3d-835b-4d29-8cd8-3f34abaacc37', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '096ffa0a51b143039159efc232ec547a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c90c0f68-19d2-4032-9c10-7810e57b2671, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=5cd3a631-9d08-4568-9b09-16bfca349289) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:26 np0005541913.localdomain podman[329892]: 2025-12-02 10:12:26.438361499 +0000 UTC m=+0.130152699 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.438 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 5cd3a631-9d08-4568-9b09-16bfca349289 in datapath 64648a3d-835b-4d29-8cd8-3f34abaacc37 bound to our chassis
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.440 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 64648a3d-835b-4d29-8cd8-3f34abaacc37 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.441 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d57860-8405-4fc2-87bf-089db5206178]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:26 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:12:26 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5cd3a631-9d: No such device
Dec 02 10:12:26 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5cd3a631-9d: No such device
Dec 02 10:12:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:26Z|00542|binding|INFO|Setting lport 5cd3a631-9d08-4568-9b09-16bfca349289 ovn-installed in OVS
Dec 02 10:12:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:26Z|00543|binding|INFO|Setting lport 5cd3a631-9d08-4568-9b09-16bfca349289 up in Southbound
Dec 02 10:12:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:26.468 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:26 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5cd3a631-9d: No such device
Dec 02 10:12:26 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5cd3a631-9d: No such device
Dec 02 10:12:26 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5cd3a631-9d: No such device
Dec 02 10:12:26 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5cd3a631-9d: No such device
Dec 02 10:12:26 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5cd3a631-9d: No such device
Dec 02 10:12:26 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap5cd3a631-9d: No such device
Dec 02 10:12:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:26.516 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:26.551 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:26 np0005541913.localdomain ceph-mon[298296]: osdmap e194: 6 total, 6 up, 6 in
Dec 02 10:12:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "884b3444-4a7a-4744-9a4b-7d6039625376", "format": "json"}]: dispatch
Dec 02 10:12:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "884b3444-4a7a-4744-9a4b-7d6039625376", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e195 e195: 6 total, 6 up, 6 in
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.834 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 9683fa55-109f-4a3c-8663-60b73e691f42 with type ""
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.836 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-64648a3d-835b-4d29-8cd8-3f34abaacc37', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-64648a3d-835b-4d29-8cd8-3f34abaacc37', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '096ffa0a51b143039159efc232ec547a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c90c0f68-19d2-4032-9c10-7810e57b2671, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=5cd3a631-9d08-4568-9b09-16bfca349289) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:26Z|00544|binding|INFO|Removing iface tap5cd3a631-9d ovn-installed in OVS
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.838 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 5cd3a631-9d08-4568-9b09-16bfca349289 in datapath 64648a3d-835b-4d29-8cd8-3f34abaacc37 unbound from our chassis
Dec 02 10:12:26 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:26Z|00545|binding|INFO|Removing lport 5cd3a631-9d08-4568-9b09-16bfca349289 ovn-installed in OVS
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.839 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 64648a3d-835b-4d29-8cd8-3f34abaacc37 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:26 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:26.840 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[914e0d6a-2e0d-49f2-a30a-9c8a6e645f2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:26.888 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:26.890 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:12:27 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:12:27 np0005541913.localdomain podman[329982]: 2025-12-02 10:12:27.45794891 +0000 UTC m=+0.099170873 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, name=ubi9-minimal, vcs-type=git)
Dec 02 10:12:27 np0005541913.localdomain podman[330002]: 
Dec 02 10:12:27 np0005541913.localdomain podman[330002]: 2025-12-02 10:12:27.484891809 +0000 UTC m=+0.090823281 container create e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:12:27 np0005541913.localdomain podman[329982]: 2025-12-02 10:12:27.495152402 +0000 UTC m=+0.136374345 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Dec 02 10:12:27 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:12:27 np0005541913.localdomain systemd[1]: Started libpod-conmon-e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2.scope.
Dec 02 10:12:27 np0005541913.localdomain podman[330002]: 2025-12-02 10:12:27.447944474 +0000 UTC m=+0.053876016 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:27 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:27 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e55b32f7917d1bfc37ba3bfadf74cf4ebf2c68e08e8a4389ecaa7281d36c25b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:27 np0005541913.localdomain podman[329983]: 2025-12-02 10:12:27.502038596 +0000 UTC m=+0.137114906 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:12:27 np0005541913.localdomain podman[330002]: 2025-12-02 10:12:27.558604833 +0000 UTC m=+0.164536335 container init e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:27 np0005541913.localdomain podman[330002]: 2025-12-02 10:12:27.568151008 +0000 UTC m=+0.174082510 container start e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:12:27 np0005541913.localdomain dnsmasq[330045]: started, version 2.85 cachesize 150
Dec 02 10:12:27 np0005541913.localdomain dnsmasq[330045]: DNS service limited to local subnets
Dec 02 10:12:27 np0005541913.localdomain dnsmasq[330045]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:27 np0005541913.localdomain dnsmasq[330045]: warning: no upstream servers configured
Dec 02 10:12:27 np0005541913.localdomain dnsmasq-dhcp[330045]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 02 10:12:27 np0005541913.localdomain dnsmasq[330045]: read /var/lib/neutron/dhcp/64648a3d-835b-4d29-8cd8-3f34abaacc37/addn_hosts - 0 addresses
Dec 02 10:12:27 np0005541913.localdomain dnsmasq-dhcp[330045]: read /var/lib/neutron/dhcp/64648a3d-835b-4d29-8cd8-3f34abaacc37/host
Dec 02 10:12:27 np0005541913.localdomain dnsmasq-dhcp[330045]: read /var/lib/neutron/dhcp/64648a3d-835b-4d29-8cd8-3f34abaacc37/opts
Dec 02 10:12:27 np0005541913.localdomain podman[329983]: 2025-12-02 10:12:27.58288422 +0000 UTC m=+0.217960560 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:12:27 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.650 263406 INFO neutron.agent.dhcp.agent [None req-6c4a45a3-c7bb-422e-beae-d8715cc80aed - - - - - -] DHCP configuration for ports {'c435f09f-f557-4267-b1b5-e62d98b6c4b5'} is completed
Dec 02 10:12:27 np0005541913.localdomain kernel: device tap5cd3a631-9d left promiscuous mode
Dec 02 10:12:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:27.665 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:27 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:27.687 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e196 e196: 6 total, 6 up, 6 in
Dec 02 10:12:27 np0005541913.localdomain ceph-mon[298296]: osdmap e195: 6 total, 6 up, 6 in
Dec 02 10:12:27 np0005541913.localdomain ceph-mon[298296]: pgmap v477: 177 pgs: 177 active+clean; 198 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s wr, 4 op/s
Dec 02 10:12:27 np0005541913.localdomain dnsmasq[330045]: read /var/lib/neutron/dhcp/64648a3d-835b-4d29-8cd8-3f34abaacc37/addn_hosts - 0 addresses
Dec 02 10:12:27 np0005541913.localdomain dnsmasq-dhcp[330045]: read /var/lib/neutron/dhcp/64648a3d-835b-4d29-8cd8-3f34abaacc37/host
Dec 02 10:12:27 np0005541913.localdomain podman[330068]: 2025-12-02 10:12:27.879962977 +0000 UTC m=+0.058757007 container kill e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:12:27 np0005541913.localdomain dnsmasq-dhcp[330045]: read /var/lib/neutron/dhcp/64648a3d-835b-4d29-8cd8-3f34abaacc37/opts
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent [None req-1a475abf-8b54-4c02-9ae1-9d6f9e605242 - - - - - -] Unable to reload_allocations dhcp for 64648a3d-835b-4d29-8cd8-3f34abaacc37.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap5cd3a631-9d not found in namespace qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37.
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap5cd3a631-9d not found in namespace qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37.
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.910 263406 ERROR neutron.agent.dhcp.agent 
Dec 02 10:12:27 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:27.917 263406 INFO neutron.agent.dhcp.agent [None req-ac6c4a74-9018-4989-9341-cd6278919b9d - - - - - -] Synchronizing state
Dec 02 10:12:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:28.116 263406 INFO neutron.agent.dhcp.agent [None req-171d14fd-52cf-40cb-9315-4edb7100f18b - - - - - -] All active networks have been fetched through RPC.
Dec 02 10:12:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:28.117 263406 INFO neutron.agent.dhcp.agent [-] Starting network 64648a3d-835b-4d29-8cd8-3f34abaacc37 dhcp configuration
Dec 02 10:12:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:28.118 263406 INFO neutron.agent.dhcp.agent [-] Finished network 64648a3d-835b-4d29-8cd8-3f34abaacc37 dhcp configuration
Dec 02 10:12:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:28.118 263406 INFO neutron.agent.dhcp.agent [-] Starting network 7499f69a-21e4-43dd-8d90-9037f211beae dhcp configuration
Dec 02 10:12:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:28.119 263406 INFO neutron.agent.dhcp.agent [-] Finished network 7499f69a-21e4-43dd-8d90-9037f211beae dhcp configuration
Dec 02 10:12:28 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:28.119 263406 INFO neutron.agent.dhcp.agent [None req-171d14fd-52cf-40cb-9315-4edb7100f18b - - - - - -] Synchronizing state complete
Dec 02 10:12:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:28.183 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:28 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:28Z|00546|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:12:28 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:28.265 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:28 np0005541913.localdomain dnsmasq[330045]: exiting on receipt of SIGTERM
Dec 02 10:12:28 np0005541913.localdomain podman[330098]: 2025-12-02 10:12:28.40433722 +0000 UTC m=+0.065186018 container kill e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:12:28 np0005541913.localdomain systemd[1]: libpod-e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2.scope: Deactivated successfully.
Dec 02 10:12:28 np0005541913.localdomain podman[330112]: 2025-12-02 10:12:28.486304035 +0000 UTC m=+0.068443665 container died e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:12:28 np0005541913.localdomain podman[330112]: 2025-12-02 10:12:28.524510113 +0000 UTC m=+0.106649703 container cleanup e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:12:28 np0005541913.localdomain systemd[1]: libpod-conmon-e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2.scope: Deactivated successfully.
Dec 02 10:12:28 np0005541913.localdomain podman[330114]: 2025-12-02 10:12:28.562110136 +0000 UTC m=+0.134775764 container remove e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64648a3d-835b-4d29-8cd8-3f34abaacc37, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:12:29 np0005541913.localdomain ceph-mon[298296]: osdmap e196: 6 total, 6 up, 6 in
Dec 02 10:12:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:29 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e197 e197: 6 total, 6 up, 6 in
Dec 02 10:12:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:29.313 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5e55b32f7917d1bfc37ba3bfadf74cf4ebf2c68e08e8a4389ecaa7281d36c25b-merged.mount: Deactivated successfully.
Dec 02 10:12:29 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e431b3e08409da0d2dafa19fffaa17db5444b42a11cdc11303acfe59acd8c0e2-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:29 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d64648a3d\x2d835b\x2d4d29\x2d8cd8\x2d3f34abaacc37.mount: Deactivated successfully.
Dec 02 10:12:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b9e19d1e-178b-4a98-88b5-d79880cd9496", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:12:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b9e19d1e-178b-4a98-88b5-d79880cd9496", "format": "json"}]: dispatch
Dec 02 10:12:30 np0005541913.localdomain ceph-mon[298296]: osdmap e197: 6 total, 6 up, 6 in
Dec 02 10:12:30 np0005541913.localdomain ceph-mon[298296]: pgmap v480: 177 pgs: 177 active+clean; 198 MiB data, 985 MiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 107 KiB/s wr, 152 op/s
Dec 02 10:12:30 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e198 e198: 6 total, 6 up, 6 in
Dec 02 10:12:30 np0005541913.localdomain sudo[330141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:12:30 np0005541913.localdomain sudo[330141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:12:30 np0005541913.localdomain sudo[330141]: pam_unix(sudo:session): session closed for user root
Dec 02 10:12:30 np0005541913.localdomain sudo[330159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:12:30 np0005541913.localdomain sudo[330159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:12:31 np0005541913.localdomain ceph-mon[298296]: osdmap e198: 6 total, 6 up, 6 in
Dec 02 10:12:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:31.191 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e199 e199: 6 total, 6 up, 6 in
Dec 02 10:12:31 np0005541913.localdomain sudo[330159]: pam_unix(sudo:session): session closed for user root
Dec 02 10:12:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:12:31 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3972691788' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:12:31 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3972691788' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:31 np0005541913.localdomain sudo[330207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:12:31 np0005541913.localdomain sudo[330207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:12:31 np0005541913.localdomain sudo[330207]: pam_unix(sudo:session): session closed for user root
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: pgmap v482: 177 pgs: 177 active+clean; 198 MiB data, 985 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 98 KiB/s wr, 140 op/s
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b9e19d1e-178b-4a98-88b5-d79880cd9496", "auth_id": "bob", "tenant_id": "a241a07e4161486091e8de3f95a1d6c6", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: osdmap e199: 6 total, 6 up, 6 in
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180,allow rw path=/volumes/_nogroup/b9e19d1e-178b-4a98-88b5-d79880cd9496/13c658d8-8e0f-421c-9526-6f9449a5852e", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72,allow rw pool=manila_data namespace=fsvolumens_b9e19d1e-178b-4a98-88b5-d79880cd9496"]} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180,allow rw path=/volumes/_nogroup/b9e19d1e-178b-4a98-88b5-d79880cd9496/13c658d8-8e0f-421c-9526-6f9449a5852e", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72,allow rw pool=manila_data namespace=fsvolumens_b9e19d1e-178b-4a98-88b5-d79880cd9496"]} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180,allow rw path=/volumes/_nogroup/b9e19d1e-178b-4a98-88b5-d79880cd9496/13c658d8-8e0f-421c-9526-6f9449a5852e", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72,allow rw pool=manila_data namespace=fsvolumens_b9e19d1e-178b-4a98-88b5-d79880cd9496"]}]': finished
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3972691788' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3972691788' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:12:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e200 e200: 6 total, 6 up, 6 in
Dec 02 10:12:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:12:33 np0005541913.localdomain ceph-mon[298296]: osdmap e200: 6 total, 6 up, 6 in
Dec 02 10:12:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:12:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:12:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:12:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:12:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:12:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:12:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:12:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:12:34 np0005541913.localdomain ceph-mon[298296]: pgmap v485: 177 pgs: 177 active+clean; 198 MiB data, 986 MiB used, 41 GiB / 42 GiB avail; 109 KiB/s rd, 69 KiB/s wr, 157 op/s
Dec 02 10:12:34 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4056498242' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:34 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4056498242' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 02 10:12:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72"]} : dispatch
Dec 02 10:12:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72"]} : dispatch
Dec 02 10:12:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/4951f94c-f3a4-4170-9869-8238a9dc7b72/9297652e-e843-4300-a77e-137058f03180", "osd", "allow rw pool=manila_data namespace=fsvolumens_4951f94c-f3a4-4170-9869-8238a9dc7b72"]}]': finished
Dec 02 10:12:35 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:35.025 2 INFO neutron.agent.securitygroups_rpc [None req-c29e5bff-b968-4a83-beb0-2b46e231db68 27e8ee5045c2430583000f8d62f6e4f1 096ffa0a51b143039159efc232ec547a - - default default] Security group member updated ['0a7d83ca-acbf-4932-884e-9eff3b0bc0ff']
Dec 02 10:12:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:35.073 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:35.101 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:35.103 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:12:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:35.134 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:35.168 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:12:35 np0005541913.localdomain podman[330225]: 2025-12-02 10:12:35.466098259 +0000 UTC m=+0.097084949 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 10:12:35 np0005541913.localdomain podman[330225]: 2025-12-02 10:12:35.509048883 +0000 UTC m=+0.140035523 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:12:35 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:12:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:35.660 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b9e19d1e-178b-4a98-88b5-d79880cd9496", "auth_id": "bob", "format": "json"}]: dispatch
Dec 02 10:12:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b9e19d1e-178b-4a98-88b5-d79880cd9496", "auth_id": "bob", "format": "json"}]: dispatch
Dec 02 10:12:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:35.863 263406 INFO neutron.agent.linux.ip_lib [None req-816e9919-ed63-4148-b713-8cde65fae396 - - - - - -] Device tapc46392fc-2a cannot be used as it has no MAC address
Dec 02 10:12:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:35.886 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:35 np0005541913.localdomain kernel: device tapc46392fc-2a entered promiscuous mode
Dec 02 10:12:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:35.896 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:35 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670355.8982] manager: (tapc46392fc-2a): new Generic device (/org/freedesktop/NetworkManager/Devices/86)
Dec 02 10:12:35 np0005541913.localdomain systemd-udevd[330254]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:35Z|00547|binding|INFO|Claiming lport c46392fc-2aa4-4d80-b8dd-cf511eace16e for this chassis.
Dec 02 10:12:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:35Z|00548|binding|INFO|c46392fc-2aa4-4d80-b8dd-cf511eace16e: Claiming unknown
Dec 02 10:12:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:35.915 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-474eb989-d757-4df7-9a0f-19d414dbaf64', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-474eb989-d757-4df7-9a0f-19d414dbaf64', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8eea084241c14c5d9a6cc0d912041a21', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8265533d-51c3-4865-8bdc-d09b3aea005a, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c46392fc-2aa4-4d80-b8dd-cf511eace16e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:35.917 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c46392fc-2aa4-4d80-b8dd-cf511eace16e in datapath 474eb989-d757-4df7-9a0f-19d414dbaf64 bound to our chassis
Dec 02 10:12:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:35.918 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 474eb989-d757-4df7-9a0f-19d414dbaf64 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:35.919 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[0d3e9e84-b724-411a-9c71-9eaafed9c742]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc46392fc-2a: No such device
Dec 02 10:12:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc46392fc-2a: No such device
Dec 02 10:12:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:35.937 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:35.941 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc46392fc-2a: No such device
Dec 02 10:12:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc46392fc-2a: No such device
Dec 02 10:12:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc46392fc-2a: No such device
Dec 02 10:12:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc46392fc-2a: No such device
Dec 02 10:12:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:35Z|00549|binding|INFO|Setting lport c46392fc-2aa4-4d80-b8dd-cf511eace16e ovn-installed in OVS
Dec 02 10:12:35 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:35Z|00550|binding|INFO|Setting lport c46392fc-2aa4-4d80-b8dd-cf511eace16e up in Southbound
Dec 02 10:12:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc46392fc-2a: No such device
Dec 02 10:12:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:35.963 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:35 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapc46392fc-2a: No such device
Dec 02 10:12:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:35.980 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:36.006 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:12:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:12:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:12:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156107 "" "Go-http-client/1.1"
Dec 02 10:12:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:12:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19258 "" "Go-http-client/1.1"
Dec 02 10:12:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:36.221 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:36.287 2 INFO neutron.agent.securitygroups_rpc [None req-11289385-b413-4a50-89a7-e0a67d214908 27e8ee5045c2430583000f8d62f6e4f1 096ffa0a51b143039159efc232ec547a - - default default] Security group member updated ['0a7d83ca-acbf-4932-884e-9eff3b0bc0ff']
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e201 e201: 6 total, 6 up, 6 in
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.646329) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670356646362, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2856, "num_deletes": 268, "total_data_size": 4652381, "memory_usage": 4791136, "flush_reason": "Manual Compaction"}
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670356663736, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3040370, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28409, "largest_seqno": 31259, "table_properties": {"data_size": 3028743, "index_size": 7492, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27700, "raw_average_key_size": 22, "raw_value_size": 3004285, "raw_average_value_size": 2450, "num_data_blocks": 314, "num_entries": 1226, "num_filter_entries": 1226, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670242, "oldest_key_time": 1764670242, "file_creation_time": 1764670356, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 17441 microseconds, and 4089 cpu microseconds.
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.663768) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3040370 bytes OK
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.663785) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.666082) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.666097) EVENT_LOG_v1 {"time_micros": 1764670356666093, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.666112) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 4638877, prev total WAL file size 4638877, number of live WAL files 2.
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.666838) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2969KB)], [48(16MB)]
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670356666905, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 20006033, "oldest_snapshot_seqno": -1}
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 13619 keys, 18489822 bytes, temperature: kUnknown
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670356773672, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 18489822, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18410882, "index_size": 43831, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34053, "raw_key_size": 363795, "raw_average_key_size": 26, "raw_value_size": 18177986, "raw_average_value_size": 1334, "num_data_blocks": 1659, "num_entries": 13619, "num_filter_entries": 13619, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670356, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.773995) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 18489822 bytes
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.775994) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.2 rd, 173.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 16.2 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(12.7) write-amplify(6.1) OK, records in: 14176, records dropped: 557 output_compression: NoCompression
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.776021) EVENT_LOG_v1 {"time_micros": 1764670356776009, "job": 28, "event": "compaction_finished", "compaction_time_micros": 106870, "compaction_time_cpu_micros": 50985, "output_level": 6, "num_output_files": 1, "total_output_size": 18489822, "num_input_records": 14176, "num_output_records": 13619, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670356776661, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670356779081, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.666720) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.779193) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.779198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.779199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.779201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:12:36.779202) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: pgmap v486: 177 pgs: 177 active+clean; 198 MiB data, 986 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 47 KiB/s wr, 107 op/s
Dec 02 10:12:36 np0005541913.localdomain ceph-mon[298296]: osdmap e201: 6 total, 6 up, 6 in
Dec 02 10:12:36 np0005541913.localdomain podman[330325]: 
Dec 02 10:12:36 np0005541913.localdomain podman[330325]: 2025-12-02 10:12:36.861342931 +0000 UTC m=+0.076176741 container create a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:12:36 np0005541913.localdomain systemd[1]: Started libpod-conmon-a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8.scope.
Dec 02 10:12:36 np0005541913.localdomain podman[330325]: 2025-12-02 10:12:36.819528317 +0000 UTC m=+0.034362177 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:36 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:36 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc0ff32bb3b0c7e43de93978e58a81d8da0d84d48e152a3ded188d7eca22b1fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:36 np0005541913.localdomain podman[330325]: 2025-12-02 10:12:36.934061849 +0000 UTC m=+0.148895659 container init a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:12:36 np0005541913.localdomain podman[330325]: 2025-12-02 10:12:36.945575426 +0000 UTC m=+0.160409246 container start a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:36 np0005541913.localdomain dnsmasq[330343]: started, version 2.85 cachesize 150
Dec 02 10:12:36 np0005541913.localdomain dnsmasq[330343]: DNS service limited to local subnets
Dec 02 10:12:36 np0005541913.localdomain dnsmasq[330343]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:36 np0005541913.localdomain dnsmasq[330343]: warning: no upstream servers configured
Dec 02 10:12:36 np0005541913.localdomain dnsmasq-dhcp[330343]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Dec 02 10:12:36 np0005541913.localdomain dnsmasq[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/addn_hosts - 0 addresses
Dec 02 10:12:36 np0005541913.localdomain dnsmasq-dhcp[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/host
Dec 02 10:12:36 np0005541913.localdomain dnsmasq-dhcp[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/opts
Dec 02 10:12:36 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:36.987 263406 INFO neutron.agent.dhcp.agent [None req-816e9919-ed63-4148-b713-8cde65fae396 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:35Z, description=, device_id=9ad261e0-bab0-4724-94e5-b35ab4156358, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087ffd30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087ffb50>], id=f4a13a37-be3e-4104-8212-c7cc53124943, ip_allocation=immediate, mac_address=fa:16:3e:19:af:81, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:33Z, description=, dns_domain=, id=474eb989-d757-4df7-9a0f-19d414dbaf64, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1375377552, port_security_enabled=True, project_id=8eea084241c14c5d9a6cc0d912041a21, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31758, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2890, status=ACTIVE, subnets=['436074ec-6e3f-4c51-9fe0-3048ad5d2eb8'], tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:35Z, vlan_transparent=None, network_id=474eb989-d757-4df7-9a0f-19d414dbaf64, port_security_enabled=False, project_id=8eea084241c14c5d9a6cc0d912041a21, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2900, status=DOWN, tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:35Z on network 474eb989-d757-4df7-9a0f-19d414dbaf64
Dec 02 10:12:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:37.070 263406 INFO neutron.agent.dhcp.agent [None req-894e3777-1d00-47d5-8d29-055623b7ca5d - - - - - -] DHCP configuration for ports {'f1a94dfd-bb5d-4d7c-81e2-aafa79be9a4c'} is completed
Dec 02 10:12:37 np0005541913.localdomain podman[330361]: 2025-12-02 10:12:37.184055481 +0000 UTC m=+0.057831492 container kill a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:12:37 np0005541913.localdomain dnsmasq[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/addn_hosts - 1 addresses
Dec 02 10:12:37 np0005541913.localdomain dnsmasq-dhcp[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/host
Dec 02 10:12:37 np0005541913.localdomain dnsmasq-dhcp[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/opts
Dec 02 10:12:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:37.410 263406 INFO neutron.agent.dhcp.agent [None req-06253fe5-d958-40ca-9b4a-09511d319b5b - - - - - -] DHCP configuration for ports {'f4a13a37-be3e-4104-8212-c7cc53124943'} is completed
Dec 02 10:12:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:37.480 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:35Z, description=, device_id=9ad261e0-bab0-4724-94e5-b35ab4156358, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d02e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d0c70>], id=f4a13a37-be3e-4104-8212-c7cc53124943, ip_allocation=immediate, mac_address=fa:16:3e:19:af:81, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:33Z, description=, dns_domain=, id=474eb989-d757-4df7-9a0f-19d414dbaf64, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1375377552, port_security_enabled=True, project_id=8eea084241c14c5d9a6cc0d912041a21, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31758, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2890, status=ACTIVE, subnets=['436074ec-6e3f-4c51-9fe0-3048ad5d2eb8'], tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:35Z, vlan_transparent=None, network_id=474eb989-d757-4df7-9a0f-19d414dbaf64, port_security_enabled=False, project_id=8eea084241c14c5d9a6cc0d912041a21, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2900, status=DOWN, tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:35Z on network 474eb989-d757-4df7-9a0f-19d414dbaf64
Dec 02 10:12:37 np0005541913.localdomain podman[330396]: 2025-12-02 10:12:37.673865404 +0000 UTC m=+0.052502930 container kill a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:12:37 np0005541913.localdomain dnsmasq[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/addn_hosts - 1 addresses
Dec 02 10:12:37 np0005541913.localdomain dnsmasq-dhcp[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/host
Dec 02 10:12:37 np0005541913.localdomain dnsmasq-dhcp[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/opts
Dec 02 10:12:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e202 e202: 6 total, 6 up, 6 in
Dec 02 10:12:37 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:37.844 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:37 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:37.959 263406 INFO neutron.agent.dhcp.agent [None req-a03b9791-ca49-4319-9930-7b63218c4c5e - - - - - -] DHCP configuration for ports {'f4a13a37-be3e-4104-8212-c7cc53124943'} is completed
Dec 02 10:12:37 np0005541913.localdomain systemd[1]: tmp-crun.eCzohM.mount: Deactivated successfully.
Dec 02 10:12:37 np0005541913.localdomain podman[330435]: 2025-12-02 10:12:37.982311024 +0000 UTC m=+0.071074995 container kill e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a6d986e-0caf-4eff-b1d3-a10e7add5365, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:12:37 np0005541913.localdomain dnsmasq[329480]: exiting on receipt of SIGTERM
Dec 02 10:12:37 np0005541913.localdomain systemd[1]: libpod-e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02.scope: Deactivated successfully.
Dec 02 10:12:38 np0005541913.localdomain podman[330448]: 2025-12-02 10:12:38.036048575 +0000 UTC m=+0.042741329 container died e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a6d986e-0caf-4eff-b1d3-a10e7add5365, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:12:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:38.104 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:12:38 np0005541913.localdomain podman[330448]: 2025-12-02 10:12:38.1213634 +0000 UTC m=+0.128056124 container cleanup e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a6d986e-0caf-4eff-b1d3-a10e7add5365, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:12:38 np0005541913.localdomain systemd[1]: libpod-conmon-e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02.scope: Deactivated successfully.
Dec 02 10:12:38 np0005541913.localdomain podman[330455]: 2025-12-02 10:12:38.148283607 +0000 UTC m=+0.143700011 container remove e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a6d986e-0caf-4eff-b1d3-a10e7add5365, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:12:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:38Z|00551|binding|INFO|Releasing lport 8b3a7663-ad72-4099-a4de-0fa85d29cfd8 from this chassis (sb_readonly=0)
Dec 02 10:12:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:38.161 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:38 np0005541913.localdomain kernel: device tap8b3a7663-ad left promiscuous mode
Dec 02 10:12:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:38Z|00552|binding|INFO|Setting lport 8b3a7663-ad72-4099-a4de-0fa85d29cfd8 down in Southbound
Dec 02 10:12:38 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:38.189 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:38.626 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-9a6d986e-0caf-4eff-b1d3-a10e7add5365', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a6d986e-0caf-4eff-b1d3-a10e7add5365', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '096ffa0a51b143039159efc232ec547a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fea98301-f19b-4654-8756-4655244bd809, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=8b3a7663-ad72-4099-a4de-0fa85d29cfd8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:38.628 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 8b3a7663-ad72-4099-a4de-0fa85d29cfd8 in datapath 9a6d986e-0caf-4eff-b1d3-a10e7add5365 unbound from our chassis
Dec 02 10:12:38 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:38.628 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:38.629 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9a6d986e-0caf-4eff-b1d3-a10e7add5365 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:38 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:38.630 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[26123019-b16c-45ab-9db3-25fd2c010d9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: pgmap v488: 177 pgs: 177 active+clean; 198 MiB data, 986 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 47 KiB/s wr, 107 op/s
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "format": "json"}]: dispatch
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: osdmap e202: 6 total, 6 up, 6 in
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "bob", "format": "json"}]: dispatch
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Dec 02 10:12:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "auth_id": "bob", "format": "json"}]: dispatch
Dec 02 10:12:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6b61af61b108caaf0c1a1d229e8160b6e6a2c2166342cae9dc858179db74819f-merged.mount: Deactivated successfully.
Dec 02 10:12:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7aa22a26d4884b96d161141fb097bff586446e4178b03bb247630105f691f02-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:38 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d9a6d986e\x2d0caf\x2d4eff\x2db1d3\x2da10e7add5365.mount: Deactivated successfully.
Dec 02 10:12:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:39.065 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:39 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:39Z|00553|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:12:39 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:39.346 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:39 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/681489516' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:12:40 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:12:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:40.415 263406 INFO neutron.agent.linux.ip_lib [None req-7475a2be-70ae-4108-8633-1396fe3aec0a - - - - - -] Device tap910e474d-33 cannot be used as it has no MAC address
Dec 02 10:12:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:40.488 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:40 np0005541913.localdomain kernel: device tap910e474d-33 entered promiscuous mode
Dec 02 10:12:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:40Z|00554|binding|INFO|Claiming lport 910e474d-3372-448b-96c4-1b31ecbfdabc for this chassis.
Dec 02 10:12:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:40Z|00555|binding|INFO|910e474d-3372-448b-96c4-1b31ecbfdabc: Claiming unknown
Dec 02 10:12:40 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670360.4956] manager: (tap910e474d-33): new Generic device (/org/freedesktop/NetworkManager/Devices/87)
Dec 02 10:12:40 np0005541913.localdomain systemd-udevd[330518]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:40.495 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:40 np0005541913.localdomain podman[330479]: 2025-12-02 10:12:40.501066036 +0000 UTC m=+0.169198270 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 02 10:12:40 np0005541913.localdomain podman[330478]: 2025-12-02 10:12:40.501812816 +0000 UTC m=+0.176308039 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:12:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:40.509 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-ead683fd-472e-432f-9d41-70d9f0a3ce59', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ead683fd-472e-432f-9d41-70d9f0a3ce59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8eea084241c14c5d9a6cc0d912041a21', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bda8d64-5938-4ad9-938a-8b5e4fa77265, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=910e474d-3372-448b-96c4-1b31ecbfdabc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:40.510 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 910e474d-3372-448b-96c4-1b31ecbfdabc in datapath ead683fd-472e-432f-9d41-70d9f0a3ce59 bound to our chassis
Dec 02 10:12:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:40.511 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ead683fd-472e-432f-9d41-70d9f0a3ce59 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:40.512 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb0f55d-5395-4dfa-ba4c-df20754d823a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:40Z|00556|binding|INFO|Setting lport 910e474d-3372-448b-96c4-1b31ecbfdabc ovn-installed in OVS
Dec 02 10:12:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:40Z|00557|binding|INFO|Setting lport 910e474d-3372-448b-96c4-1b31ecbfdabc up in Southbound
Dec 02 10:12:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:40.514 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap910e474d-33: No such device
Dec 02 10:12:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap910e474d-33: No such device
Dec 02 10:12:40 np0005541913.localdomain podman[330478]: 2025-12-02 10:12:40.534364873 +0000 UTC m=+0.208860126 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:12:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:40.534 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap910e474d-33: No such device
Dec 02 10:12:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap910e474d-33: No such device
Dec 02 10:12:40 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:12:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap910e474d-33: No such device
Dec 02 10:12:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap910e474d-33: No such device
Dec 02 10:12:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap910e474d-33: No such device
Dec 02 10:12:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap910e474d-33: No such device
Dec 02 10:12:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:40.576 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:40 np0005541913.localdomain podman[330479]: 2025-12-02 10:12:40.600444595 +0000 UTC m=+0.268576829 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 10:12:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:40.599 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:40 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:12:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:40.727 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:40 np0005541913.localdomain ceph-mon[298296]: pgmap v490: 177 pgs: 177 active+clean; 199 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 92 KiB/s wr, 144 op/s
Dec 02 10:12:40 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:41.224 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:41 np0005541913.localdomain podman[330604]: 
Dec 02 10:12:41 np0005541913.localdomain podman[330604]: 2025-12-02 10:12:41.448047652 +0000 UTC m=+0.085377486 container create 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8.scope.
Dec 02 10:12:41 np0005541913.localdomain podman[330604]: 2025-12-02 10:12:41.400595027 +0000 UTC m=+0.037924841 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e4e3f8bacff50ad7014797f289aea2e0fcd601ff6a98e2bf0ffd59761e472e2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:41 np0005541913.localdomain podman[330604]: 2025-12-02 10:12:41.519386453 +0000 UTC m=+0.156716257 container init 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:12:41 np0005541913.localdomain podman[330604]: 2025-12-02 10:12:41.526193074 +0000 UTC m=+0.163522878 container start 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:41 np0005541913.localdomain dnsmasq[330622]: started, version 2.85 cachesize 150
Dec 02 10:12:41 np0005541913.localdomain dnsmasq[330622]: DNS service limited to local subnets
Dec 02 10:12:41 np0005541913.localdomain dnsmasq[330622]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:41 np0005541913.localdomain dnsmasq[330622]: warning: no upstream servers configured
Dec 02 10:12:41 np0005541913.localdomain dnsmasq-dhcp[330622]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d
Dec 02 10:12:41 np0005541913.localdomain dnsmasq[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/addn_hosts - 0 addresses
Dec 02 10:12:41 np0005541913.localdomain dnsmasq-dhcp[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/host
Dec 02 10:12:41 np0005541913.localdomain dnsmasq-dhcp[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/opts
Dec 02 10:12:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:41.594 263406 INFO neutron.agent.dhcp.agent [None req-7475a2be-70ae-4108-8633-1396fe3aec0a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:39Z, description=, device_id=9ad261e0-bab0-4724-94e5-b35ab4156358, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990877b4f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990877b4c0>], id=2d6953c4-8711-483a-9c59-9af2bc2e2b9f, ip_allocation=immediate, mac_address=fa:16:3e:9a:a2:51, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:37Z, description=, dns_domain=, id=ead683fd-472e-432f-9d41-70d9f0a3ce59, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1205334506, port_security_enabled=True, project_id=8eea084241c14c5d9a6cc0d912041a21, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10483, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2925, status=ACTIVE, subnets=['6694b2b3-a432-4b76-871c-299d16c7d158'], tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:39Z, vlan_transparent=None, network_id=ead683fd-472e-432f-9d41-70d9f0a3ce59, port_security_enabled=False, project_id=8eea084241c14c5d9a6cc0d912041a21, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2934, status=DOWN, tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:39Z on network ead683fd-472e-432f-9d41-70d9f0a3ce59
Dec 02 10:12:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:41.728 263406 INFO neutron.agent.dhcp.agent [None req-13ca9244-6b7b-4418-9791-9a2be7ffcb4c - - - - - -] DHCP configuration for ports {'7a9c7117-9f23-4d6a-bb0e-cfade268b4d0'} is completed
Dec 02 10:12:41 np0005541913.localdomain dnsmasq[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/addn_hosts - 1 addresses
Dec 02 10:12:41 np0005541913.localdomain podman[330641]: 2025-12-02 10:12:41.774278336 +0000 UTC m=+0.055209122 container kill 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:12:41 np0005541913.localdomain dnsmasq-dhcp[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/host
Dec 02 10:12:41 np0005541913.localdomain dnsmasq-dhcp[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/opts
Dec 02 10:12:41 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e72625fe-e204-4902-a792-e35cd0c49318", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:12:41 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e72625fe-e204-4902-a792-e35cd0c49318", "format": "json"}]: dispatch
Dec 02 10:12:41 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:41.928 263406 INFO neutron.agent.dhcp.agent [None req-7475a2be-70ae-4108-8633-1396fe3aec0a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:12:39Z, description=, device_id=9ad261e0-bab0-4724-94e5-b35ab4156358, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990928e6a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990928e340>], id=2d6953c4-8711-483a-9c59-9af2bc2e2b9f, ip_allocation=immediate, mac_address=fa:16:3e:9a:a2:51, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:37Z, description=, dns_domain=, id=ead683fd-472e-432f-9d41-70d9f0a3ce59, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1205334506, port_security_enabled=True, project_id=8eea084241c14c5d9a6cc0d912041a21, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10483, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2925, status=ACTIVE, subnets=['6694b2b3-a432-4b76-871c-299d16c7d158'], tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:39Z, vlan_transparent=None, network_id=ead683fd-472e-432f-9d41-70d9f0a3ce59, port_security_enabled=False, project_id=8eea084241c14c5d9a6cc0d912041a21, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2934, status=DOWN, tags=[], tenant_id=8eea084241c14c5d9a6cc0d912041a21, updated_at=2025-12-02T10:12:39Z on network ead683fd-472e-432f-9d41-70d9f0a3ce59
Dec 02 10:12:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:42.064 263406 INFO neutron.agent.dhcp.agent [None req-d31ab290-2f9f-410d-92b1-1f2a8570896a - - - - - -] DHCP configuration for ports {'2d6953c4-8711-483a-9c59-9af2bc2e2b9f'} is completed
Dec 02 10:12:42 np0005541913.localdomain podman[330681]: 2025-12-02 10:12:42.121979912 +0000 UTC m=+0.040930502 container kill 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:12:42 np0005541913.localdomain dnsmasq[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/addn_hosts - 1 addresses
Dec 02 10:12:42 np0005541913.localdomain dnsmasq-dhcp[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/host
Dec 02 10:12:42 np0005541913.localdomain dnsmasq-dhcp[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/opts
Dec 02 10:12:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:42.318 263406 INFO neutron.agent.dhcp.agent [None req-6642ab26-c777-461d-942f-a4c0b36f6652 - - - - - -] DHCP configuration for ports {'2d6953c4-8711-483a-9c59-9af2bc2e2b9f'} is completed
Dec 02 10:12:43 np0005541913.localdomain ceph-mon[298296]: pgmap v491: 177 pgs: 177 active+clean; 199 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 39 KiB/s wr, 36 op/s
Dec 02 10:12:43 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9e19d1e-178b-4a98-88b5-d79880cd9496", "format": "json"}]: dispatch
Dec 02 10:12:43 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b9e19d1e-178b-4a98-88b5-d79880cd9496", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:43 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:12:43 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "format": "json"}]: dispatch
Dec 02 10:12:44 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:44.506 2 INFO neutron.agent.securitygroups_rpc [None req-4efd9dba-690c-4981-8a45-b91486e5b5eb 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:12:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:44.780 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:45.058 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:45 np0005541913.localdomain ceph-mon[298296]: pgmap v492: 177 pgs: 177 active+clean; 443 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 88 KiB/s rd, 31 MiB/s wr, 143 op/s
Dec 02 10:12:45 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "e72625fe-e204-4902-a792-e35cd0c49318", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 02 10:12:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "format": "json"}]: dispatch
Dec 02 10:12:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4951f94c-f3a4-4170-9869-8238a9dc7b72", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "snap_name": "f755cd55-747e-4403-a245-43cbf9abc4bb", "format": "json"}]: dispatch
Dec 02 10:12:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:46.227 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:47 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:47.113 2 INFO neutron.agent.securitygroups_rpc [None req-6b684e17-61a9-4f64-b7d6-8863ef06b405 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:12:47 np0005541913.localdomain ceph-mon[298296]: pgmap v493: 177 pgs: 177 active+clean; 443 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 82 KiB/s rd, 29 MiB/s wr, 133 op/s
Dec 02 10:12:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e72625fe-e204-4902-a792-e35cd0c49318", "format": "json"}]: dispatch
Dec 02 10:12:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e72625fe-e204-4902-a792-e35cd0c49318", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:47 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:47.323 2 INFO neutron.agent.securitygroups_rpc [None req-6b684e17-61a9-4f64-b7d6-8863ef06b405 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:12:47 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:47.809 2 INFO neutron.agent.securitygroups_rpc [None req-2a241596-c202-4fbe-a6b6-7dd1095be821 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:12:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:47.825 263406 INFO neutron.agent.linux.ip_lib [None req-0989cd2c-e333-4b90-849e-00da2af07b86 - - - - - -] Device tap63f924a2-a2 cannot be used as it has no MAC address
Dec 02 10:12:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:47.847 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:47 np0005541913.localdomain kernel: device tap63f924a2-a2 entered promiscuous mode
Dec 02 10:12:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:47Z|00558|binding|INFO|Claiming lport 63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a for this chassis.
Dec 02 10:12:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:47Z|00559|binding|INFO|63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a: Claiming unknown
Dec 02 10:12:47 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670367.8553] manager: (tap63f924a2-a2): new Generic device (/org/freedesktop/NetworkManager/Devices/88)
Dec 02 10:12:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:47.854 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:47 np0005541913.localdomain systemd-udevd[330711]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:47.868 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/16', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-06e734ec-67aa-4893-acc9-29e384e3b54b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e734ec-67aa-4893-acc9-29e384e3b54b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b02ac233ae12415688cf9d451b55b171', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d614af94-5391-4170-b5ce-0f6ef9d77e23, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:47.870 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a in datapath 06e734ec-67aa-4893-acc9-29e384e3b54b bound to our chassis
Dec 02 10:12:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:47.872 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 06e734ec-67aa-4893-acc9-29e384e3b54b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:47 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:47.873 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[b6bb848b-9acd-4128-b361-97e32e78a1f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap63f924a2-a2: No such device
Dec 02 10:12:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:47Z|00560|binding|INFO|Setting lport 63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a ovn-installed in OVS
Dec 02 10:12:47 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:47Z|00561|binding|INFO|Setting lport 63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a up in Southbound
Dec 02 10:12:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:47.888 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap63f924a2-a2: No such device
Dec 02 10:12:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap63f924a2-a2: No such device
Dec 02 10:12:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap63f924a2-a2: No such device
Dec 02 10:12:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap63f924a2-a2: No such device
Dec 02 10:12:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap63f924a2-a2: No such device
Dec 02 10:12:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap63f924a2-a2: No such device
Dec 02 10:12:47 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap63f924a2-a2: No such device
Dec 02 10:12:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:47.929 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:47.965 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:48 np0005541913.localdomain ceph-mon[298296]: pgmap v494: 177 pgs: 177 active+clean; 443 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 70 KiB/s rd, 24 MiB/s wr, 114 op/s
Dec 02 10:12:48 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:48.419 2 INFO neutron.agent.securitygroups_rpc [None req-2a4c2d24-b746-4ff5-abcb-d93c9952342d 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:12:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:48.702 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:48 np0005541913.localdomain podman[330782]: 
Dec 02 10:12:48 np0005541913.localdomain podman[330782]: 2025-12-02 10:12:48.878757513 +0000 UTC m=+0.141188413 container create 013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-06e734ec-67aa-4893-acc9-29e384e3b54b, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:12:48 np0005541913.localdomain systemd[1]: Started libpod-conmon-013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208.scope.
Dec 02 10:12:48 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:48 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ce5d90708afeefe16b42fc574fe8ffc15e78fbc6fadf28b8ac10c5802cf2141/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:48 np0005541913.localdomain podman[330782]: 2025-12-02 10:12:48.84448601 +0000 UTC m=+0.106916910 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:48 np0005541913.localdomain podman[330782]: 2025-12-02 10:12:48.949507598 +0000 UTC m=+0.211938518 container init 013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-06e734ec-67aa-4893-acc9-29e384e3b54b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 02 10:12:48 np0005541913.localdomain podman[330782]: 2025-12-02 10:12:48.955665162 +0000 UTC m=+0.218096082 container start 013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-06e734ec-67aa-4893-acc9-29e384e3b54b, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:12:48 np0005541913.localdomain dnsmasq[330800]: started, version 2.85 cachesize 150
Dec 02 10:12:48 np0005541913.localdomain dnsmasq[330800]: DNS service limited to local subnets
Dec 02 10:12:48 np0005541913.localdomain dnsmasq[330800]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:48 np0005541913.localdomain dnsmasq[330800]: warning: no upstream servers configured
Dec 02 10:12:48 np0005541913.localdomain dnsmasq-dhcp[330800]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:12:48 np0005541913.localdomain dnsmasq[330800]: read /var/lib/neutron/dhcp/06e734ec-67aa-4893-acc9-29e384e3b54b/addn_hosts - 0 addresses
Dec 02 10:12:48 np0005541913.localdomain dnsmasq-dhcp[330800]: read /var/lib/neutron/dhcp/06e734ec-67aa-4893-acc9-29e384e3b54b/host
Dec 02 10:12:48 np0005541913.localdomain dnsmasq-dhcp[330800]: read /var/lib/neutron/dhcp/06e734ec-67aa-4893-acc9-29e384e3b54b/opts
Dec 02 10:12:49 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:49.125 263406 INFO neutron.agent.dhcp.agent [None req-b0b5b4c3-4d1a-4f18-9dd6-6e947985d07b - - - - - -] DHCP configuration for ports {'8bed618a-c3b3-40e6-9137-108d11128420'} is completed
Dec 02 10:12:49 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "snap_name": "9dc12954-5289-4f2d-a2ed-7e457be2a4e6", "format": "json"}]: dispatch
Dec 02 10:12:49 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:49Z|00562|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:12:50 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:50.021 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:50 np0005541913.localdomain ceph-mon[298296]: pgmap v495: 177 pgs: 177 active+clean; 839 MiB data, 2.8 GiB used, 39 GiB / 42 GiB avail; 108 KiB/s rd, 56 MiB/s wr, 185 op/s
Dec 02 10:12:50 np0005541913.localdomain systemd[1]: tmp-crun.kIKoLH.mount: Deactivated successfully.
Dec 02 10:12:50 np0005541913.localdomain dnsmasq[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/addn_hosts - 0 addresses
Dec 02 10:12:50 np0005541913.localdomain dnsmasq-dhcp[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/host
Dec 02 10:12:50 np0005541913.localdomain dnsmasq-dhcp[330622]: read /var/lib/neutron/dhcp/ead683fd-472e-432f-9d41-70d9f0a3ce59/opts
Dec 02 10:12:50 np0005541913.localdomain podman[330818]: 2025-12-02 10:12:50.904818735 +0000 UTC m=+0.066906644 container kill 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:12:51 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:51Z|00563|binding|INFO|Releasing lport 910e474d-3372-448b-96c4-1b31ecbfdabc from this chassis (sb_readonly=0)
Dec 02 10:12:51 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:51Z|00564|binding|INFO|Setting lport 910e474d-3372-448b-96c4-1b31ecbfdabc down in Southbound
Dec 02 10:12:51 np0005541913.localdomain kernel: device tap910e474d-33 left promiscuous mode
Dec 02 10:12:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:51.174 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:51.187 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-ead683fd-472e-432f-9d41-70d9f0a3ce59', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ead683fd-472e-432f-9d41-70d9f0a3ce59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8eea084241c14c5d9a6cc0d912041a21', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bda8d64-5938-4ad9-938a-8b5e4fa77265, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=910e474d-3372-448b-96c4-1b31ecbfdabc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:51.189 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 910e474d-3372-448b-96c4-1b31ecbfdabc in datapath ead683fd-472e-432f-9d41-70d9f0a3ce59 unbound from our chassis
Dec 02 10:12:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:51.191 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ead683fd-472e-432f-9d41-70d9f0a3ce59 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:51.192 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[67d53b06-57bc-404b-9253-c481fe8e9e7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:51.197 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:51.229 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:51 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "977a2594-0007-4fab-a7e2-b6bc2dee3113", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:12:51 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "977a2594-0007-4fab-a7e2-b6bc2dee3113", "format": "json"}]: dispatch
Dec 02 10:12:51 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:12:51 np0005541913.localdomain podman[330860]: 2025-12-02 10:12:51.724274173 +0000 UTC m=+0.061705975 container kill 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:51 np0005541913.localdomain dnsmasq[330622]: exiting on receipt of SIGTERM
Dec 02 10:12:51 np0005541913.localdomain systemd[1]: libpod-8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8.scope: Deactivated successfully.
Dec 02 10:12:51 np0005541913.localdomain podman[330873]: 2025-12-02 10:12:51.801839989 +0000 UTC m=+0.062669010 container died 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:12:51 np0005541913.localdomain podman[330873]: 2025-12-02 10:12:51.874366492 +0000 UTC m=+0.135195463 container cleanup 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:12:51 np0005541913.localdomain systemd[1]: libpod-conmon-8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8.scope: Deactivated successfully.
Dec 02 10:12:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-7e4e3f8bacff50ad7014797f289aea2e0fcd601ff6a98e2bf0ffd59761e472e2-merged.mount: Deactivated successfully.
Dec 02 10:12:51 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:51 np0005541913.localdomain podman[330879]: 2025-12-02 10:12:51.980305775 +0000 UTC m=+0.225480259 container remove 8675761b09b43ef4adc4321a4ababcdf03f11703e5c242b5e3d9d3d70c359ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ead683fd-472e-432f-9d41-70d9f0a3ce59, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 02 10:12:52 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dead683fd\x2d472e\x2d432f\x2d9d41\x2d70d9f0a3ce59.mount: Deactivated successfully.
Dec 02 10:12:52 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:52.010 263406 INFO neutron.agent.dhcp.agent [None req-02651b53-7c8d-421f-acf6-e280523ea510 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:52 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:52.115 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:52 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:52Z|00565|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:12:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:52.363 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:52 np0005541913.localdomain ceph-mon[298296]: pgmap v496: 177 pgs: 177 active+clean; 839 MiB data, 2.8 GiB used, 39 GiB / 42 GiB avail; 86 KiB/s rd, 53 MiB/s wr, 151 op/s
Dec 02 10:12:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2286031429' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:12:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2286031429' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:12:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "snap_name": "9dc12954-5289-4f2d-a2ed-7e457be2a4e6_0d6f16e3-0b37-49c0-9f62-73be706c6c58", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "snap_name": "9dc12954-5289-4f2d-a2ed-7e457be2a4e6", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e203 e203: 6 total, 6 up, 6 in
Dec 02 10:12:52 np0005541913.localdomain podman[330920]: 2025-12-02 10:12:52.971282894 +0000 UTC m=+0.045645747 container kill a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:52 np0005541913.localdomain dnsmasq[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/addn_hosts - 0 addresses
Dec 02 10:12:52 np0005541913.localdomain dnsmasq-dhcp[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/host
Dec 02 10:12:52 np0005541913.localdomain dnsmasq-dhcp[330343]: read /var/lib/neutron/dhcp/474eb989-d757-4df7-9a0f-19d414dbaf64/opts
Dec 02 10:12:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:53Z|00566|binding|INFO|Releasing lport c46392fc-2aa4-4d80-b8dd-cf511eace16e from this chassis (sb_readonly=0)
Dec 02 10:12:53 np0005541913.localdomain kernel: device tapc46392fc-2a left promiscuous mode
Dec 02 10:12:53 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:53Z|00567|binding|INFO|Setting lport c46392fc-2aa4-4d80-b8dd-cf511eace16e down in Southbound
Dec 02 10:12:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:53.150 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:53.168 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-474eb989-d757-4df7-9a0f-19d414dbaf64', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-474eb989-d757-4df7-9a0f-19d414dbaf64', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8eea084241c14c5d9a6cc0d912041a21', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8265533d-51c3-4865-8bdc-d09b3aea005a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=c46392fc-2aa4-4d80-b8dd-cf511eace16e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:53.170 160221 INFO neutron.agent.ovn.metadata.agent [-] Port c46392fc-2aa4-4d80-b8dd-cf511eace16e in datapath 474eb989-d757-4df7-9a0f-19d414dbaf64 unbound from our chassis
Dec 02 10:12:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:53.172 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 474eb989-d757-4df7-9a0f-19d414dbaf64 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:53.173 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[b0cc3405-eb6b-4873-8a94-2ccce94a211d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:53 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:53.179 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:53 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:12:53 np0005541913.localdomain podman[330941]: 2025-12-02 10:12:53.458711494 +0000 UTC m=+0.092288361 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Dec 02 10:12:53 np0005541913.localdomain podman[330941]: 2025-12-02 10:12:53.472181963 +0000 UTC m=+0.105758830 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:12:53 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:12:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e204 e204: 6 total, 6 up, 6 in
Dec 02 10:12:53 np0005541913.localdomain ceph-mon[298296]: osdmap e203: 6 total, 6 up, 6 in
Dec 02 10:12:53 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:53.791 2 INFO neutron.agent.securitygroups_rpc [None req-099086ea-fe05-4147-b538-a150ea38436a 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:12:54 np0005541913.localdomain podman[330976]: 2025-12-02 10:12:54.194523682 +0000 UTC m=+0.058537531 container kill a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 02 10:12:54 np0005541913.localdomain dnsmasq[330343]: exiting on receipt of SIGTERM
Dec 02 10:12:54 np0005541913.localdomain systemd[1]: libpod-a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8.scope: Deactivated successfully.
Dec 02 10:12:54 np0005541913.localdomain podman[330989]: 2025-12-02 10:12:54.259771221 +0000 UTC m=+0.052001257 container died a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:54 np0005541913.localdomain podman[330989]: 2025-12-02 10:12:54.344289063 +0000 UTC m=+0.136519069 container cleanup a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:54 np0005541913.localdomain systemd[1]: libpod-conmon-a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8.scope: Deactivated successfully.
Dec 02 10:12:54 np0005541913.localdomain podman[330991]: 2025-12-02 10:12:54.371763986 +0000 UTC m=+0.155179287 container remove a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-474eb989-d757-4df7-9a0f-19d414dbaf64, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:54.572 263406 INFO neutron.agent.dhcp.agent [None req-50e02f86-2447-46a4-8ec6-09ae40214694 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e205 e205: 6 total, 6 up, 6 in
Dec 02 10:12:54 np0005541913.localdomain ceph-mon[298296]: pgmap v498: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 122 KiB/s rd, 80 MiB/s wr, 215 op/s
Dec 02 10:12:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "977a2594-0007-4fab-a7e2-b6bc2dee3113", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 02 10:12:54 np0005541913.localdomain ceph-mon[298296]: osdmap e204: 6 total, 6 up, 6 in
Dec 02 10:12:54 np0005541913.localdomain ceph-mon[298296]: osdmap e205: 6 total, 6 up, 6 in
Dec 02 10:12:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:54.733 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:55 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:12:55.086 2 INFO neutron.agent.securitygroups_rpc [None req-51b03213-9479-4e3e-bbd0-ea81e004b8c6 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:12:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:55.143 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-fc0ff32bb3b0c7e43de93978e58a81d8da0d84d48e152a3ded188d7eca22b1fd-merged.mount: Deactivated successfully.
Dec 02 10:12:55 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a36aace63e53e676cd7466ec343081260f54631e9ace4245db330b9311db77c8-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:55 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d474eb989\x2dd757\x2d4df7\x2d9a0f\x2d19d414dbaf64.mount: Deactivated successfully.
Dec 02 10:12:55 np0005541913.localdomain systemd[1]: tmp-crun.SR6N5z.mount: Deactivated successfully.
Dec 02 10:12:55 np0005541913.localdomain dnsmasq[330800]: exiting on receipt of SIGTERM
Dec 02 10:12:55 np0005541913.localdomain podman[331036]: 2025-12-02 10:12:55.258996739 +0000 UTC m=+0.078226445 container kill 013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-06e734ec-67aa-4893-acc9-29e384e3b54b, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 02 10:12:55 np0005541913.localdomain systemd[1]: libpod-013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208.scope: Deactivated successfully.
Dec 02 10:12:55 np0005541913.localdomain podman[331050]: 2025-12-02 10:12:55.335323214 +0000 UTC m=+0.066008941 container died 013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-06e734ec-67aa-4893-acc9-29e384e3b54b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:55 np0005541913.localdomain podman[331050]: 2025-12-02 10:12:55.367037189 +0000 UTC m=+0.097722836 container cleanup 013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-06e734ec-67aa-4893-acc9-29e384e3b54b, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:12:55 np0005541913.localdomain systemd[1]: libpod-conmon-013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208.scope: Deactivated successfully.
Dec 02 10:12:55 np0005541913.localdomain podman[331052]: 2025-12-02 10:12:55.422373103 +0000 UTC m=+0.142769355 container remove 013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-06e734ec-67aa-4893-acc9-29e384e3b54b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:12:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:55.464 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:55 np0005541913.localdomain kernel: device tap63f924a2-a2 left promiscuous mode
Dec 02 10:12:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:55Z|00568|binding|INFO|Releasing lport 63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a from this chassis (sb_readonly=0)
Dec 02 10:12:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:55Z|00569|binding|INFO|Setting lport 63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a down in Southbound
Dec 02 10:12:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:55.476 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/16', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-06e734ec-67aa-4893-acc9-29e384e3b54b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e734ec-67aa-4893-acc9-29e384e3b54b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b02ac233ae12415688cf9d451b55b171', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d614af94-5391-4170-b5ce-0f6ef9d77e23, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:55.478 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 63f924a2-a2a4-4cfe-a211-cbc03e5f1e6a in datapath 06e734ec-67aa-4893-acc9-29e384e3b54b unbound from our chassis
Dec 02 10:12:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:55.481 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 06e734ec-67aa-4893-acc9-29e384e3b54b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:12:55 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:55.482 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb4a782-1793-4f8b-87a4-e3e25e1a5220]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:55.486 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:55 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:55Z|00570|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:12:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:55.619 263406 INFO neutron.agent.dhcp.agent [None req-43365a34-7e51-47b7-98ba-33708ce982ca - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:55.621 263406 INFO neutron.agent.dhcp.agent [None req-43365a34-7e51-47b7-98ba-33708ce982ca - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:55.628 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "snap_name": "f755cd55-747e-4403-a245-43cbf9abc4bb_c1048806-17d0-47ca-8ee7-cf284ee33136", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e206 e206: 6 total, 6 up, 6 in
Dec 02 10:12:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:55.928 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:12:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-1ce5d90708afeefe16b42fc574fe8ffc15e78fbc6fadf28b8ac10c5802cf2141-merged.mount: Deactivated successfully.
Dec 02 10:12:56 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-013abab66e6cef5fba071da9ad9a421e1736c8950ee0cc8858d0e59d9c6bd208-userdata-shm.mount: Deactivated successfully.
Dec 02 10:12:56 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d06e734ec\x2d67aa\x2d4893\x2dacc9\x2d29e384e3b54b.mount: Deactivated successfully.
Dec 02 10:12:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:56.232 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:12:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e207 e207: 6 total, 6 up, 6 in
Dec 02 10:12:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "snap_name": "f755cd55-747e-4403-a245-43cbf9abc4bb", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:56 np0005541913.localdomain ceph-mon[298296]: pgmap v501: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 118 KiB/s rd, 67 MiB/s wr, 199 op/s
Dec 02 10:12:56 np0005541913.localdomain ceph-mon[298296]: osdmap e206: 6 total, 6 up, 6 in
Dec 02 10:12:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:56.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:12:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:12:57 np0005541913.localdomain podman[331077]: 2025-12-02 10:12:57.436357634 +0000 UTC m=+0.079075679 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:12:57 np0005541913.localdomain podman[331077]: 2025-12-02 10:12:57.467162535 +0000 UTC m=+0.109880580 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:12:57 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:12:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e208 e208: 6 total, 6 up, 6 in
Dec 02 10:12:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "977a2594-0007-4fab-a7e2-b6bc2dee3113", "format": "json"}]: dispatch
Dec 02 10:12:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "977a2594-0007-4fab-a7e2-b6bc2dee3113", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:57 np0005541913.localdomain ceph-mon[298296]: osdmap e207: 6 total, 6 up, 6 in
Dec 02 10:12:57 np0005541913.localdomain ceph-mon[298296]: osdmap e208: 6 total, 6 up, 6 in
Dec 02 10:12:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:12:58 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:12:58 np0005541913.localdomain podman[331094]: 2025-12-02 10:12:58.442523958 +0000 UTC m=+0.084402180 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Dec 02 10:12:58 np0005541913.localdomain podman[331094]: 2025-12-02 10:12:58.484022874 +0000 UTC m=+0.125901036 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible)
Dec 02 10:12:58 np0005541913.localdomain podman[331095]: 2025-12-02 10:12:58.496843105 +0000 UTC m=+0.131822114 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:12:58 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:12:58 np0005541913.localdomain podman[331095]: 2025-12-02 10:12:58.510223572 +0000 UTC m=+0.145202601 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:12:58 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:12:58 np0005541913.localdomain ceph-mon[298296]: pgmap v504: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail
Dec 02 10:12:58 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e209 e209: 6 total, 6 up, 6 in
Dec 02 10:12:58 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:12:58.837 263406 INFO neutron.agent.linux.ip_lib [None req-789dd8d6-e996-4ca0-8760-61ecbd03a21d - - - - - -] Device tap0eafd6e9-4e cannot be used as it has no MAC address
Dec 02 10:12:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:58.888 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:58 np0005541913.localdomain kernel: device tap0eafd6e9-4e entered promiscuous mode
Dec 02 10:12:58 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670378.8969] manager: (tap0eafd6e9-4e): new Generic device (/org/freedesktop/NetworkManager/Devices/89)
Dec 02 10:12:58 np0005541913.localdomain systemd-udevd[331145]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:12:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:58.903 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:58Z|00571|binding|INFO|Claiming lport 0eafd6e9-4ed5-4679-82a5-3fa515f3ef91 for this chassis.
Dec 02 10:12:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:58Z|00572|binding|INFO|0eafd6e9-4ed5-4679-82a5-3fa515f3ef91: Claiming unknown
Dec 02 10:12:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:58.917 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-c0f42f6a-43c1-485f-b58f-6f747f110eb9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0f42f6a-43c1-485f-b58f-6f747f110eb9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14bea8ce-4017-4cce-9625-2e1c4e2524ca, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0eafd6e9-4ed5-4679-82a5-3fa515f3ef91) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:12:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:58.919 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0eafd6e9-4ed5-4679-82a5-3fa515f3ef91 in datapath c0f42f6a-43c1-485f-b58f-6f747f110eb9 bound to our chassis
Dec 02 10:12:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:58.921 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0f42f6a-43c1-485f-b58f-6f747f110eb9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:12:58 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:12:58.921 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[82561596-d9bd-448f-9d5f-16bc1816ebbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:12:58 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0eafd6e9-4e: No such device
Dec 02 10:12:58 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0eafd6e9-4e: No such device
Dec 02 10:12:58 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0eafd6e9-4e: No such device
Dec 02 10:12:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:58Z|00573|binding|INFO|Setting lport 0eafd6e9-4ed5-4679-82a5-3fa515f3ef91 ovn-installed in OVS
Dec 02 10:12:58 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:12:58Z|00574|binding|INFO|Setting lport 0eafd6e9-4ed5-4679-82a5-3fa515f3ef91 up in Southbound
Dec 02 10:12:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:58.941 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:58 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0eafd6e9-4e: No such device
Dec 02 10:12:58 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0eafd6e9-4e: No such device
Dec 02 10:12:58 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0eafd6e9-4e: No such device
Dec 02 10:12:58 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0eafd6e9-4e: No such device
Dec 02 10:12:58 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap0eafd6e9-4e: No such device
Dec 02 10:12:58 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:58.973 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:12:59.010 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:12:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "format": "json"}]: dispatch
Dec 02 10:12:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8115277a-c4bb-4c47-9857-029dcd8c9879", "force": true, "format": "json"}]: dispatch
Dec 02 10:12:59 np0005541913.localdomain ceph-mon[298296]: osdmap e209: 6 total, 6 up, 6 in
Dec 02 10:12:59 np0005541913.localdomain podman[331216]: 
Dec 02 10:12:59 np0005541913.localdomain podman[331216]: 2025-12-02 10:12:59.899253908 +0000 UTC m=+0.099556444 container create 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:12:59 np0005541913.localdomain systemd[1]: Started libpod-conmon-0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3.scope.
Dec 02 10:12:59 np0005541913.localdomain podman[331216]: 2025-12-02 10:12:59.848558417 +0000 UTC m=+0.048861013 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:12:59 np0005541913.localdomain systemd[1]: tmp-crun.VYBSaP.mount: Deactivated successfully.
Dec 02 10:12:59 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:12:59 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/911e3bfa49476741920b436c07a5e2f1b04d6d325f1750f3ae03d6c86d880d03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:12:59 np0005541913.localdomain podman[331216]: 2025-12-02 10:12:59.976169158 +0000 UTC m=+0.176471694 container init 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:12:59 np0005541913.localdomain podman[331216]: 2025-12-02 10:12:59.987355296 +0000 UTC m=+0.187657832 container start 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:12:59 np0005541913.localdomain dnsmasq[331234]: started, version 2.85 cachesize 150
Dec 02 10:12:59 np0005541913.localdomain dnsmasq[331234]: DNS service limited to local subnets
Dec 02 10:12:59 np0005541913.localdomain dnsmasq[331234]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:12:59 np0005541913.localdomain dnsmasq[331234]: warning: no upstream servers configured
Dec 02 10:12:59 np0005541913.localdomain dnsmasq-dhcp[331234]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:12:59 np0005541913.localdomain dnsmasq[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/addn_hosts - 0 addresses
Dec 02 10:12:59 np0005541913.localdomain dnsmasq-dhcp[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/host
Dec 02 10:12:59 np0005541913.localdomain dnsmasq-dhcp[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/opts
Dec 02 10:13:00 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:00.177 263406 INFO neutron.agent.dhcp.agent [None req-65c51da1-bb7a-4031-858c-2702d92c92b4 - - - - - -] DHCP configuration for ports {'aa61ce72-44d2-4708-b516-008529f36e5b'} is completed
Dec 02 10:13:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:00.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:00.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:13:00 np0005541913.localdomain ceph-mon[298296]: pgmap v507: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 173 KiB/s rd, 90 KiB/s wr, 304 op/s
Dec 02 10:13:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c36fc0cf-86b1-4fe5-92a4-23ca7fc4ab60", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:13:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c36fc0cf-86b1-4fe5-92a4-23ca7fc4ab60", "format": "json"}]: dispatch
Dec 02 10:13:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:13:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e210 e210: 6 total, 6 up, 6 in
Dec 02 10:13:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:01.235 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e211 e211: 6 total, 6 up, 6 in
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.667165) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670381667231, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 748, "num_deletes": 261, "total_data_size": 737478, "memory_usage": 751592, "flush_reason": "Manual Compaction"}
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670381672811, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 482171, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31264, "largest_seqno": 32007, "table_properties": {"data_size": 478562, "index_size": 1400, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9028, "raw_average_key_size": 19, "raw_value_size": 470871, "raw_average_value_size": 1037, "num_data_blocks": 60, "num_entries": 454, "num_filter_entries": 454, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670356, "oldest_key_time": 1764670356, "file_creation_time": 1764670381, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 5683 microseconds, and 2270 cpu microseconds.
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.672854) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 482171 bytes OK
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.672874) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.676025) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.676048) EVENT_LOG_v1 {"time_micros": 1764670381676042, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.676070) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 733301, prev total WAL file size 733301, number of live WAL files 2.
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.676815) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323730' seq:72057594037927935, type:22 .. '6C6F676D0034353234' seq:0, type:0; will stop at (end)
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(470KB)], [51(17MB)]
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670381676880, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 18971993, "oldest_snapshot_seqno": -1}
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 13531 keys, 18454636 bytes, temperature: kUnknown
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670381784239, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 18454636, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18376453, "index_size": 43266, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33861, "raw_key_size": 363209, "raw_average_key_size": 26, "raw_value_size": 18145232, "raw_average_value_size": 1341, "num_data_blocks": 1625, "num_entries": 13531, "num_filter_entries": 13531, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670381, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.784554) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 18454636 bytes
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.786332) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.6 rd, 171.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 17.6 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(77.6) write-amplify(38.3) OK, records in: 14073, records dropped: 542 output_compression: NoCompression
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.786365) EVENT_LOG_v1 {"time_micros": 1764670381786351, "job": 30, "event": "compaction_finished", "compaction_time_micros": 107458, "compaction_time_cpu_micros": 51577, "output_level": 6, "num_output_files": 1, "total_output_size": 18454636, "num_input_records": 14073, "num_output_records": 13531, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670381786787, "job": 30, "event": "table_file_deletion", "file_number": 53}
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670381789461, "job": 30, "event": "table_file_deletion", "file_number": 51}
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.676599) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.789586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.789594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.789598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.789602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:13:01.789606) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:13:01 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:01Z|00575|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:13:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:01.883 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:01 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:01.925 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:01Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087ca5e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088547f0>], id=9d3551a6-415a-4192-871f-74c3fa9d8968, ip_allocation=immediate, mac_address=fa:16:3e:69:65:d2, name=tempest-PortsTestJSON-1035181662, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:57Z, description=, dns_domain=, id=c0f42f6a-43c1-485f-b58f-6f747f110eb9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-474863926, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14478, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3002, status=ACTIVE, subnets=['5fceb30f-da49-41a2-b0c9-d0d13330eb82'], tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:12:57Z, vlan_transparent=None, network_id=c0f42f6a-43c1-485f-b58f-6f747f110eb9, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3023, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:01Z on network c0f42f6a-43c1-485f-b58f-6f747f110eb9
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: osdmap e210: 6 total, 6 up, 6 in
Dec 02 10:13:01 np0005541913.localdomain ceph-mon[298296]: osdmap e211: 6 total, 6 up, 6 in
Dec 02 10:13:02 np0005541913.localdomain systemd[1]: tmp-crun.QfcvDU.mount: Deactivated successfully.
Dec 02 10:13:02 np0005541913.localdomain dnsmasq[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/addn_hosts - 1 addresses
Dec 02 10:13:02 np0005541913.localdomain dnsmasq-dhcp[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/host
Dec 02 10:13:02 np0005541913.localdomain dnsmasq-dhcp[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/opts
Dec 02 10:13:02 np0005541913.localdomain podman[331252]: 2025-12-02 10:13:02.14218027 +0000 UTC m=+0.056965600 container kill 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:13:02 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:02.342 263406 INFO neutron.agent.dhcp.agent [None req-ab85030f-0ef5-44bf-ac9a-e8f5e365dcf7 - - - - - -] DHCP configuration for ports {'9d3551a6-415a-4192-871f-74c3fa9d8968'} is completed
Dec 02 10:13:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:02.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:02.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:13:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:02.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:13:02 np0005541913.localdomain ceph-mon[298296]: pgmap v509: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 156 KiB/s rd, 82 KiB/s wr, 274 op/s
Dec 02 10:13:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e212 e212: 6 total, 6 up, 6 in
Dec 02 10:13:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:02.978 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:13:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:02.979 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:13:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:02.979 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:13:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:02.980 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:13:02 np0005541913.localdomain systemd[1]: tmp-crun.KTOEGJ.mount: Deactivated successfully.
Dec 02 10:13:02 np0005541913.localdomain dnsmasq[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/addn_hosts - 0 addresses
Dec 02 10:13:02 np0005541913.localdomain podman[331291]: 2025-12-02 10:13:02.988092372 +0000 UTC m=+0.068292410 container kill 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:13:02 np0005541913.localdomain dnsmasq-dhcp[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/host
Dec 02 10:13:02 np0005541913.localdomain dnsmasq-dhcp[331234]: read /var/lib/neutron/dhcp/c0f42f6a-43c1-485f-b58f-6f747f110eb9/opts
Dec 02 10:13:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:03.056 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:13:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:03.057 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:13:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:03.058 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:13:03 np0005541913.localdomain dnsmasq[331234]: exiting on receipt of SIGTERM
Dec 02 10:13:03 np0005541913.localdomain podman[331329]: 2025-12-02 10:13:03.544005677 +0000 UTC m=+0.058032978 container kill 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 02 10:13:03 np0005541913.localdomain systemd[1]: libpod-0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3.scope: Deactivated successfully.
Dec 02 10:13:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:03Z|00576|binding|INFO|Removing iface tap0eafd6e9-4e ovn-installed in OVS
Dec 02 10:13:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:03.549 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8506d660-1e86-469c-a0f8-c15c752515df with type ""
Dec 02 10:13:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:03Z|00577|binding|INFO|Removing lport 0eafd6e9-4ed5-4679-82a5-3fa515f3ef91 ovn-installed in OVS
Dec 02 10:13:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:03.551 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-c0f42f6a-43c1-485f-b58f-6f747f110eb9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0f42f6a-43c1-485f-b58f-6f747f110eb9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14bea8ce-4017-4cce-9625-2e1c4e2524ca, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=0eafd6e9-4ed5-4679-82a5-3fa515f3ef91) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.550 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:03.553 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 0eafd6e9-4ed5-4679-82a5-3fa515f3ef91 in datapath c0f42f6a-43c1-485f-b58f-6f747f110eb9 unbound from our chassis
Dec 02 10:13:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:03.556 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0f42f6a-43c1-485f-b58f-6f747f110eb9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.558 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:03.557 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8ba699-dcbd-4247-9946-016ab632b3dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:03 np0005541913.localdomain podman[331341]: 2025-12-02 10:13:03.619470238 +0000 UTC m=+0.061431218 container died 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:13:03 np0005541913.localdomain systemd[1]: tmp-crun.c1vorU.mount: Deactivated successfully.
Dec 02 10:13:03 np0005541913.localdomain podman[331341]: 2025-12-02 10:13:03.712479537 +0000 UTC m=+0.154440467 container cleanup 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:13:03 np0005541913.localdomain systemd[1]: libpod-conmon-0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3.scope: Deactivated successfully.
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.734 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:13:03 np0005541913.localdomain podman[331348]: 2025-12-02 10:13:03.735931032 +0000 UTC m=+0.159077550 container remove 0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f42f6a-43c1-485f-b58f-6f747f110eb9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:13:03 np0005541913.localdomain kernel: device tap0eafd6e9-4e left promiscuous mode
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.749 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.750 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.750 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.752 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.768 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.774 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.775 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.775 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.776 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:13:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:03.776 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:13:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:03.795 263406 INFO neutron.agent.dhcp.agent [None req-d0c861b6-4f2f-40e5-a824-2429d60fa29c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:03 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:03.798 263406 INFO neutron.agent.dhcp.agent [None req-d0c861b6-4f2f-40e5-a824-2429d60fa29c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:03 np0005541913.localdomain ceph-mon[298296]: osdmap e212: 6 total, 6 up, 6 in
Dec 02 10:13:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:13:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:13:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:13:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:13:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:13:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:13:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:13:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:13:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1647211614' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.346 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:13:04 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:04Z|00578|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.405 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.405 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.413 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-911e3bfa49476741920b436c07a5e2f1b04d6d325f1750f3ae03d6c86d880d03-merged.mount: Deactivated successfully.
Dec 02 10:13:04 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0b7cb64389e578053d4e84fe477f9693ceb84b709a9d41d8a7372798b604b5a3-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:04 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dc0f42f6a\x2d43c1\x2d485f\x2db58f\x2d6f747f110eb9.mount: Deactivated successfully.
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.586 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.587 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11156MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.587 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.587 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:13:04 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:04.690 2 INFO neutron.agent.securitygroups_rpc [None req-87ab1b98-cb82-4479-9190-360042c3aeed 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.803 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.804 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.804 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:13:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:04.937 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:13:04 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:04.986 2 INFO neutron.agent.securitygroups_rpc [None req-c2e2f39f-f9d0-4681-8662-02fbec20bbf5 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:05 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:05.002 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: pgmap v512: 177 pgs: 3 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 170 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 87 KiB/s rd, 56 KiB/s wr, 125 op/s
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c36fc0cf-86b1-4fe5-92a4-23ca7fc4ab60", "format": "json"}]: dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c36fc0cf-86b1-4fe5-92a4-23ca7fc4ab60", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1077481991' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1077481991' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1647211614' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3306293220' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3306293220' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1146820639' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1146820639' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e213 e213: 6 total, 6 up, 6 in
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:13:05 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3953452699' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:13:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:05.398 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:13:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:05.405 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:13:05 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:05.407 2 INFO neutron.agent.securitygroups_rpc [None req-2bcc8101-e89e-4e48-9cb1-9d691b1fbb0a 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:05.420 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:13:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:05.423 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:13:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:05.423 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:13:05 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:05.766 2 INFO neutron.agent.securitygroups_rpc [None req-af45566d-4b46-4c57-afce-d4fa5a30fbdd 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:05 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:05.784 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:06 np0005541913.localdomain ceph-mon[298296]: osdmap e213: 6 total, 6 up, 6 in
Dec 02 10:13:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3953452699' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:13:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:13:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:13:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:13:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:13:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:13:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18780 "" "Go-http-client/1.1"
Dec 02 10:13:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:06.239 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:06 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:06.261 2 INFO neutron.agent.securitygroups_rpc [None req-4587ba21-504d-4c76-ba2e-bc511139041d 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:13:06 np0005541913.localdomain podman[331417]: 2025-12-02 10:13:06.437065155 +0000 UTC m=+0.076840239 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:13:06 np0005541913.localdomain podman[331417]: 2025-12-02 10:13:06.454633823 +0000 UTC m=+0.094408887 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:13:06 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:13:06 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:06Z|00579|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:13:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:06.631 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e214 e214: 6 total, 6 up, 6 in
Dec 02 10:13:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:06.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:06.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:06.829 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:06.829 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:06.830 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 02 10:13:07 np0005541913.localdomain ceph-mon[298296]: pgmap v514: 177 pgs: 3 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 170 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 89 KiB/s rd, 58 KiB/s wr, 127 op/s
Dec 02 10:13:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2337962783' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2337962783' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:07 np0005541913.localdomain ceph-mon[298296]: osdmap e214: 6 total, 6 up, 6 in
Dec 02 10:13:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:13:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/641882817' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:13:07 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:07.212 2 INFO neutron.agent.securitygroups_rpc [None req-5417d582-9df9-4660-b1a1-ab7e1fcb97ff 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:07.237 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:07.871 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:07.872 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:07.873 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 02 10:13:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:07.938 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 02 10:13:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1abd7d7a-1fad-4e16-a25e-c36a0784c2b0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:13:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1abd7d7a-1fad-4e16-a25e-c36a0784c2b0", "format": "json"}]: dispatch
Dec 02 10:13:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:13:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:08.893 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:09 np0005541913.localdomain ceph-mon[298296]: pgmap v516: 177 pgs: 3 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 170 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 68 KiB/s rd, 44 KiB/s wr, 98 op/s
Dec 02 10:13:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1eb437a1-55f1-4e1b-ab0c-b43c26904e3d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:13:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1eb437a1-55f1-4e1b-ab0c-b43c26904e3d", "format": "json"}]: dispatch
Dec 02 10:13:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3210582778' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:13:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e215 e215: 6 total, 6 up, 6 in
Dec 02 10:13:09 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:09.284 263406 INFO neutron.agent.linux.ip_lib [None req-a77f70ce-56fc-404e-8c2b-e755c3e8b3be - - - - - -] Device tapdf392c2e-28 cannot be used as it has no MAC address
Dec 02 10:13:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:09.308 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:09 np0005541913.localdomain kernel: device tapdf392c2e-28 entered promiscuous mode
Dec 02 10:13:09 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670389.3157] manager: (tapdf392c2e-28): new Generic device (/org/freedesktop/NetworkManager/Devices/90)
Dec 02 10:13:09 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:09Z|00580|binding|INFO|Claiming lport df392c2e-2825-4eb5-b487-3c55b6dc044b for this chassis.
Dec 02 10:13:09 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:09Z|00581|binding|INFO|df392c2e-2825-4eb5-b487-3c55b6dc044b: Claiming unknown
Dec 02 10:13:09 np0005541913.localdomain systemd-udevd[331447]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:13:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:09.319 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:09.327 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-e625cddc-8a19-4455-8def-acda09527180', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e625cddc-8a19-4455-8def-acda09527180', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39bb33c3-24c9-42a5-b452-f2a8901739e7, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=df392c2e-2825-4eb5-b487-3c55b6dc044b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:09.329 160221 INFO neutron.agent.ovn.metadata.agent [-] Port df392c2e-2825-4eb5-b487-3c55b6dc044b in datapath e625cddc-8a19-4455-8def-acda09527180 bound to our chassis
Dec 02 10:13:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:09.330 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e625cddc-8a19-4455-8def-acda09527180 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:13:09 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:09.331 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[178b507e-ac4b-4382-912e-628ad2a28035]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:09 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapdf392c2e-28: No such device
Dec 02 10:13:09 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapdf392c2e-28: No such device
Dec 02 10:13:09 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:09Z|00582|binding|INFO|Setting lport df392c2e-2825-4eb5-b487-3c55b6dc044b ovn-installed in OVS
Dec 02 10:13:09 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:09Z|00583|binding|INFO|Setting lport df392c2e-2825-4eb5-b487-3c55b6dc044b up in Southbound
Dec 02 10:13:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:09.353 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:09 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapdf392c2e-28: No such device
Dec 02 10:13:09 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapdf392c2e-28: No such device
Dec 02 10:13:09 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapdf392c2e-28: No such device
Dec 02 10:13:09 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapdf392c2e-28: No such device
Dec 02 10:13:09 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapdf392c2e-28: No such device
Dec 02 10:13:09 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapdf392c2e-28: No such device
Dec 02 10:13:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:09.390 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:09.422 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:10 np0005541913.localdomain ceph-mon[298296]: osdmap e215: 6 total, 6 up, 6 in
Dec 02 10:13:10 np0005541913.localdomain ceph-mon[298296]: pgmap v518: 177 pgs: 177 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 51 KiB/s wr, 113 op/s
Dec 02 10:13:10 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1abd7d7a-1fad-4e16-a25e-c36a0784c2b0", "format": "json"}]: dispatch
Dec 02 10:13:10 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1abd7d7a-1fad-4e16-a25e-c36a0784c2b0", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:10 np0005541913.localdomain podman[331518]: 
Dec 02 10:13:10 np0005541913.localdomain podman[331518]: 2025-12-02 10:13:10.288222475 +0000 UTC m=+0.075982626 container create 5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 10:13:10 np0005541913.localdomain systemd[1]: Started libpod-conmon-5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601.scope.
Dec 02 10:13:10 np0005541913.localdomain systemd[1]: tmp-crun.AW8p4s.mount: Deactivated successfully.
Dec 02 10:13:10 np0005541913.localdomain podman[331518]: 2025-12-02 10:13:10.244389727 +0000 UTC m=+0.032149948 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:13:10 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:13:10 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a66ae28acad9356d960c9af761227f8a8a17b603705a375c63844eea0d7621df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:13:10 np0005541913.localdomain podman[331518]: 2025-12-02 10:13:10.377793102 +0000 UTC m=+0.165553253 container init 5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 02 10:13:10 np0005541913.localdomain podman[331518]: 2025-12-02 10:13:10.384960383 +0000 UTC m=+0.172720564 container start 5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 02 10:13:10 np0005541913.localdomain dnsmasq[331536]: started, version 2.85 cachesize 150
Dec 02 10:13:10 np0005541913.localdomain dnsmasq[331536]: DNS service limited to local subnets
Dec 02 10:13:10 np0005541913.localdomain dnsmasq[331536]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:13:10 np0005541913.localdomain dnsmasq[331536]: warning: no upstream servers configured
Dec 02 10:13:10 np0005541913.localdomain dnsmasq-dhcp[331536]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:13:10 np0005541913.localdomain dnsmasq[331536]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/addn_hosts - 0 addresses
Dec 02 10:13:10 np0005541913.localdomain dnsmasq-dhcp[331536]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/host
Dec 02 10:13:10 np0005541913.localdomain dnsmasq-dhcp[331536]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/opts
Dec 02 10:13:10 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:10.560 263406 INFO neutron.agent.dhcp.agent [None req-0e179484-72bb-43d3-87a5-c1213fc25ddf - - - - - -] DHCP configuration for ports {'f6a41283-fc6f-4680-bb13-d9b38f4d32ad'} is completed
Dec 02 10:13:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:13:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:13:11 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "1eb437a1-55f1-4e1b-ab0c-b43c26904e3d", "snap_name": "d93e1e67-e087-48ee-b546-b47b516fdb8a", "format": "json"}]: dispatch
Dec 02 10:13:11 np0005541913.localdomain podman[331548]: 2025-12-02 10:13:11.228745028 +0000 UTC m=+0.108028729 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:13:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:11.243 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:11 np0005541913.localdomain podman[331574]: 2025-12-02 10:13:11.24718167 +0000 UTC m=+0.061518510 container kill 5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:13:11 np0005541913.localdomain dnsmasq[331536]: exiting on receipt of SIGTERM
Dec 02 10:13:11 np0005541913.localdomain systemd[1]: libpod-5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601.scope: Deactivated successfully.
Dec 02 10:13:11 np0005541913.localdomain podman[331546]: 2025-12-02 10:13:11.206210418 +0000 UTC m=+0.090399420 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:13:11 np0005541913.localdomain podman[331548]: 2025-12-02 10:13:11.267488181 +0000 UTC m=+0.146771842 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:13:11 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:13:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:11 np0005541913.localdomain podman[331546]: 2025-12-02 10:13:11.288974604 +0000 UTC m=+0.173163626 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:13:11 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:13:11 np0005541913.localdomain podman[331614]: 2025-12-02 10:13:11.328060116 +0000 UTC m=+0.060612517 container died 5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:13:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:11 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-a66ae28acad9356d960c9af761227f8a8a17b603705a375c63844eea0d7621df-merged.mount: Deactivated successfully.
Dec 02 10:13:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:11.354 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:47:bc 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-e625cddc-8a19-4455-8def-acda09527180', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e625cddc-8a19-4455-8def-acda09527180', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39bb33c3-24c9-42a5-b452-f2a8901739e7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f6a41283-fc6f-4680-bb13-d9b38f4d32ad) old=Port_Binding(mac=['fa:16:3e:b4:47:bc 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-e625cddc-8a19-4455-8def-acda09527180', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e625cddc-8a19-4455-8def-acda09527180', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:11.357 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f6a41283-fc6f-4680-bb13-d9b38f4d32ad in datapath e625cddc-8a19-4455-8def-acda09527180 updated
Dec 02 10:13:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:11.361 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port 72691af4-7c63-4983-b608-e1485f951d79 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:13:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:11.362 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e625cddc-8a19-4455-8def-acda09527180, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:13:11 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:11.363 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[88eac451-4022-4ab2-a8e2-da9488e2dd79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:11 np0005541913.localdomain podman[331614]: 2025-12-02 10:13:11.368745829 +0000 UTC m=+0.101298160 container remove 5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:13:11 np0005541913.localdomain systemd[1]: libpod-conmon-5ec97774e8c59a213cb27abffc1bdf99d569ecadc5b8c88ddeb4534b0f5a9601.scope: Deactivated successfully.
Dec 02 10:13:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e216 e216: 6 total, 6 up, 6 in
Dec 02 10:13:12 np0005541913.localdomain ceph-mon[298296]: pgmap v519: 177 pgs: 177 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 50 KiB/s wr, 110 op/s
Dec 02 10:13:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2086186195' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2086186195' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:12 np0005541913.localdomain ceph-mon[298296]: osdmap e216: 6 total, 6 up, 6 in
Dec 02 10:13:12 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:12.476 2 INFO neutron.agent.securitygroups_rpc [None req-7de79900-a7b1-40d4-917a-526ff9de5d92 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:12 np0005541913.localdomain podman[331691]: 
Dec 02 10:13:12 np0005541913.localdomain podman[331691]: 2025-12-02 10:13:12.737354662 +0000 UTC m=+0.091365746 container create bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:13:12 np0005541913.localdomain systemd[1]: Started libpod-conmon-bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a.scope.
Dec 02 10:13:12 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:13:12 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f931dc05886cc5f2c688b7d673d13227eb02357be8cf59904bb71f31ec32857/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:13:12 np0005541913.localdomain podman[331691]: 2025-12-02 10:13:12.692874786 +0000 UTC m=+0.046885910 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:13:12 np0005541913.localdomain podman[331691]: 2025-12-02 10:13:12.803031962 +0000 UTC m=+0.157043036 container init bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:13:12 np0005541913.localdomain podman[331691]: 2025-12-02 10:13:12.812302548 +0000 UTC m=+0.166313632 container start bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 10:13:12 np0005541913.localdomain dnsmasq[331709]: started, version 2.85 cachesize 150
Dec 02 10:13:12 np0005541913.localdomain dnsmasq[331709]: DNS service limited to local subnets
Dec 02 10:13:12 np0005541913.localdomain dnsmasq[331709]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:13:12 np0005541913.localdomain dnsmasq[331709]: warning: no upstream servers configured
Dec 02 10:13:12 np0005541913.localdomain dnsmasq-dhcp[331709]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 02 10:13:12 np0005541913.localdomain dnsmasq-dhcp[331709]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:13:12 np0005541913.localdomain dnsmasq[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/addn_hosts - 0 addresses
Dec 02 10:13:12 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/host
Dec 02 10:13:12 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/opts
Dec 02 10:13:12 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:12.873 263406 INFO neutron.agent.dhcp.agent [None req-08f2df75-5a63-4b3b-bc5f-1a33698df058 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:12Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a009a0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00bb0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00dc0>], id=654fad6f-8756-4ac9-a018-496dc04a2b32, ip_allocation=immediate, mac_address=fa:16:3e:2d:d1:84, name=tempest-PortsTestJSON-1309543666, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:13:07Z, description=, dns_domain=, id=e625cddc-8a19-4455-8def-acda09527180, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-235371861, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=26299, qos_policy_id=None, revision_number=3, router:external=False, shared=False, standard_attr_id=3047, status=ACTIVE, subnets=['3043535f-39cc-4332-a833-e57fad08ebc2', 'f691f97d-c7e9-4344-88fe-0164a1ed278d'], tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:10Z, vlan_transparent=None, network_id=e625cddc-8a19-4455-8def-acda09527180, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a05fa096-2813-49c8-a900-5ab13174ee5a'], standard_attr_id=3069, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:12Z on network e625cddc-8a19-4455-8def-acda09527180
Dec 02 10:13:13 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:13.014 2 INFO neutron.agent.securitygroups_rpc [None req-bfd5de53-2a30-4cbd-a16d-14779023f72a 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:13 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:13.037 263406 INFO neutron.agent.dhcp.agent [None req-fc15c30d-eb4a-4b9c-8271-29bab936e9ea - - - - - -] DHCP configuration for ports {'f6a41283-fc6f-4680-bb13-d9b38f4d32ad', 'df392c2e-2825-4eb5-b487-3c55b6dc044b'} is completed
Dec 02 10:13:13 np0005541913.localdomain dnsmasq[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/addn_hosts - 2 addresses
Dec 02 10:13:13 np0005541913.localdomain podman[331727]: 2025-12-02 10:13:13.156645305 +0000 UTC m=+0.063575164 container kill bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:13:13 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/host
Dec 02 10:13:13 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/opts
Dec 02 10:13:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e217 e217: 6 total, 6 up, 6 in
Dec 02 10:13:13 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:13.308 263406 INFO neutron.agent.dhcp.agent [None req-382fd06d-1133-4aeb-92b2-e117a8fab261 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:12Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089e7a00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089e7eb0>], id=654fad6f-8756-4ac9-a018-496dc04a2b32, ip_allocation=immediate, mac_address=fa:16:3e:2d:d1:84, name=tempest-PortsTestJSON-1309543666, network_id=e625cddc-8a19-4455-8def-acda09527180, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['a05fa096-2813-49c8-a900-5ab13174ee5a'], standard_attr_id=3069, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:12Z on network e625cddc-8a19-4455-8def-acda09527180
Dec 02 10:13:13 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:13.406 263406 INFO neutron.agent.dhcp.agent [None req-bc6b1eb8-05a8-4f81-ae5f-9a326e5386e7 - - - - - -] DHCP configuration for ports {'654fad6f-8756-4ac9-a018-496dc04a2b32'} is completed
Dec 02 10:13:13 np0005541913.localdomain dnsmasq[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/addn_hosts - 1 addresses
Dec 02 10:13:13 np0005541913.localdomain podman[331765]: 2025-12-02 10:13:13.57450044 +0000 UTC m=+0.056943438 container kill bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:13:13 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/host
Dec 02 10:13:13 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/opts
Dec 02 10:13:13 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:13.619 2 INFO neutron.agent.securitygroups_rpc [None req-c543e5d9-b160-4e67-9d20-7dd20a3ffb2c 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:13 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:13.835 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:12Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908870280>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908870520>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088705b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f9908870790>], id=654fad6f-8756-4ac9-a018-496dc04a2b32, ip_allocation=immediate, mac_address=fa:16:3e:2d:d1:84, name=tempest-PortsTestJSON-1309543666, network_id=e625cddc-8a19-4455-8def-acda09527180, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['a05fa096-2813-49c8-a900-5ab13174ee5a'], standard_attr_id=3069, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:13Z on network e625cddc-8a19-4455-8def-acda09527180
Dec 02 10:13:13 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:13.879 263406 INFO neutron.agent.dhcp.agent [None req-670de3b0-bd62-43f8-a9f6-31e7892e9771 - - - - - -] DHCP configuration for ports {'654fad6f-8756-4ac9-a018-496dc04a2b32'} is completed
Dec 02 10:13:14 np0005541913.localdomain dnsmasq[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/addn_hosts - 2 addresses
Dec 02 10:13:14 np0005541913.localdomain podman[331801]: 2025-12-02 10:13:14.085124448 +0000 UTC m=+0.059413654 container kill bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:13:14 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/host
Dec 02 10:13:14 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/opts
Dec 02 10:13:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "51b857f0-7bbc-47ae-84d3-61ced77d3364", "format": "json"}]: dispatch
Dec 02 10:13:14 np0005541913.localdomain ceph-mon[298296]: osdmap e217: 6 total, 6 up, 6 in
Dec 02 10:13:14 np0005541913.localdomain ceph-mon[298296]: pgmap v522: 177 pgs: 177 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 149 KiB/s rd, 76 KiB/s wr, 204 op/s
Dec 02 10:13:14 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:14.261 2 INFO neutron.agent.securitygroups_rpc [None req-654147e9-be5d-4ad5-89df-552977a60280 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:14.348 263406 INFO neutron.agent.dhcp.agent [None req-77d491f8-c234-4e8a-9514-b620aa0a1e84 - - - - - -] DHCP configuration for ports {'654fad6f-8756-4ac9-a018-496dc04a2b32'} is completed
Dec 02 10:13:14 np0005541913.localdomain dnsmasq[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/addn_hosts - 0 addresses
Dec 02 10:13:14 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/host
Dec 02 10:13:14 np0005541913.localdomain dnsmasq-dhcp[331709]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/opts
Dec 02 10:13:14 np0005541913.localdomain podman[331838]: 2025-12-02 10:13:14.552489813 +0000 UTC m=+0.060725539 container kill bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:13:14 np0005541913.localdomain dnsmasq[331709]: exiting on receipt of SIGTERM
Dec 02 10:13:14 np0005541913.localdomain podman[331874]: 2025-12-02 10:13:14.957714322 +0000 UTC m=+0.065884046 container kill bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:13:14 np0005541913.localdomain systemd[1]: libpod-bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a.scope: Deactivated successfully.
Dec 02 10:13:15 np0005541913.localdomain podman[331889]: 2025-12-02 10:13:15.034826167 +0000 UTC m=+0.055733956 container died bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:13:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:15 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-0f931dc05886cc5f2c688b7d673d13227eb02357be8cf59904bb71f31ec32857-merged.mount: Deactivated successfully.
Dec 02 10:13:15 np0005541913.localdomain podman[331889]: 2025-12-02 10:13:15.08147889 +0000 UTC m=+0.102386649 container remove bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:13:15 np0005541913.localdomain systemd[1]: libpod-conmon-bf7040d577fb66045f0c0dca66748515c48bbc8ed4eece4b41cff07243affd5a.scope: Deactivated successfully.
Dec 02 10:13:15 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:15Z|00584|binding|INFO|Removing iface tapdf392c2e-28 ovn-installed in OVS
Dec 02 10:13:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:15.133 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 72691af4-7c63-4983-b608-e1485f951d79 with type ""
Dec 02 10:13:15 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:15Z|00585|binding|INFO|Removing lport df392c2e-2825-4eb5-b487-3c55b6dc044b ovn-installed in OVS
Dec 02 10:13:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:15.135 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:15.136 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-e625cddc-8a19-4455-8def-acda09527180', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e625cddc-8a19-4455-8def-acda09527180', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39bb33c3-24c9-42a5-b452-f2a8901739e7, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=df392c2e-2825-4eb5-b487-3c55b6dc044b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:15.139 160221 INFO neutron.agent.ovn.metadata.agent [-] Port df392c2e-2825-4eb5-b487-3c55b6dc044b in datapath e625cddc-8a19-4455-8def-acda09527180 unbound from our chassis
Dec 02 10:13:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:15.142 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e625cddc-8a19-4455-8def-acda09527180, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:13:15 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:15.143 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[fc155eb4-5680-4e33-9442-4252c9141458]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:15.146 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3657225323' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:13:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/853958348' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/853958348' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1eb437a1-55f1-4e1b-ab0c-b43c26904e3d", "snap_name": "d93e1e67-e087-48ee-b546-b47b516fdb8a_ad4a4f2d-b818-4a17-accd-1b8ca8c807b2", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1eb437a1-55f1-4e1b-ab0c-b43c26904e3d", "snap_name": "d93e1e67-e087-48ee-b546-b47b516fdb8a", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3212393842' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3212393842' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:15 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:15Z|00586|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:13:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:15.868 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:15 np0005541913.localdomain podman[331965]: 
Dec 02 10:13:15 np0005541913.localdomain podman[331965]: 2025-12-02 10:13:15.92765538 +0000 UTC m=+0.055771257 container create 3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:13:15 np0005541913.localdomain systemd[1]: Started libpod-conmon-3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5.scope.
Dec 02 10:13:15 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:13:15 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625ef60836c560783979780add8e80057a2796c370ea3ab250428921cca8ada9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:13:15 np0005541913.localdomain podman[331965]: 2025-12-02 10:13:15.988867641 +0000 UTC m=+0.116983518 container init 3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 10:13:15 np0005541913.localdomain podman[331965]: 2025-12-02 10:13:15.999212617 +0000 UTC m=+0.127328494 container start 3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:13:15 np0005541913.localdomain podman[331965]: 2025-12-02 10:13:15.899478179 +0000 UTC m=+0.027594036 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:13:16 np0005541913.localdomain dnsmasq[331983]: started, version 2.85 cachesize 150
Dec 02 10:13:16 np0005541913.localdomain dnsmasq[331983]: DNS service limited to local subnets
Dec 02 10:13:16 np0005541913.localdomain dnsmasq[331983]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:13:16 np0005541913.localdomain dnsmasq[331983]: warning: no upstream servers configured
Dec 02 10:13:16 np0005541913.localdomain dnsmasq-dhcp[331983]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:13:16 np0005541913.localdomain dnsmasq[331983]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/addn_hosts - 0 addresses
Dec 02 10:13:16 np0005541913.localdomain dnsmasq-dhcp[331983]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/host
Dec 02 10:13:16 np0005541913.localdomain dnsmasq-dhcp[331983]: read /var/lib/neutron/dhcp/e625cddc-8a19-4455-8def-acda09527180/opts
Dec 02 10:13:16 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:16.080 263406 INFO neutron.agent.dhcp.agent [None req-dbda9556-274d-4654-bf47-fc38e9c2313b - - - - - -] DHCP configuration for ports {'f6a41283-fc6f-4680-bb13-d9b38f4d32ad', 'df392c2e-2825-4eb5-b487-3c55b6dc044b'} is completed
Dec 02 10:13:16 np0005541913.localdomain dnsmasq[331983]: exiting on receipt of SIGTERM
Dec 02 10:13:16 np0005541913.localdomain podman[332002]: 2025-12-02 10:13:16.234737433 +0000 UTC m=+0.061760186 container kill 3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:13:16 np0005541913.localdomain systemd[1]: libpod-3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5.scope: Deactivated successfully.
Dec 02 10:13:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:16.246 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:16 np0005541913.localdomain ceph-mon[298296]: pgmap v523: 177 pgs: 177 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 24 KiB/s wr, 89 op/s
Dec 02 10:13:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3088318282' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:13:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1171825153' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1171825153' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:16 np0005541913.localdomain podman[332015]: 2025-12-02 10:13:16.305351005 +0000 UTC m=+0.055568042 container died 3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:13:16 np0005541913.localdomain podman[332015]: 2025-12-02 10:13:16.389528848 +0000 UTC m=+0.139745835 container cleanup 3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:13:16 np0005541913.localdomain systemd[1]: libpod-conmon-3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5.scope: Deactivated successfully.
Dec 02 10:13:16 np0005541913.localdomain podman[332018]: 2025-12-02 10:13:16.414951925 +0000 UTC m=+0.153256674 container remove 3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e625cddc-8a19-4455-8def-acda09527180, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:13:16 np0005541913.localdomain kernel: device tapdf392c2e-28 left promiscuous mode
Dec 02 10:13:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:16.428 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:16.444 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:16 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:16.462 263406 INFO neutron.agent.dhcp.agent [None req-cfca3a08-db63-400e-83fb-cc90c7e0831c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:16 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:16.463 263406 INFO neutron.agent.dhcp.agent [None req-cfca3a08-db63-400e-83fb-cc90c7e0831c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-625ef60836c560783979780add8e80057a2796c370ea3ab250428921cca8ada9-merged.mount: Deactivated successfully.
Dec 02 10:13:16 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d49c60f43ea561f511990ffb0cb1a2cb09d10ffc7bc467484479f6f604dfcf5-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:16 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2de625cddc\x2d8a19\x2d4455\x2d8def\x2dacda09527180.mount: Deactivated successfully.
Dec 02 10:13:17 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:17.666 263406 INFO neutron.agent.linux.ip_lib [None req-cf859642-9de9-4e75-b0d5-72fce650fb9a - - - - - -] Device tapba727844-8d cannot be used as it has no MAC address
Dec 02 10:13:17 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "51b857f0-7bbc-47ae-84d3-61ced77d3364_ef18a710-573a-49b3-bf4f-f42593b839e9", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:17 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "51b857f0-7bbc-47ae-84d3-61ced77d3364", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e218 e218: 6 total, 6 up, 6 in
Dec 02 10:13:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:17.698 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:17 np0005541913.localdomain kernel: device tapba727844-8d entered promiscuous mode
Dec 02 10:13:17 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670397.7065] manager: (tapba727844-8d): new Generic device (/org/freedesktop/NetworkManager/Devices/91)
Dec 02 10:13:17 np0005541913.localdomain systemd-udevd[332056]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:13:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:17Z|00587|binding|INFO|Claiming lport ba727844-8d29-4d76-8d07-d01b3d69d95e for this chassis.
Dec 02 10:13:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:17Z|00588|binding|INFO|ba727844-8d29-4d76-8d07-d01b3d69d95e: Claiming unknown
Dec 02 10:13:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:17.713 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:17.720 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-c967ca81-437f-42d1-b838-c1e08520a79f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c967ca81-437f-42d1-b838-c1e08520a79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07449851-3999-4148-9dbf-2264021c09b5, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=ba727844-8d29-4d76-8d07-d01b3d69d95e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:17.722 160221 INFO neutron.agent.ovn.metadata.agent [-] Port ba727844-8d29-4d76-8d07-d01b3d69d95e in datapath c967ca81-437f-42d1-b838-c1e08520a79f bound to our chassis
Dec 02 10:13:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:17.723 160221 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c967ca81-437f-42d1-b838-c1e08520a79f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 02 10:13:17 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:17.728 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5f85aa78-cd2a-4030-947d-0d31a4c3f78b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:17Z|00589|binding|INFO|Setting lport ba727844-8d29-4d76-8d07-d01b3d69d95e ovn-installed in OVS
Dec 02 10:13:17 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:17Z|00590|binding|INFO|Setting lport ba727844-8d29-4d76-8d07-d01b3d69d95e up in Southbound
Dec 02 10:13:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:17.760 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:17.803 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:17.867 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: pgmap v524: 177 pgs: 177 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 18 KiB/s wr, 68 op/s
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1eb437a1-55f1-4e1b-ab0c-b43c26904e3d", "format": "json"}]: dispatch
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1eb437a1-55f1-4e1b-ab0c-b43c26904e3d", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: osdmap e218: 6 total, 6 up, 6 in
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2024253158' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2024253158' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2845869502' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2845869502' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e219 e219: 6 total, 6 up, 6 in
Dec 02 10:13:18 np0005541913.localdomain podman[332112]: 2025-12-02 10:13:18.766394949 +0000 UTC m=+0.103253553 container create c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:13:18 np0005541913.localdomain podman[332112]: 2025-12-02 10:13:18.714267629 +0000 UTC m=+0.051126283 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:13:18 np0005541913.localdomain systemd[1]: Started libpod-conmon-c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51.scope.
Dec 02 10:13:18 np0005541913.localdomain systemd[1]: tmp-crun.buaD2s.mount: Deactivated successfully.
Dec 02 10:13:18 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:18.830 2 INFO neutron.agent.securitygroups_rpc [None req-809c0752-7390-4fe2-a50b-599cd97feccb 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:18 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:13:18 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcf9c65a212f9b7deb6b686dc4b60ab931b38012c56a5e34dd2439979a7e36ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:13:18 np0005541913.localdomain podman[332112]: 2025-12-02 10:13:18.851284321 +0000 UTC m=+0.188142915 container init c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 02 10:13:18 np0005541913.localdomain podman[332112]: 2025-12-02 10:13:18.860410514 +0000 UTC m=+0.197269088 container start c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 10:13:18 np0005541913.localdomain dnsmasq[332131]: started, version 2.85 cachesize 150
Dec 02 10:13:18 np0005541913.localdomain dnsmasq[332131]: DNS service limited to local subnets
Dec 02 10:13:18 np0005541913.localdomain dnsmasq[332131]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:13:18 np0005541913.localdomain dnsmasq[332131]: warning: no upstream servers configured
Dec 02 10:13:18 np0005541913.localdomain dnsmasq-dhcp[332131]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:13:18 np0005541913.localdomain dnsmasq[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/addn_hosts - 0 addresses
Dec 02 10:13:18 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/host
Dec 02 10:13:18 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/opts
Dec 02 10:13:18 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:18.919 263406 INFO neutron.agent.dhcp.agent [None req-1958b7f1-9d37-44f8-badf-cb00c8acdb2a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:18Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d84f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d8820>], id=fd39b39a-7655-4402-8611-0df2d49b294f, ip_allocation=immediate, mac_address=fa:16:3e:ed:f7:2b, name=tempest-PortsTestJSON-735674509, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:13:16Z, description=, dns_domain=, id=c967ca81-437f-42d1-b838-c1e08520a79f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1850680089, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59962, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3070, status=ACTIVE, subnets=['9d0b22dc-23e6-4d68-9fb6-15b61b72d853'], tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:16Z, vlan_transparent=None, network_id=c967ca81-437f-42d1-b838-c1e08520a79f, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a05fa096-2813-49c8-a900-5ab13174ee5a'], standard_attr_id=3095, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:18Z on network c967ca81-437f-42d1-b838-c1e08520a79f
Dec 02 10:13:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:19.117 263406 INFO neutron.agent.dhcp.agent [None req-272d30d7-7c8f-4cc2-a11d-1c4bb34de297 - - - - - -] DHCP configuration for ports {'2269273e-0b74-4448-be38-0cbafd2abcb7'} is completed
Dec 02 10:13:19 np0005541913.localdomain dnsmasq[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/addn_hosts - 1 addresses
Dec 02 10:13:19 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/host
Dec 02 10:13:19 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/opts
Dec 02 10:13:19 np0005541913.localdomain podman[332148]: 2025-12-02 10:13:19.138980138 +0000 UTC m=+0.067267344 container kill c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 10:13:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:19.374 263406 INFO neutron.agent.dhcp.agent [None req-9135939d-b8ad-4ffa-b8f7-8fc5da451d46 - - - - - -] DHCP configuration for ports {'fd39b39a-7655-4402-8611-0df2d49b294f'} is completed
Dec 02 10:13:19 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:19Z|00591|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:13:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:19.569 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:19 np0005541913.localdomain ceph-mon[298296]: osdmap e219: 6 total, 6 up, 6 in
Dec 02 10:13:19 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:19.817 2 INFO neutron.agent.securitygroups_rpc [None req-86a7997e-3c35-4f55-af22-9588e0545ddf 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:19 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:19.887 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:19Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a0fdf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a0f340>], id=272e8361-349d-4315-9e1b-553eba6bd9cc, ip_allocation=immediate, mac_address=fa:16:3e:f4:76:53, name=tempest-PortsTestJSON-1039621780, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:13:16Z, description=, dns_domain=, id=c967ca81-437f-42d1-b838-c1e08520a79f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1850680089, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59962, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3070, status=ACTIVE, subnets=['9d0b22dc-23e6-4d68-9fb6-15b61b72d853'], tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:16Z, vlan_transparent=None, network_id=c967ca81-437f-42d1-b838-c1e08520a79f, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a05fa096-2813-49c8-a900-5ab13174ee5a'], standard_attr_id=3101, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:19Z on network c967ca81-437f-42d1-b838-c1e08520a79f
Dec 02 10:13:20 np0005541913.localdomain dnsmasq[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/addn_hosts - 2 addresses
Dec 02 10:13:20 np0005541913.localdomain podman[332187]: 2025-12-02 10:13:20.102233758 +0000 UTC m=+0.062686403 container kill c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 02 10:13:20 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/host
Dec 02 10:13:20 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/opts
Dec 02 10:13:20 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:20.322 263406 INFO neutron.agent.dhcp.agent [None req-2aaa37c5-8f93-44d5-bdf1-7dcd356558b7 - - - - - -] DHCP configuration for ports {'272e8361-349d-4315-9e1b-553eba6bd9cc'} is completed
Dec 02 10:13:20 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:20.546 2 INFO neutron.agent.securitygroups_rpc [None req-173c0738-50d3-48cc-af5a-b78421c8e23c 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:20 np0005541913.localdomain ceph-mon[298296]: pgmap v527: 177 pgs: 5 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 170 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 71 KiB/s wr, 117 op/s
Dec 02 10:13:20 np0005541913.localdomain dnsmasq[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/addn_hosts - 1 addresses
Dec 02 10:13:20 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/host
Dec 02 10:13:20 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/opts
Dec 02 10:13:20 np0005541913.localdomain podman[332226]: 2025-12-02 10:13:20.780079481 +0000 UTC m=+0.060037601 container kill c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:13:20 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:20.979 2 INFO neutron.agent.securitygroups_rpc [None req-a615b87d-8e60-4d08-9004-5c448f1ed91b 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:21 np0005541913.localdomain dnsmasq[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/addn_hosts - 0 addresses
Dec 02 10:13:21 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/host
Dec 02 10:13:21 np0005541913.localdomain podman[332263]: 2025-12-02 10:13:21.217230031 +0000 UTC m=+0.062261271 container kill c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 02 10:13:21 np0005541913.localdomain dnsmasq-dhcp[332131]: read /var/lib/neutron/dhcp/c967ca81-437f-42d1-b838-c1e08520a79f/opts
Dec 02 10:13:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:21.248 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:21 np0005541913.localdomain dnsmasq[332131]: exiting on receipt of SIGTERM
Dec 02 10:13:21 np0005541913.localdomain podman[332300]: 2025-12-02 10:13:21.595256525 +0000 UTC m=+0.053322122 container kill c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:13:21 np0005541913.localdomain systemd[1]: libpod-c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51.scope: Deactivated successfully.
Dec 02 10:13:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:21.637 160221 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port d78a3c66-6a9f-41e0-bcd0-7bef2817af9a with type ""
Dec 02 10:13:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:21.638 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-c967ca81-437f-42d1-b838-c1e08520a79f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c967ca81-437f-42d1-b838-c1e08520a79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07449851-3999-4148-9dbf-2264021c09b5, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=ba727844-8d29-4d76-8d07-d01b3d69d95e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:21 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:21Z|00592|binding|INFO|Removing iface tapba727844-8d ovn-installed in OVS
Dec 02 10:13:21 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:21Z|00593|binding|INFO|Removing lport ba727844-8d29-4d76-8d07-d01b3d69d95e ovn-installed in OVS
Dec 02 10:13:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:21.639 160221 INFO neutron.agent.ovn.metadata.agent [-] Port ba727844-8d29-4d76-8d07-d01b3d69d95e in datapath c967ca81-437f-42d1-b838-c1e08520a79f unbound from our chassis
Dec 02 10:13:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:21.639 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c967ca81-437f-42d1-b838-c1e08520a79f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:13:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:21.639 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:21 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:21.640 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[4989e8a7-fead-43cb-a425-2476da464b1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:21.646 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:21 np0005541913.localdomain podman[332314]: 2025-12-02 10:13:21.662492927 +0000 UTC m=+0.057033251 container died c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:13:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e220 e220: 6 total, 6 up, 6 in
Dec 02 10:13:21 np0005541913.localdomain podman[332314]: 2025-12-02 10:13:21.745710294 +0000 UTC m=+0.140250618 container cleanup c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:13:21 np0005541913.localdomain systemd[1]: libpod-conmon-c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51.scope: Deactivated successfully.
Dec 02 10:13:21 np0005541913.localdomain podman[332316]: 2025-12-02 10:13:21.773784892 +0000 UTC m=+0.157326243 container remove c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c967ca81-437f-42d1-b838-c1e08520a79f, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:13:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-dcf9c65a212f9b7deb6b686dc4b60ab931b38012c56a5e34dd2439979a7e36ba-merged.mount: Deactivated successfully.
Dec 02 10:13:21 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4f75e9268a17a140d407cb663578b693ce2951f17b063fab36a2633787ecf51-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:21 np0005541913.localdomain kernel: device tapba727844-8d left promiscuous mode
Dec 02 10:13:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:21.838 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:21.858 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:21.873 263406 INFO neutron.agent.dhcp.agent [None req-ee495048-54fe-4af6-8d59-80e9582e2b69 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:21 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:21.873 263406 INFO neutron.agent.dhcp.agent [None req-ee495048-54fe-4af6-8d59-80e9582e2b69 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:21 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dc967ca81\x2d437f\x2d42d1\x2db838\x2dc1e08520a79f.mount: Deactivated successfully.
Dec 02 10:13:22 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:22Z|00594|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:13:22 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:22.028 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:22 np0005541913.localdomain ceph-mon[298296]: pgmap v528: 177 pgs: 5 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 170 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 53 KiB/s wr, 88 op/s
Dec 02 10:13:22 np0005541913.localdomain ceph-mon[298296]: osdmap e220: 6 total, 6 up, 6 in
Dec 02 10:13:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "71fe3ff3-77b1-42b9-a13c-7c107bdd326d", "format": "json"}]: dispatch
Dec 02 10:13:23 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:23.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:24 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:24.100 2 INFO neutron.agent.securitygroups_rpc [None req-1c76d5fd-a4fe-47e2-aa2d-3afed3d7786f 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:24 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:13:24 np0005541913.localdomain podman[332343]: 2025-12-02 10:13:24.442827561 +0000 UTC m=+0.086226669 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 10:13:24 np0005541913.localdomain podman[332343]: 2025-12-02 10:13:24.457076001 +0000 UTC m=+0.100475099 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 02 10:13:24 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:13:24 np0005541913.localdomain ceph-mon[298296]: pgmap v530: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 82 KiB/s wr, 124 op/s
Dec 02 10:13:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:26.252 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e221 e221: 6 total, 6 up, 6 in
Dec 02 10:13:26 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:26.677 2 INFO neutron.agent.securitygroups_rpc [None req-03a5d6f5-e9fc-4da2-b77f-f6e56b5ab3f7 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:26 np0005541913.localdomain ceph-mon[298296]: pgmap v531: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 66 KiB/s wr, 99 op/s
Dec 02 10:13:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "cab3f864-d18a-47fe-ac99-2bece590c4f2", "format": "json"}]: dispatch
Dec 02 10:13:26 np0005541913.localdomain ceph-mon[298296]: osdmap e221: 6 total, 6 up, 6 in
Dec 02 10:13:27 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3797520786' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:27 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3797520786' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:13:28 np0005541913.localdomain podman[332362]: 2025-12-02 10:13:28.437748861 +0000 UTC m=+0.082267843 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 10:13:28 np0005541913.localdomain podman[332362]: 2025-12-02 10:13:28.471947553 +0000 UTC m=+0.116466565 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 02 10:13:28 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:13:28 np0005541913.localdomain ceph-mon[298296]: pgmap v533: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 1.7 KiB/s rd, 8.6 KiB/s wr, 5 op/s
Dec 02 10:13:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:13:29 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:13:29 np0005541913.localdomain podman[332379]: 2025-12-02 10:13:29.432933242 +0000 UTC m=+0.069045200 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:13:29 np0005541913.localdomain podman[332379]: 2025-12-02 10:13:29.446011921 +0000 UTC m=+0.082123929 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:13:29 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:13:29 np0005541913.localdomain podman[332378]: 2025-12-02 10:13:29.504790537 +0000 UTC m=+0.142330544 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Dec 02 10:13:29 np0005541913.localdomain podman[332378]: 2025-12-02 10:13:29.521944614 +0000 UTC m=+0.159484591 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Dec 02 10:13:29 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:13:29 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:29.824 263406 INFO neutron.agent.linux.ip_lib [None req-c25b3346-c026-494e-8d13-bc2f4aad8a67 - - - - - -] Device tap280223a7-c0 cannot be used as it has no MAC address
Dec 02 10:13:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:29.846 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:29 np0005541913.localdomain kernel: device tap280223a7-c0 entered promiscuous mode
Dec 02 10:13:29 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670409.8538] manager: (tap280223a7-c0): new Generic device (/org/freedesktop/NetworkManager/Devices/92)
Dec 02 10:13:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:29.854 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:29 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:29Z|00595|binding|INFO|Claiming lport 280223a7-c06f-4632-bc9d-10fcc2daed96 for this chassis.
Dec 02 10:13:29 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:29Z|00596|binding|INFO|280223a7-c06f-4632-bc9d-10fcc2daed96: Claiming unknown
Dec 02 10:13:29 np0005541913.localdomain systemd-udevd[332431]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:13:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:29.882 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:29 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:29.886 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9b4da99-dc68-46c9-bcf0-a3cfe207d767, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=280223a7-c06f-4632-bc9d-10fcc2daed96) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:29 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:29.887 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 280223a7-c06f-4632-bc9d-10fcc2daed96 in datapath 55499ea7-fec3-45ce-8fdc-4c408cd7abf9 bound to our chassis
Dec 02 10:13:29 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:29.889 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port f89c0c39-9ee7-4005-bfd3-e3f0da74be0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:13:29 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:29Z|00597|binding|INFO|Setting lport 280223a7-c06f-4632-bc9d-10fcc2daed96 ovn-installed in OVS
Dec 02 10:13:29 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:29Z|00598|binding|INFO|Setting lport 280223a7-c06f-4632-bc9d-10fcc2daed96 up in Southbound
Dec 02 10:13:29 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:29.890 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:13:29 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:29.891 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e0b1a3-e909-4de1-a056-5991a83bfeeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:29.892 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:29.923 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:29 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:29.944 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:30 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:30.049 2 INFO neutron.agent.securitygroups_rpc [None req-fd3a2c94-7b8a-4692-92f4-6e53ae5bb9ca 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['b06e62c3-67bb-4248-8ca7-8eec12bdd5e1']
Dec 02 10:13:30 np0005541913.localdomain podman[332486]: 
Dec 02 10:13:30 np0005541913.localdomain podman[332486]: 2025-12-02 10:13:30.737733624 +0000 UTC m=+0.075706399 container create b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:13:30 np0005541913.localdomain systemd[1]: Started libpod-conmon-b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e.scope.
Dec 02 10:13:30 np0005541913.localdomain ceph-mon[298296]: pgmap v534: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 27 KiB/s wr, 46 op/s
Dec 02 10:13:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "75cce5b0-a115-45be-bdca-5a004bb97c21", "format": "json"}]: dispatch
Dec 02 10:13:30 np0005541913.localdomain systemd[1]: tmp-crun.0trQqR.mount: Deactivated successfully.
Dec 02 10:13:30 np0005541913.localdomain podman[332486]: 2025-12-02 10:13:30.6989656 +0000 UTC m=+0.036938425 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:13:30 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:13:30 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a4833b9e177edc4a21650603f08c75cabbd06af814a1afc50e1b9f8ede988e6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:13:30 np0005541913.localdomain podman[332486]: 2025-12-02 10:13:30.838337185 +0000 UTC m=+0.176309960 container init b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:13:30 np0005541913.localdomain podman[332486]: 2025-12-02 10:13:30.847067607 +0000 UTC m=+0.185040352 container start b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:13:30 np0005541913.localdomain dnsmasq[332504]: started, version 2.85 cachesize 150
Dec 02 10:13:30 np0005541913.localdomain dnsmasq[332504]: DNS service limited to local subnets
Dec 02 10:13:30 np0005541913.localdomain dnsmasq[332504]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:13:30 np0005541913.localdomain dnsmasq[332504]: warning: no upstream servers configured
Dec 02 10:13:30 np0005541913.localdomain dnsmasq-dhcp[332504]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:13:30 np0005541913.localdomain dnsmasq[332504]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 0 addresses
Dec 02 10:13:30 np0005541913.localdomain dnsmasq-dhcp[332504]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:30 np0005541913.localdomain dnsmasq-dhcp[332504]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:30 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:30.905 263406 INFO neutron.agent.dhcp.agent [None req-4de2bde8-a057-4821-aedb-98767a9144b7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:29Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a3b7f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a3b910>], id=7eea047d-a0d7-4840-b24f-b6baca53023b, ip_allocation=immediate, mac_address=fa:16:3e:8e:df:50, name=tempest-PortsTestJSON-989628863, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:43Z, description=, dns_domain=, id=55499ea7-fec3-45ce-8fdc-4c408cd7abf9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-1562010144, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17376, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2951, status=ACTIVE, subnets=['bed8ca05-97c3-4b62-a76b-f0a2af362b59'], tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:28Z, vlan_transparent=None, network_id=55499ea7-fec3-45ce-8fdc-4c408cd7abf9, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['b06e62c3-67bb-4248-8ca7-8eec12bdd5e1'], standard_attr_id=3165, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:29Z on network 55499ea7-fec3-45ce-8fdc-4c408cd7abf9
Dec 02 10:13:31 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:31.121 263406 INFO neutron.agent.dhcp.agent [None req-5884a31d-6066-4875-bde8-c00fddef00bb - - - - - -] DHCP configuration for ports {'0bb31a40-2592-41bf-9cb8-279241b602e7', '3d63b6f3-67ae-4c21-b56a-394abd9240e9'} is completed
Dec 02 10:13:31 np0005541913.localdomain dnsmasq[332504]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 1 addresses
Dec 02 10:13:31 np0005541913.localdomain dnsmasq-dhcp[332504]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:31 np0005541913.localdomain podman[332522]: 2025-12-02 10:13:31.141285228 +0000 UTC m=+0.061869870 container kill b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:13:31 np0005541913.localdomain dnsmasq-dhcp[332504]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:31.255 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:31 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:31.393 263406 INFO neutron.agent.dhcp.agent [None req-74dd683f-b432-4e84-a1c3-c4b295f95150 - - - - - -] DHCP configuration for ports {'7eea047d-a0d7-4840-b24f-b6baca53023b'} is completed
Dec 02 10:13:31 np0005541913.localdomain dnsmasq[332504]: exiting on receipt of SIGTERM
Dec 02 10:13:31 np0005541913.localdomain podman[332560]: 2025-12-02 10:13:31.57513166 +0000 UTC m=+0.056285111 container kill b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 02 10:13:31 np0005541913.localdomain systemd[1]: libpod-b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e.scope: Deactivated successfully.
Dec 02 10:13:31 np0005541913.localdomain podman[332573]: 2025-12-02 10:13:31.643878952 +0000 UTC m=+0.053738634 container died b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:13:31 np0005541913.localdomain podman[332573]: 2025-12-02 10:13:31.681433642 +0000 UTC m=+0.091293284 container cleanup b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 10:13:31 np0005541913.localdomain systemd[1]: libpod-conmon-b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e.scope: Deactivated successfully.
Dec 02 10:13:31 np0005541913.localdomain podman[332575]: 2025-12-02 10:13:31.728807705 +0000 UTC m=+0.131372032 container remove b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:13:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6a4833b9e177edc4a21650603f08c75cabbd06af814a1afc50e1b9f8ede988e6-merged.mount: Deactivated successfully.
Dec 02 10:13:31 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b46aacbe8c988ebee757b846df8517fff1784c836821a8f42046f67dcbe0790e-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:32 np0005541913.localdomain sudo[332603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:13:32 np0005541913.localdomain sudo[332603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:13:32 np0005541913.localdomain sudo[332603]: pam_unix(sudo:session): session closed for user root
Dec 02 10:13:32 np0005541913.localdomain sudo[332621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:13:32 np0005541913.localdomain sudo[332621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:13:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:32.442 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:56:e4 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9b4da99-dc68-46c9-bcf0-a3cfe207d767, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3d63b6f3-67ae-4c21-b56a-394abd9240e9) old=Port_Binding(mac=['fa:16:3e:31:56:e4 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:32.445 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3d63b6f3-67ae-4c21-b56a-394abd9240e9 in datapath 55499ea7-fec3-45ce-8fdc-4c408cd7abf9 updated
Dec 02 10:13:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:32.447 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port f89c0c39-9ee7-4005-bfd3-e3f0da74be0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:13:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:32.447 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:13:32 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:32.448 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0a709f-08dd-4a5c-bf5f-29ac9a62f81a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:32 np0005541913.localdomain ceph-mon[298296]: pgmap v535: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 22 KiB/s wr, 39 op/s
Dec 02 10:13:32 np0005541913.localdomain sudo[332621]: pam_unix(sudo:session): session closed for user root
Dec 02 10:13:33 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:33.040 2 INFO neutron.agent.securitygroups_rpc [None req-2f13aa5e-1f09-4188-af19-a84ba5538b10 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['b06e62c3-67bb-4248-8ca7-8eec12bdd5e1', '19b93206-6bbf-441b-abe9-609f462663ba']
Dec 02 10:13:33 np0005541913.localdomain sudo[332690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:13:33 np0005541913.localdomain sudo[332690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:13:33 np0005541913.localdomain sudo[332690]: pam_unix(sudo:session): session closed for user root
Dec 02 10:13:33 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:33.506 2 INFO neutron.agent.securitygroups_rpc [None req-af664360-5183-46be-8a67-9553906db0ca 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['19b93206-6bbf-441b-abe9-609f462663ba']
Dec 02 10:13:33 np0005541913.localdomain podman[332738]: 
Dec 02 10:13:33 np0005541913.localdomain podman[332738]: 2025-12-02 10:13:33.78836676 +0000 UTC m=+0.095921417 container create ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:13:33 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "2b9fdfdd-4f6a-4e8b-9cca-e9a879aa25b8", "format": "json"}]: dispatch
Dec 02 10:13:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:13:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:13:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:13:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:13:33 np0005541913.localdomain systemd[1]: Started libpod-conmon-ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12.scope.
Dec 02 10:13:33 np0005541913.localdomain podman[332738]: 2025-12-02 10:13:33.740232147 +0000 UTC m=+0.047786854 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:13:33 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:13:33 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9646cfbbdb3d2f70d14a5e7560c806dd20cd6e2eebbd709fdb4223591f816bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:13:33 np0005541913.localdomain podman[332738]: 2025-12-02 10:13:33.87729192 +0000 UTC m=+0.184846587 container init ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:13:33 np0005541913.localdomain podman[332738]: 2025-12-02 10:13:33.888906809 +0000 UTC m=+0.196461466 container start ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:13:33 np0005541913.localdomain dnsmasq[332757]: started, version 2.85 cachesize 150
Dec 02 10:13:33 np0005541913.localdomain dnsmasq[332757]: DNS service limited to local subnets
Dec 02 10:13:33 np0005541913.localdomain dnsmasq[332757]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:13:33 np0005541913.localdomain dnsmasq[332757]: warning: no upstream servers configured
Dec 02 10:13:33 np0005541913.localdomain dnsmasq-dhcp[332757]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:13:33 np0005541913.localdomain dnsmasq-dhcp[332757]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 02 10:13:33 np0005541913.localdomain dnsmasq[332757]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 1 addresses
Dec 02 10:13:33 np0005541913.localdomain dnsmasq-dhcp[332757]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:33 np0005541913.localdomain dnsmasq-dhcp[332757]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:33 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:33.952 263406 INFO neutron.agent.dhcp.agent [None req-8af431b2-be97-49e2-82ee-1b399287cde1 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:29Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088066d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088069a0>], id=7eea047d-a0d7-4840-b24f-b6baca53023b, ip_allocation=immediate, mac_address=fa:16:3e:8e:df:50, name=tempest-PortsTestJSON-109939691, network_id=55499ea7-fec3-45ce-8fdc-4c408cd7abf9, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['19b93206-6bbf-441b-abe9-609f462663ba'], standard_attr_id=3165, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:32Z on network 55499ea7-fec3-45ce-8fdc-4c408cd7abf9
Dec 02 10:13:33 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:33.956 263406 INFO oslo.privsep.daemon [None req-8af431b2-be97-49e2-82ee-1b399287cde1 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmpyw9umdxh/privsep.sock']
Dec 02 10:13:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:13:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:13:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:13:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:13:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:13:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:13:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:13:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:13:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:34.248 263406 INFO neutron.agent.dhcp.agent [None req-49cadb5d-eea7-4b04-8444-c047171737f6 - - - - - -] DHCP configuration for ports {'0bb31a40-2592-41bf-9cb8-279241b602e7', '7eea047d-a0d7-4840-b24f-b6baca53023b', '280223a7-c06f-4632-bc9d-10fcc2daed96', '3d63b6f3-67ae-4c21-b56a-394abd9240e9'} is completed
Dec 02 10:13:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:34.619 263406 INFO oslo.privsep.daemon [None req-8af431b2-be97-49e2-82ee-1b399287cde1 - - - - - -] Spawned new privsep daemon via rootwrap
Dec 02 10:13:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:34.507 332762 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 10:13:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:34.522 332762 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 10:13:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:34.525 332762 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 02 10:13:34 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:34.526 332762 INFO oslo.privsep.daemon [-] privsep daemon running as pid 332762
Dec 02 10:13:34 np0005541913.localdomain dnsmasq-dhcp[332757]: DHCPRELEASE(tap280223a7-c0) 10.100.0.7 fa:16:3e:8e:df:50
Dec 02 10:13:35 np0005541913.localdomain ceph-mon[298296]: pgmap v536: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 24 KiB/s wr, 33 op/s
Dec 02 10:13:35 np0005541913.localdomain dnsmasq[332757]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 1 addresses
Dec 02 10:13:35 np0005541913.localdomain dnsmasq-dhcp[332757]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:35 np0005541913.localdomain dnsmasq-dhcp[332757]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:35 np0005541913.localdomain podman[332782]: 2025-12-02 10:13:35.448750758 +0000 UTC m=+0.048395300 container kill ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:13:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:35.534 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:35.535 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:13:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:35.566 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:35 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:35.688 263406 INFO neutron.agent.dhcp.agent [None req-df178d8d-7293-4451-af7b-6c191eee803c - - - - - -] DHCP configuration for ports {'7eea047d-a0d7-4840-b24f-b6baca53023b'} is completed
Dec 02 10:13:35 np0005541913.localdomain dnsmasq[332757]: exiting on receipt of SIGTERM
Dec 02 10:13:35 np0005541913.localdomain podman[332820]: 2025-12-02 10:13:35.906712432 +0000 UTC m=+0.064253124 container kill ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:13:35 np0005541913.localdomain systemd[1]: libpod-ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12.scope: Deactivated successfully.
Dec 02 10:13:35 np0005541913.localdomain podman[332834]: 2025-12-02 10:13:35.973744398 +0000 UTC m=+0.052059078 container died ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:13:36 np0005541913.localdomain podman[332834]: 2025-12-02 10:13:36.003376308 +0000 UTC m=+0.081690958 container cleanup ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:13:36 np0005541913.localdomain systemd[1]: libpod-conmon-ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12.scope: Deactivated successfully.
Dec 02 10:13:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:13:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:13:36 np0005541913.localdomain podman[332836]: 2025-12-02 10:13:36.095263457 +0000 UTC m=+0.168222774 container remove ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 02 10:13:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:13:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156205 "" "Go-http-client/1.1"
Dec 02 10:13:36 np0005541913.localdomain ceph-mon[298296]: pgmap v537: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 24 KiB/s wr, 33 op/s
Dec 02 10:13:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:13:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18780 "" "Go-http-client/1.1"
Dec 02 10:13:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:36.258 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:36 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:36.627 2 INFO neutron.agent.securitygroups_rpc [None req-3559f9f7-1434-4371-a7c8-42d18644b0ee 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['5ce035be-6b85-468c-9f45-e514c3373f72']
Dec 02 10:13:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:13:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-e9646cfbbdb3d2f70d14a5e7560c806dd20cd6e2eebbd709fdb4223591f816bf-merged.mount: Deactivated successfully.
Dec 02 10:13:36 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea8089f72d080e873f522e605c144f5550ca1906c2b4bff2df7ae548ba9bbd12-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:36 np0005541913.localdomain podman[332876]: 2025-12-02 10:13:36.928741808 +0000 UTC m=+0.073501679 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:13:36 np0005541913.localdomain podman[332876]: 2025-12-02 10:13:36.966194587 +0000 UTC m=+0.110954458 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:13:36 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:13:37 np0005541913.localdomain podman[332926]: 
Dec 02 10:13:37 np0005541913.localdomain podman[332926]: 2025-12-02 10:13:37.583372584 +0000 UTC m=+0.091100479 container create 60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:13:37 np0005541913.localdomain systemd[1]: Started libpod-conmon-60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1.scope.
Dec 02 10:13:37 np0005541913.localdomain podman[332926]: 2025-12-02 10:13:37.53894529 +0000 UTC m=+0.046672795 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:13:37 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:13:37 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a16d668f099b0e7a88d67bc8bce0bf7943325785a7ad60e1335816d6a91e163/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:13:37 np0005541913.localdomain podman[332926]: 2025-12-02 10:13:37.665444391 +0000 UTC m=+0.173171846 container init 60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:13:37 np0005541913.localdomain podman[332926]: 2025-12-02 10:13:37.674353708 +0000 UTC m=+0.182081163 container start 60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:13:37 np0005541913.localdomain dnsmasq[332944]: started, version 2.85 cachesize 150
Dec 02 10:13:37 np0005541913.localdomain dnsmasq[332944]: DNS service limited to local subnets
Dec 02 10:13:37 np0005541913.localdomain dnsmasq[332944]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:13:37 np0005541913.localdomain dnsmasq[332944]: warning: no upstream servers configured
Dec 02 10:13:37 np0005541913.localdomain dnsmasq-dhcp[332944]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 02 10:13:37 np0005541913.localdomain dnsmasq-dhcp[332944]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:13:37 np0005541913.localdomain dnsmasq[332944]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 0 addresses
Dec 02 10:13:37 np0005541913.localdomain dnsmasq-dhcp[332944]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:37 np0005541913.localdomain dnsmasq-dhcp[332944]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:37 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "bbc0db63-e14e-46b1-8a2c-1e2c8a265a54", "format": "json"}]: dispatch
Dec 02 10:13:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:13:37 np0005541913.localdomain systemd[1]: tmp-crun.Ybkp9e.mount: Deactivated successfully.
Dec 02 10:13:37 np0005541913.localdomain dnsmasq[332944]: exiting on receipt of SIGTERM
Dec 02 10:13:37 np0005541913.localdomain podman[332962]: 2025-12-02 10:13:37.954850023 +0000 UTC m=+0.053459926 container kill 60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:13:37 np0005541913.localdomain systemd[1]: libpod-60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1.scope: Deactivated successfully.
Dec 02 10:13:37 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:37.955 263406 INFO neutron.agent.dhcp.agent [None req-36147f73-a4d6-4a96-9776-ffd898695cfe - - - - - -] DHCP configuration for ports {'0bb31a40-2592-41bf-9cb8-279241b602e7', '3d63b6f3-67ae-4c21-b56a-394abd9240e9', '280223a7-c06f-4632-bc9d-10fcc2daed96'} is completed
Dec 02 10:13:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:37.971 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:56:e4 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9b4da99-dc68-46c9-bcf0-a3cfe207d767, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3d63b6f3-67ae-4c21-b56a-394abd9240e9) old=Port_Binding(mac=['fa:16:3e:31:56:e4 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:37.973 160221 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3d63b6f3-67ae-4c21-b56a-394abd9240e9 in datapath 55499ea7-fec3-45ce-8fdc-4c408cd7abf9 updated
Dec 02 10:13:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:37.975 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port f89c0c39-9ee7-4005-bfd3-e3f0da74be0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:13:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:37.975 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:13:37 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:37.976 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[62c6f910-12e6-43c7-aab1-99c7ed5d3a2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:38 np0005541913.localdomain podman[332976]: 2025-12-02 10:13:38.013875547 +0000 UTC m=+0.045921666 container died 60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:13:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:38 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-5a16d668f099b0e7a88d67bc8bce0bf7943325785a7ad60e1335816d6a91e163-merged.mount: Deactivated successfully.
Dec 02 10:13:38 np0005541913.localdomain podman[332976]: 2025-12-02 10:13:38.059326668 +0000 UTC m=+0.091372717 container remove 60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:13:38 np0005541913.localdomain systemd[1]: libpod-conmon-60ea29a264b98775429dd3977b442fb9d364ef8d722794c5eb5f26cf16d7c2e1.scope: Deactivated successfully.
Dec 02 10:13:38 np0005541913.localdomain ceph-mon[298296]: pgmap v538: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 23 KiB/s wr, 31 op/s
Dec 02 10:13:38 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2836928614' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:38 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2836928614' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:39 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:39.413 2 INFO neutron.agent.securitygroups_rpc [None req-f1fb19ca-fe9a-41d0-b171-21af469f3b04 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['5ce035be-6b85-468c-9f45-e514c3373f72', '4635549b-8be4-4094-becd-47d2d3f392be', '4dd0e6ef-da7b-4d17-b1c7-4a0b0fd81445']
Dec 02 10:13:39 np0005541913.localdomain podman[333051]: 
Dec 02 10:13:39 np0005541913.localdomain podman[333051]: 2025-12-02 10:13:39.471243194 +0000 UTC m=+0.089445165 container create 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 10:13:39 np0005541913.localdomain systemd[1]: Started libpod-conmon-6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80.scope.
Dec 02 10:13:39 np0005541913.localdomain systemd[1]: tmp-crun.wBbCPU.mount: Deactivated successfully.
Dec 02 10:13:39 np0005541913.localdomain podman[333051]: 2025-12-02 10:13:39.43206907 +0000 UTC m=+0.050271061 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:13:39 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:13:39 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/205472f177633ee42eceac8b70b6938bd41041da927576b34353c1bcb670607c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:13:39 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:39.536 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:13:39 np0005541913.localdomain podman[333051]: 2025-12-02 10:13:39.550437104 +0000 UTC m=+0.168639065 container init 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:13:39 np0005541913.localdomain podman[333051]: 2025-12-02 10:13:39.559238539 +0000 UTC m=+0.177440500 container start 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 02 10:13:39 np0005541913.localdomain dnsmasq[333069]: started, version 2.85 cachesize 150
Dec 02 10:13:39 np0005541913.localdomain dnsmasq[333069]: DNS service limited to local subnets
Dec 02 10:13:39 np0005541913.localdomain dnsmasq[333069]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:13:39 np0005541913.localdomain dnsmasq[333069]: warning: no upstream servers configured
Dec 02 10:13:39 np0005541913.localdomain dnsmasq-dhcp[333069]: DHCP, static leases only on 10.100.0.32, lease time 1d
Dec 02 10:13:39 np0005541913.localdomain dnsmasq-dhcp[333069]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 02 10:13:39 np0005541913.localdomain dnsmasq-dhcp[333069]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:13:39 np0005541913.localdomain dnsmasq[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 1 addresses
Dec 02 10:13:39 np0005541913.localdomain dnsmasq-dhcp[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:39 np0005541913.localdomain dnsmasq-dhcp[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:39.622 263406 INFO neutron.agent.dhcp.agent [None req-3bcf75ff-0076-42e8-8fd1-340a2fbc9e9c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:36Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a3b730>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a3b790>], id=0518379a-4019-4d66-afc5-66421b387adf, ip_allocation=immediate, mac_address=fa:16:3e:55:a8:d5, name=tempest-PortsTestJSON-39700250, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:12:43Z, description=, dns_domain=, id=55499ea7-fec3-45ce-8fdc-4c408cd7abf9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-1562010144, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17376, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=2951, status=ACTIVE, subnets=['39177969-64aa-42f2-9d18-c7fc4745ec4f', '9c059950-606f-4465-be07-113be9f2db02'], tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:34Z, vlan_transparent=None, network_id=55499ea7-fec3-45ce-8fdc-4c408cd7abf9, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5ce035be-6b85-468c-9f45-e514c3373f72'], standard_attr_id=3212, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:36Z on network 55499ea7-fec3-45ce-8fdc-4c408cd7abf9
Dec 02 10:13:39 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:39.827 263406 INFO neutron.agent.dhcp.agent [None req-618cae67-ddcb-4ae1-89fc-fc2ee67fe0f9 - - - - - -] DHCP configuration for ports {'0bb31a40-2592-41bf-9cb8-279241b602e7', '0518379a-4019-4d66-afc5-66421b387adf', '280223a7-c06f-4632-bc9d-10fcc2daed96', '3d63b6f3-67ae-4c21-b56a-394abd9240e9'} is completed
Dec 02 10:13:39 np0005541913.localdomain podman[333087]: 2025-12-02 10:13:39.898831638 +0000 UTC m=+0.053123747 container kill 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:13:39 np0005541913.localdomain dnsmasq[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 1 addresses
Dec 02 10:13:39 np0005541913.localdomain dnsmasq-dhcp[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:39 np0005541913.localdomain dnsmasq-dhcp[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:39 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:39.968 2 INFO neutron.agent.securitygroups_rpc [None req-1b71adfa-49cf-47f5-a7a4-715d1b19b4b9 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['4635549b-8be4-4094-becd-47d2d3f392be', '4dd0e6ef-da7b-4d17-b1c7-4a0b0fd81445']
Dec 02 10:13:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:40.083 263406 INFO neutron.agent.dhcp.agent [None req-bc572c6a-cfbb-4c30-94ec-401915257b51 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:13:36Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908781070>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908781d00>], id=0518379a-4019-4d66-afc5-66421b387adf, ip_allocation=immediate, mac_address=fa:16:3e:55:a8:d5, name=tempest-PortsTestJSON-721231550, network_id=55499ea7-fec3-45ce-8fdc-4c408cd7abf9, port_security_enabled=True, project_id=4ac3f69b39e24601806d0f601335ff31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['4635549b-8be4-4094-becd-47d2d3f392be', '4dd0e6ef-da7b-4d17-b1c7-4a0b0fd81445'], standard_attr_id=3212, status=DOWN, tags=[], tenant_id=4ac3f69b39e24601806d0f601335ff31, updated_at=2025-12-02T10:13:39Z on network 55499ea7-fec3-45ce-8fdc-4c408cd7abf9
Dec 02 10:13:40 np0005541913.localdomain dnsmasq-dhcp[333069]: DHCPRELEASE(tap280223a7-c0) 10.100.0.13 fa:16:3e:55:a8:d5
Dec 02 10:13:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:40.145 263406 INFO neutron.agent.dhcp.agent [None req-e234a8c3-06f7-4479-bf1f-7434af4be574 - - - - - -] DHCP configuration for ports {'0518379a-4019-4d66-afc5-66421b387adf'} is completed
Dec 02 10:13:40 np0005541913.localdomain podman[333124]: 2025-12-02 10:13:40.672045603 +0000 UTC m=+0.051580015 container kill 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:13:40 np0005541913.localdomain dnsmasq[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 1 addresses
Dec 02 10:13:40 np0005541913.localdomain dnsmasq-dhcp[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:40 np0005541913.localdomain dnsmasq-dhcp[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:40 np0005541913.localdomain ceph-mon[298296]: pgmap v539: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 26 KiB/s wr, 37 op/s
Dec 02 10:13:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:40.877 263406 INFO neutron.agent.dhcp.agent [None req-89a5a5e3-8fbd-40c3-9f34-67258da961a1 - - - - - -] DHCP configuration for ports {'0518379a-4019-4d66-afc5-66421b387adf'} is completed
Dec 02 10:13:41 np0005541913.localdomain podman[333161]: 2025-12-02 10:13:41.077984891 +0000 UTC m=+0.063025570 container kill 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:13:41 np0005541913.localdomain dnsmasq[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/addn_hosts - 0 addresses
Dec 02 10:13:41 np0005541913.localdomain dnsmasq-dhcp[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/host
Dec 02 10:13:41 np0005541913.localdomain dnsmasq-dhcp[333069]: read /var/lib/neutron/dhcp/55499ea7-fec3-45ce-8fdc-4c408cd7abf9/opts
Dec 02 10:13:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:41.260 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:41.265 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:13:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:13:41 np0005541913.localdomain podman[333182]: 2025-12-02 10:13:41.508997387 +0000 UTC m=+0.149519555 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:13:41 np0005541913.localdomain podman[333182]: 2025-12-02 10:13:41.5451256 +0000 UTC m=+0.185647708 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:13:41 np0005541913.localdomain podman[333232]: 2025-12-02 10:13:41.547426951 +0000 UTC m=+0.055550221 container kill 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:13:41 np0005541913.localdomain dnsmasq[333069]: exiting on receipt of SIGTERM
Dec 02 10:13:41 np0005541913.localdomain systemd[1]: libpod-6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80.scope: Deactivated successfully.
Dec 02 10:13:41 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:13:41 np0005541913.localdomain podman[333183]: 2025-12-02 10:13:41.454800393 +0000 UTC m=+0.092794474 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:13:41 np0005541913.localdomain podman[333261]: 2025-12-02 10:13:41.594171817 +0000 UTC m=+0.032152768 container died 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 10:13:41 np0005541913.localdomain podman[333183]: 2025-12-02 10:13:41.646931082 +0000 UTC m=+0.284925123 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 02 10:13:41 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:13:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80-userdata-shm.mount: Deactivated successfully.
Dec 02 10:13:41 np0005541913.localdomain podman[333261]: 2025-12-02 10:13:41.692567019 +0000 UTC m=+0.130548000 container remove 6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55499ea7-fec3-45ce-8fdc-4c408cd7abf9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 02 10:13:41 np0005541913.localdomain systemd[1]: libpod-conmon-6950087791e588658dbd4cfc3951061644dcb2052cbefefd7c292e3334632f80.scope: Deactivated successfully.
Dec 02 10:13:41 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "bbc0db63-e14e-46b1-8a2c-1e2c8a265a54_19841362-7310-4db1-9177-f7698f0587e5", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:41 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "bbc0db63-e14e-46b1-8a2c-1e2c8a265a54", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:41.925 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:41 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:41Z|00599|binding|INFO|Releasing lport 280223a7-c06f-4632-bc9d-10fcc2daed96 from this chassis (sb_readonly=0)
Dec 02 10:13:41 np0005541913.localdomain kernel: device tap280223a7-c0 left promiscuous mode
Dec 02 10:13:41 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:41Z|00600|binding|INFO|Setting lport 280223a7-c06f-4632-bc9d-10fcc2daed96 down in Southbound
Dec 02 10:13:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:41.934 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28 10.100.0.35/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55499ea7-fec3-45ce-8fdc-4c408cd7abf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ac3f69b39e24601806d0f601335ff31', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9b4da99-dc68-46c9-bcf0-a3cfe207d767, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=280223a7-c06f-4632-bc9d-10fcc2daed96) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:13:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:41.936 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 280223a7-c06f-4632-bc9d-10fcc2daed96 in datapath 55499ea7-fec3-45ce-8fdc-4c408cd7abf9 unbound from our chassis
Dec 02 10:13:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:41.939 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55499ea7-fec3-45ce-8fdc-4c408cd7abf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:13:41 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:13:41.940 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[5b061811-e138-4386-9843-30a30dfd054d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:13:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:41.947 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:41.949 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:42.027 263406 INFO neutron.agent.dhcp.agent [None req-a0302869-e0c8-4bf3-b050-286ac267d1c1 - - - - - -] DHCP configuration for ports {'0bb31a40-2592-41bf-9cb8-279241b602e7', '3d63b6f3-67ae-4c21-b56a-394abd9240e9', '280223a7-c06f-4632-bc9d-10fcc2daed96'} is completed
Dec 02 10:13:42 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:13:42.059 2 INFO neutron.agent.securitygroups_rpc [None req-0f8f5643-0b03-43e9-aad8-6bac530a8f71 8a48cd892c354d1695f4e180869e6d08 4ac3f69b39e24601806d0f601335ff31 - - default default] Security group member updated ['a05fa096-2813-49c8-a900-5ab13174ee5a']
Dec 02 10:13:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:42.186 263406 INFO neutron.agent.dhcp.agent [None req-c65976d6-4cad-4f96-b417-9742b58e99de - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:42.187 263406 INFO neutron.agent.dhcp.agent [None req-c65976d6-4cad-4f96-b417-9742b58e99de - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:42 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-205472f177633ee42eceac8b70b6938bd41041da927576b34353c1bcb670607c-merged.mount: Deactivated successfully.
Dec 02 10:13:42 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d55499ea7\x2dfec3\x2d45ce\x2d8fdc\x2d4c408cd7abf9.mount: Deactivated successfully.
Dec 02 10:13:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:13:42.689 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:13:42 np0005541913.localdomain ceph-mon[298296]: pgmap v540: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 6.7 KiB/s rd, 14 KiB/s wr, 9 op/s
Dec 02 10:13:42 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:13:42Z|00601|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:13:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:42.892 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:44 np0005541913.localdomain ceph-mon[298296]: pgmap v541: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 25 KiB/s wr, 16 op/s
Dec 02 10:13:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "2b9fdfdd-4f6a-4e8b-9cca-e9a879aa25b8_3ce82e88-98cf-4623-9819-20bc79fd24ff", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "2b9fdfdd-4f6a-4e8b-9cca-e9a879aa25b8", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:46.265 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:46 np0005541913.localdomain ceph-mon[298296]: pgmap v542: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 17 KiB/s wr, 16 op/s
Dec 02 10:13:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "75cce5b0-a115-45be-bdca-5a004bb97c21_5c7747e5-6063-411a-b429-32a057c35acd", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "75cce5b0-a115-45be-bdca-5a004bb97c21", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:47 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3643059818' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:47 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3643059818' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:47 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e222 e222: 6 total, 6 up, 6 in
Dec 02 10:13:48 np0005541913.localdomain ceph-mon[298296]: pgmap v543: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 17 KiB/s wr, 16 op/s
Dec 02 10:13:48 np0005541913.localdomain ceph-mon[298296]: osdmap e222: 6 total, 6 up, 6 in
Dec 02 10:13:48 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e223 e223: 6 total, 6 up, 6 in
Dec 02 10:13:49 np0005541913.localdomain ceph-mon[298296]: osdmap e223: 6 total, 6 up, 6 in
Dec 02 10:13:50 np0005541913.localdomain ceph-mon[298296]: pgmap v546: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 71 KiB/s wr, 36 op/s
Dec 02 10:13:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:13:50 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4272430941' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:13:50 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4272430941' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:51.266 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e224 e224: 6 total, 6 up, 6 in
Dec 02 10:13:51 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4272430941' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:51 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4272430941' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:51 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "cab3f864-d18a-47fe-ac99-2bece590c4f2_66e86b61-d3e8-4024-bc38-f520916b3578", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:51 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "cab3f864-d18a-47fe-ac99-2bece590c4f2", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:51 np0005541913.localdomain ceph-mon[298296]: osdmap e224: 6 total, 6 up, 6 in
Dec 02 10:13:52 np0005541913.localdomain ceph-mon[298296]: pgmap v547: 177 pgs: 177 active+clean; 201 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 54 KiB/s wr, 25 op/s
Dec 02 10:13:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2051237495' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:52 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2051237495' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:53 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3242656336' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:53 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3242656336' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:13:54 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2476172005' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:13:54 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2476172005' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:54 np0005541913.localdomain ceph-mon[298296]: pgmap v549: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 116 KiB/s rd, 103 KiB/s wr, 165 op/s
Dec 02 10:13:54 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2476172005' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:13:54 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2476172005' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:13:55 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:13:55 np0005541913.localdomain podman[333291]: 2025-12-02 10:13:55.453578675 +0000 UTC m=+0.090394510 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:13:55 np0005541913.localdomain podman[333291]: 2025-12-02 10:13:55.465084841 +0000 UTC m=+0.101900706 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 02 10:13:55 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:13:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "71fe3ff3-77b1-42b9-a13c-7c107bdd326d_d24362ed-a55a-4034-ba1b-811e27325111", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "snap_name": "71fe3ff3-77b1-42b9-a13c-7c107bdd326d", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:56.269 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:13:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:13:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e225 e225: 6 total, 6 up, 6 in
Dec 02 10:13:56 np0005541913.localdomain ceph-mon[298296]: pgmap v550: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 94 KiB/s rd, 84 KiB/s wr, 134 op/s
Dec 02 10:13:56 np0005541913.localdomain ceph-mon[298296]: osdmap e225: 6 total, 6 up, 6 in
Dec 02 10:13:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e226 e226: 6 total, 6 up, 6 in
Dec 02 10:13:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:13:57.840 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:13:58 np0005541913.localdomain ceph-mon[298296]: pgmap v552: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 23 KiB/s wr, 98 op/s
Dec 02 10:13:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "format": "json"}]: dispatch
Dec 02 10:13:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "afb0a218-82d6-4848-bc26-a77f5d927675", "force": true, "format": "json"}]: dispatch
Dec 02 10:13:58 np0005541913.localdomain ceph-mon[298296]: osdmap e226: 6 total, 6 up, 6 in
Dec 02 10:13:58 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e227 e227: 6 total, 6 up, 6 in
Dec 02 10:13:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:13:59 np0005541913.localdomain systemd[1]: tmp-crun.UEUqKd.mount: Deactivated successfully.
Dec 02 10:13:59 np0005541913.localdomain podman[333311]: 2025-12-02 10:13:59.459165529 +0000 UTC m=+0.096944605 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:13:59 np0005541913.localdomain podman[333311]: 2025-12-02 10:13:59.467982824 +0000 UTC m=+0.105761950 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 02 10:13:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:13:59 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:13:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:13:59 np0005541913.localdomain podman[333329]: 2025-12-02 10:13:59.601886773 +0000 UTC m=+0.108117323 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:13:59 np0005541913.localdomain podman[333329]: 2025-12-02 10:13:59.615001051 +0000 UTC m=+0.121231581 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:13:59 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:13:59 np0005541913.localdomain podman[333348]: 2025-12-02 10:13:59.70688424 +0000 UTC m=+0.093755469 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6)
Dec 02 10:13:59 np0005541913.localdomain podman[333348]: 2025-12-02 10:13:59.723060742 +0000 UTC m=+0.109931961 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 10:13:59 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:13:59 np0005541913.localdomain ceph-mon[298296]: osdmap e227: 6 total, 6 up, 6 in
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2655722515' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2655722515' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:00 np0005541913.localdomain systemd[1]: tmp-crun.yNWv0y.mount: Deactivated successfully.
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: pgmap v555: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 56 KiB/s wr, 29 op/s
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2655722515' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2655722515' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1778989957' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:14:00 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1778989957' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:01.271 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:14:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:01.275 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:01.275 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:14:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:01.276 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:14:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:01.276 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:14:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:01.279 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1778989957' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1778989957' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4053296986' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/4053296986' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:14:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1151473184' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:14:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1151473184' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:02 np0005541913.localdomain ceph-mon[298296]: pgmap v556: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 56 KiB/s wr, 29 op/s
Dec 02 10:14:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1151473184' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1151473184' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:02.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:02.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:14:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:02.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:14:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:03.057 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:14:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:03.057 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:14:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:03.058 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.302 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.303 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.303 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.303 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:14:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e228 e228: 6 total, 6 up, 6 in
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.951 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.967 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.967 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.968 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.968 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.969 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.990 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.991 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.991 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.991 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:14:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:03.992 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:14:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:14:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:14:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:14:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:14:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:14:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:14:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2984015493' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.418 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.477 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.478 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.685 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.686 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11120MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.686 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.687 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.749 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.750 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.750 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.764 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: pgmap v557: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 104 KiB/s rd, 58 KiB/s wr, 150 op/s
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: osdmap e228: 6 total, 6 up, 6 in
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2495621874' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2495621874' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2266163141' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2266163141' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2984015493' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.963 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.964 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 10:14:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:04.991 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 10:14:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:05.014 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_ABM,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 10:14:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:05.055 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:14:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:14:05 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3165726944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:14:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:05.489 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:14:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:05.495 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:14:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:05.510 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:14:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:05.512 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:14:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:05.513 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:14:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3165726944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:14:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:14:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:14:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:14:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:14:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:14:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1"
Dec 02 10:14:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:06.275 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e229 e229: 6 total, 6 up, 6 in
Dec 02 10:14:06 np0005541913.localdomain ceph-mon[298296]: pgmap v559: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 51 KiB/s wr, 130 op/s
Dec 02 10:14:06 np0005541913.localdomain ceph-mon[298296]: osdmap e229: 6 total, 6 up, 6 in
Dec 02 10:14:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:14:07 np0005541913.localdomain systemd[1]: tmp-crun.HjbQ6E.mount: Deactivated successfully.
Dec 02 10:14:07 np0005541913.localdomain podman[333417]: 2025-12-02 10:14:07.443921215 +0000 UTC m=+0.085924082 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:14:07 np0005541913.localdomain podman[333417]: 2025-12-02 10:14:07.456955702 +0000 UTC m=+0.098958589 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 02 10:14:07 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:14:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e230 e230: 6 total, 6 up, 6 in
Dec 02 10:14:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/793956818' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:14:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3658023958' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3658023958' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:07 np0005541913.localdomain ceph-mon[298296]: osdmap e230: 6 total, 6 up, 6 in
Dec 02 10:14:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3646158002' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:14:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:08.373 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:08.373 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:08.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:08 np0005541913.localdomain ceph-mon[298296]: pgmap v561: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 5.2 KiB/s wr, 99 op/s
Dec 02 10:14:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/219637169' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/219637169' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:09.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1054534984' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1054534984' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e231 e231: 6 total, 6 up, 6 in
Dec 02 10:14:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:10.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:10 np0005541913.localdomain ceph-mon[298296]: pgmap v563: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 27 KiB/s wr, 89 op/s
Dec 02 10:14:10 np0005541913.localdomain ceph-mon[298296]: osdmap e231: 6 total, 6 up, 6 in
Dec 02 10:14:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:11.278 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:11.282 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:14:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/232663665' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:14:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/232663665' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:11.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:14:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/232663665' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/232663665' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:14:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:14:12 np0005541913.localdomain podman[333437]: 2025-12-02 10:14:12.460396929 +0000 UTC m=+0.090558855 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Dec 02 10:14:12 np0005541913.localdomain podman[333436]: 2025-12-02 10:14:12.435110595 +0000 UTC m=+0.071384004 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:14:12 np0005541913.localdomain podman[333436]: 2025-12-02 10:14:12.521113476 +0000 UTC m=+0.157386845 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 10:14:12 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:14:12 np0005541913.localdomain podman[333437]: 2025-12-02 10:14:12.538199252 +0000 UTC m=+0.168361168 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 02 10:14:12 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:14:12 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:14:12Z|00602|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 02 10:14:13 np0005541913.localdomain ceph-mon[298296]: pgmap v565: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 27 KiB/s wr, 89 op/s
Dec 02 10:14:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "10335e0e-f484-4bf5-b0cc-29a04393ec4e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "10335e0e-f484-4bf5-b0cc-29a04393ec4e", "format": "json"}]: dispatch
Dec 02 10:14:15 np0005541913.localdomain ceph-mon[298296]: pgmap v566: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.3 MiB/s rd, 27 KiB/s wr, 182 op/s
Dec 02 10:14:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "235a8d4c-ab29-4d51-b38b-3a594da63103", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "235a8d4c-ab29-4d51-b38b-3a594da63103", "format": "json"}]: dispatch
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.107 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.132 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.132 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f37bb8e3-8239-48bc-8433-55630bd11d1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.107540', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a856bf2c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': '9723340741946c8fcd7f107655aa97b7d737f672cd88005eb2c14d8c035a6660'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.107540', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a856c99a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': '0c7e9cf85351109bc035e3fc4fd8ec48e93335d90e68beef9a5289032e539c70'}]}, 'timestamp': '2025-12-02 10:14:16.132757', '_unique_id': 'c6c9ca5766b842cc9333b7abe5012d8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.137 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '989d654f-bb9d-46cc-9b20-128f0a2df808', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.134395', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a8578d6c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': 'f251a00ba28dad66b8feba7f0d322eafa5ecb9499aefc2f3891cec7fad4c587a'}]}, 'timestamp': '2025-12-02 10:14:16.137796', '_unique_id': '009aa6ecdca14817a3d5d172e8468624'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.148 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.148 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4725d00d-5aef-4cb1-b59d-a4b703003917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.138880', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a859366c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.357920107, 'message_signature': '70b2e85cb4e52d8b845406e3a6d583a77d51b661de8b37710da826b6b3873bc1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.138880', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8593fae-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.357920107, 'message_signature': '12db747c1e75361d4c856d52cc4bb2a6d32bd44671593ee431668f3feb365a9e'}]}, 'timestamp': '2025-12-02 10:14:16.148875', '_unique_id': 'e2c80c2b4dcc48ffa22a00b3aac6b3ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd465bc5-36a2-4850-877c-4b82da95462e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.149982', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a85972da-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': '73d0af520d76c7a6a71f6b7a011ae26cf128ccde90a9f0a9b82239a8428fb49b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.149982', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8597a3c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': '9ebadf4d50b7f64492a557bc428646e54f37e4841a3b07ab31119a5083125beb'}]}, 'timestamp': '2025-12-02 10:14:16.150371', '_unique_id': '07459e9e70c647708f0126dfe2582e9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.150 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.151 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.151 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.151 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.151 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd97bbe44-ce3d-4d3d-af31-59c09ef95b12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.151510', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a859aeee-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.357920107, 'message_signature': 'f40c58d2d140af5e20680b5673f90724e5bd79dafbc959af212b7605baf31d90'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.151510', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a859b63c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.357920107, 'message_signature': '2ba74f3c5315a0b560caa0653a48364882e37acb7e3b4920a120f41ca3664fb3'}]}, 'timestamp': '2025-12-02 10:14:16.151906', '_unique_id': 'c792a07e43b34ef3ab8a6efbdbef44a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.152 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.167 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 19640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e887434d-3d7a-4262-a4c5-10f8a2d7798b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19640000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:14:16.152899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a85c1dfa-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.386399996, 'message_signature': '42d0a87754bed12bd62444f99575fd12f7b371d83f269cdddb592b89e887faca'}]}, 'timestamp': '2025-12-02 10:14:16.167711', '_unique_id': '64969483ca794319a7d624e61473c254'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.168 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc5a2a7a-f05e-4e87-8b49-af72bdc84d93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.168827', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a85c5306-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': '78f88254559be46262428dbcbd5a5ff000337803a8bc22ea37c4b1435b20544f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.168827', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a85c5bf8-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': 'a765744e2a04a86265c31ead38e242efc87039f8cb67ed0a9594bebbc79a5646'}]}, 'timestamp': '2025-12-02 10:14:16.169259', '_unique_id': '5e6809f62b68445fb77b7beec736f7ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.169 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ad86f35-1bbb-461a-a117-31c77fa74224', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:14:16.170259', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a85c8ad8-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.386399996, 'message_signature': 'a48dab2d0643e0f7afd5cefeb892f23253a0379ae9b76f927bbb0606819c9af8'}]}, 'timestamp': '2025-12-02 10:14:16.170462', '_unique_id': 'c9daf1d9fd154f3eac787ca49254a74a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.170 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.171 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.171 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a95014c-3a7e-44ae-97cd-16a97a13b290', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.171441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a85cb8f0-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': '44fab6215b23fb9a5e2792bb7678016ddb9c89d95f66340451d177fc83e024ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.171441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a85cc0de-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': 'bb0525c0cb07109ff8b28ba6555e6301585c3883a5d6474138dd20b19d47af63'}]}, 'timestamp': '2025-12-02 10:14:16.171838', '_unique_id': '97448fb6fa964ca6bedd4964d3a359f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.173 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.173 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2da6ead-ed29-4be6-bc0c-3fda0dc0bf99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.173009', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a85cf806-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.357920107, 'message_signature': 'dbfe0fa0e50122da96e91beefcc6be662052e432d48e7bd8b44976d37407155e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.173009', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a85d01ca-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.357920107, 'message_signature': '8676bb19213844229cbd3f843214b317e2a4dc2e5d98a98ca1982990b9b07718'}]}, 'timestamp': '2025-12-02 10:14:16.173538', '_unique_id': 'fbce09f86e47423480d92fc057fde8c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.174 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa91af7b-7f8d-43b4-98b2-4d13897fb273', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.174706', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85d38a2-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': 'f5ea2312f206ad5a2f1ad312777b49b63e6d633a2244f6febf0bd620239d5f03'}]}, 'timestamp': '2025-12-02 10:14:16.174918', '_unique_id': '3165e548de4a4631b7dde255c364b770'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.175 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '950c3472-2fe4-4af5-a8e5-c3c7e64447e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.175867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a85d65ca-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': '33ab1073d47798292ec163f3d30d1746b3cd4d3edeb1061e7186bdf53f7b0749'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.175867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a85d6cd2-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': '19104fdeb9739b16a814863bb5ef0525c38940914072e8625f63f524b62b3325'}]}, 'timestamp': '2025-12-02 10:14:16.176240', '_unique_id': '8dd580276b884d6781eaa0b5173d4799'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.176 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea47b1b6-b889-4ec9-b579-ed43291bf564', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.177230', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85d9b30-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': '77a6c23262067e2c1d150a6814df116c20b1df7f43bf9ff797dbd889b2ab503e'}]}, 'timestamp': '2025-12-02 10:14:16.177441', '_unique_id': '821800e62082406d801e97f522ea753f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.177 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.178 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'faaea1a2-7975-455a-8d1b-bf56c7c09ec8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.178382', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85dc81c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': 'a2d5a64b38c6e528b41e74da7390ee7558dff62d1a5333b442853e81425d679c'}]}, 'timestamp': '2025-12-02 10:14:16.178590', '_unique_id': '2da665abab774a6eb89a25421bfb4c86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.179 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fcd09ec-9615-4b08-963b-3d23405d277d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.179595', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85df81e-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': '32be730c6244800a895ae62a5f7584371ea3ca7079d0130999d2c451f88ee54c'}]}, 'timestamp': '2025-12-02 10:14:16.179821', '_unique_id': '3929b4f0225e4ee3a11b7f1c0e4bee45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.180 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ed51565-9b5e-4281-a2f6-f9be2f9a16ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.180821', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85e276c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': '4d7d734603458d99e59f7b1bdb0215465721b1fa6b0e045cd7796315bedc3c80'}]}, 'timestamp': '2025-12-02 10:14:16.181050', '_unique_id': 'c8acf6c0bc94418e8d7b754bff21d542'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.181 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af01bc8e-0198-474f-9a6b-c2743b743211', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.182121', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85e5a16-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': 'c973c4175094f64d5cb60c7758222109271b72faedc70103cdcd7f9e424df75e'}]}, 'timestamp': '2025-12-02 10:14:16.182327', '_unique_id': '5f39666e9d004bb4be36d4f88a04d2fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.182 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1a52e21-c373-42fc-8be4-0595dcfd2c2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.183261', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85e86b2-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': 'd2188805f1fb9f3644eb46163e9e4999b5acd850d9521544b6f6dcaf687790ac'}]}, 'timestamp': '2025-12-02 10:14:16.183470', '_unique_id': '95806585c4a345be9b0467ff876128bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.183 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.184 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d3f55b7-65d9-4c5d-8137-fc72c0377061', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.184455', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85eb556-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': '3d55957b5b1094c2afee9ba410be9ef37c5b1ef278047f6e31bc88b65eecddba'}]}, 'timestamp': '2025-12-02 10:14:16.184679', '_unique_id': '27a9cc3d07e64fce9aa87def267355d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.185 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a104d50-a4ef-44e2-b438-b6ce8e561405', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:14:16.185766', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'a85ee88c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.353446467, 'message_signature': 'aa6d5c30ce56c930e74466a8e65789b0032bd13979754868276700bf5e5bcdd0'}]}, 'timestamp': '2025-12-02 10:14:16.185975', '_unique_id': '5fa9cf410ad544678be2cc1d8b09d04e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.186 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.187 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.187 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b9aa1db-a9c1-469d-b214-2390272a39c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:14:16.187209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a85f2108-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': 'd0e041c6e14079f37278356163876cd8cda6b63b86d285a14a0c519ee805be44'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:14:16.187209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a85f2810-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12618.326583072, 'message_signature': 'c4252962afd0894d87ef3e0d182a4dbfcd14f00d20ba38019a9bef4641bfc3be'}]}, 'timestamp': '2025-12-02 10:14:16.187586', '_unique_id': 'f876ee59c48646a6a877f225712a52cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:14:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:14:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:14:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:16.281 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e232 e232: 6 total, 6 up, 6 in
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: pgmap v567: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 22 KiB/s wr, 149 op/s
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3676005525' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-185695304", "format": "json"} : dispatch
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-185695304", "caps": ["mds", "allow rw path=/volumes/_nogroup/10335e0e-f484-4bf5-b0cc-29a04393ec4e/555d1535-2ead-4b78-97f7-0c5bf5ade719", "osd", "allow rw pool=manila_data namespace=fsvolumens_10335e0e-f484-4bf5-b0cc-29a04393ec4e", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-185695304", "caps": ["mds", "allow rw path=/volumes/_nogroup/10335e0e-f484-4bf5-b0cc-29a04393ec4e/555d1535-2ead-4b78-97f7-0c5bf5ade719", "osd", "allow rw pool=manila_data namespace=fsvolumens_10335e0e-f484-4bf5-b0cc-29a04393ec4e", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-185695304", "caps": ["mds", "allow rw path=/volumes/_nogroup/10335e0e-f484-4bf5-b0cc-29a04393ec4e/555d1535-2ead-4b78-97f7-0c5bf5ade719", "osd", "allow rw pool=manila_data namespace=fsvolumens_10335e0e-f484-4bf5-b0cc-29a04393ec4e", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: osdmap e232: 6 total, 6 up, 6 in
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3935642216' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "10335e0e-f484-4bf5-b0cc-29a04393ec4e", "auth_id": "tempest-cephx-id-185695304", "tenant_id": "5974c1b38c02486098e277d58b491dac", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "feb9c965-7b4c-4671-ab34-1817317dacc0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "feb9c965-7b4c-4671-ab34-1817317dacc0", "format": "json"}]: dispatch
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-185695304", "format": "json"} : dispatch
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-185695304"} : dispatch
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-185695304"} : dispatch
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-185695304"}]': finished
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e233 e233: 6 total, 6 up, 6 in
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "10335e0e-f484-4bf5-b0cc-29a04393ec4e", "auth_id": "tempest-cephx-id-185695304", "format": "json"}]: dispatch
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: pgmap v569: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.4 KiB/s wr, 81 op/s
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "10335e0e-f484-4bf5-b0cc-29a04393ec4e", "auth_id": "tempest-cephx-id-185695304", "format": "json"}]: dispatch
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "10335e0e-f484-4bf5-b0cc-29a04393ec4e", "format": "json"}]: dispatch
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "10335e0e-f484-4bf5-b0cc-29a04393ec4e", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ed816090-7c9e-4964-a11f-502383746c0b", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ed816090-7c9e-4964-a11f-502383746c0b", "format": "json"}]: dispatch
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: osdmap e233: 6 total, 6 up, 6 in
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/570460872' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/570460872' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e234 e234: 6 total, 6 up, 6 in
Dec 02 10:14:20 np0005541913.localdomain ceph-mon[298296]: pgmap v571: 177 pgs: 177 active+clean; 249 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 2.8 MiB/s wr, 157 op/s
Dec 02 10:14:20 np0005541913.localdomain ceph-mon[298296]: osdmap e234: 6 total, 6 up, 6 in
Dec 02 10:14:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/257893321' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/257893321' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e235 e235: 6 total, 6 up, 6 in
Dec 02 10:14:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:21.284 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:21 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "feb9c965-7b4c-4671-ab34-1817317dacc0", "format": "json"}]: dispatch
Dec 02 10:14:21 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "feb9c965-7b4c-4671-ab34-1817317dacc0", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:21 np0005541913.localdomain ceph-mon[298296]: osdmap e235: 6 total, 6 up, 6 in
Dec 02 10:14:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2158575046' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2158575046' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:21 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "590dca3f-4f85-48ff-a801-1b49410a7fa1", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "590dca3f-4f85-48ff-a801-1b49410a7fa1", "format": "json"}]: dispatch
Dec 02 10:14:22 np0005541913.localdomain ceph-mon[298296]: pgmap v574: 177 pgs: 177 active+clean; 249 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 4.9 MiB/s wr, 133 op/s
Dec 02 10:14:23 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e236 e236: 6 total, 6 up, 6 in
Dec 02 10:14:24 np0005541913.localdomain ceph-mon[298296]: pgmap v575: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 158 KiB/s rd, 3.8 MiB/s wr, 241 op/s
Dec 02 10:14:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "235a8d4c-ab29-4d51-b38b-3a594da63103", "format": "json"}]: dispatch
Dec 02 10:14:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "235a8d4c-ab29-4d51-b38b-3a594da63103", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:24 np0005541913.localdomain ceph-mon[298296]: osdmap e236: 6 total, 6 up, 6 in
Dec 02 10:14:25 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e237 e237: 6 total, 6 up, 6 in
Dec 02 10:14:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ed816090-7c9e-4964-a11f-502383746c0b", "format": "json"}]: dispatch
Dec 02 10:14:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ed816090-7c9e-4964-a11f-502383746c0b", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:26.288 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:26 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:14:26 np0005541913.localdomain podman[333484]: 2025-12-02 10:14:26.447386396 +0000 UTC m=+0.090490703 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0)
Dec 02 10:14:26 np0005541913.localdomain podman[333484]: 2025-12-02 10:14:26.458296537 +0000 UTC m=+0.101400824 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 02 10:14:26 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:14:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e238 e238: 6 total, 6 up, 6 in
Dec 02 10:14:26 np0005541913.localdomain ceph-mon[298296]: pgmap v577: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 78 KiB/s wr, 140 op/s
Dec 02 10:14:26 np0005541913.localdomain ceph-mon[298296]: osdmap e237: 6 total, 6 up, 6 in
Dec 02 10:14:26 np0005541913.localdomain ceph-mon[298296]: osdmap e238: 6 total, 6 up, 6 in
Dec 02 10:14:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e239 e239: 6 total, 6 up, 6 in
Dec 02 10:14:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f9ec3f6d-7d6e-4cd3-a305-e37d986864dd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f9ec3f6d-7d6e-4cd3-a305-e37d986864dd", "format": "json"}]: dispatch
Dec 02 10:14:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:28 np0005541913.localdomain ceph-mon[298296]: pgmap v580: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 78 KiB/s wr, 140 op/s
Dec 02 10:14:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "590dca3f-4f85-48ff-a801-1b49410a7fa1", "format": "json"}]: dispatch
Dec 02 10:14:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "590dca3f-4f85-48ff-a801-1b49410a7fa1", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:28 np0005541913.localdomain ceph-mon[298296]: osdmap e239: 6 total, 6 up, 6 in
Dec 02 10:14:29 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e240 e240: 6 total, 6 up, 6 in
Dec 02 10:14:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:14:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:14:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:14:30 np0005541913.localdomain podman[333505]: 2025-12-02 10:14:30.459566709 +0000 UTC m=+0.087109282 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 10:14:30 np0005541913.localdomain podman[333505]: 2025-12-02 10:14:30.470209813 +0000 UTC m=+0.097752416 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 10:14:30 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:14:30 np0005541913.localdomain podman[333504]: 2025-12-02 10:14:30.530480589 +0000 UTC m=+0.160736644 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true)
Dec 02 10:14:30 np0005541913.localdomain podman[333504]: 2025-12-02 10:14:30.565070591 +0000 UTC m=+0.195326696 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 02 10:14:30 np0005541913.localdomain systemd[1]: tmp-crun.2YUcE9.mount: Deactivated successfully.
Dec 02 10:14:30 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:14:30 np0005541913.localdomain podman[333506]: 2025-12-02 10:14:30.58568597 +0000 UTC m=+0.206563396 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:14:30 np0005541913.localdomain podman[333506]: 2025-12-02 10:14:30.599217671 +0000 UTC m=+0.220095107 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:14:30 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:14:30 np0005541913.localdomain ceph-mon[298296]: pgmap v582: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 95 KiB/s wr, 107 op/s
Dec 02 10:14:30 np0005541913.localdomain ceph-mon[298296]: osdmap e240: 6 total, 6 up, 6 in
Dec 02 10:14:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:31.289 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f9ec3f6d-7d6e-4cd3-a305-e37d986864dd", "format": "json"}]: dispatch
Dec 02 10:14:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f9ec3f6d-7d6e-4cd3-a305-e37d986864dd", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e241 e241: 6 total, 6 up, 6 in
Dec 02 10:14:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:14:32 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3513386139' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:14:32 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3513386139' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:32 np0005541913.localdomain ceph-mon[298296]: pgmap v584: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 95 KiB/s wr, 107 op/s
Dec 02 10:14:32 np0005541913.localdomain ceph-mon[298296]: osdmap e241: 6 total, 6 up, 6 in
Dec 02 10:14:32 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3513386139' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:32 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3513386139' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:33 np0005541913.localdomain sudo[333565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:14:33 np0005541913.localdomain sudo[333565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:14:33 np0005541913.localdomain sudo[333565]: pam_unix(sudo:session): session closed for user root
Dec 02 10:14:33 np0005541913.localdomain sudo[333583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:14:33 np0005541913.localdomain sudo[333583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:14:33 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e242 e242: 6 total, 6 up, 6 in
Dec 02 10:14:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:14:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:14:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:14:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:14:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:14:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:14:34 np0005541913.localdomain sudo[333583]: pam_unix(sudo:session): session closed for user root
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3236740117' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3236740117' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:34 np0005541913.localdomain sudo[333633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:14:34 np0005541913.localdomain sudo[333633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:14:34 np0005541913.localdomain sudo[333633]: pam_unix(sudo:session): session closed for user root
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: pgmap v586: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 141 KiB/s rd, 140 KiB/s wr, 196 op/s
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: osdmap e242: 6 total, 6 up, 6 in
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3236740117' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:34 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3236740117' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:35.767 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:14:35 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:35.768 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:14:35 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:35.817 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:35 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e243 e243: 6 total, 6 up, 6 in
Dec 02 10:14:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:14:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:14:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:14:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:14:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:14:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18786 "" "Go-http-client/1.1"
Dec 02 10:14:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:36.295 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e244 e244: 6 total, 6 up, 6 in
Dec 02 10:14:36 np0005541913.localdomain ceph-mon[298296]: pgmap v588: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 54 KiB/s wr, 99 op/s
Dec 02 10:14:36 np0005541913.localdomain ceph-mon[298296]: osdmap e243: 6 total, 6 up, 6 in
Dec 02 10:14:36 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:36 np0005541913.localdomain ceph-mon[298296]: osdmap e244: 6 total, 6 up, 6 in
Dec 02 10:14:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "format": "json"}]: dispatch
Dec 02 10:14:38 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3031033791' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:14:38 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3031033791' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:14:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:14:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:14:38 np0005541913.localdomain podman[333651]: 2025-12-02 10:14:38.460548019 +0000 UTC m=+0.089744153 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:14:38 np0005541913.localdomain podman[333651]: 2025-12-02 10:14:38.471887721 +0000 UTC m=+0.101083815 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:14:38 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:14:39 np0005541913.localdomain ceph-mon[298296]: pgmap v591: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 81 KiB/s rd, 61 KiB/s wr, 112 op/s
Dec 02 10:14:39 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:39.770 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:14:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Dec 02 10:14:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:14:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:14:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:14:40 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:14:40.662 263406 INFO neutron.agent.linux.ip_lib [None req-6ca1148e-65b8-484e-8841-ece0613bc433 - - - - - -] Device tapf119cdef-09 cannot be used as it has no MAC address
Dec 02 10:14:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:40.686 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:40 np0005541913.localdomain kernel: device tapf119cdef-09 entered promiscuous mode
Dec 02 10:14:40 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670480.6962] manager: (tapf119cdef-09): new Generic device (/org/freedesktop/NetworkManager/Devices/93)
Dec 02 10:14:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:40.698 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:14:40Z|00603|binding|INFO|Claiming lport f119cdef-0974-4d2c-8acd-8d7464640ca9 for this chassis.
Dec 02 10:14:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:14:40Z|00604|binding|INFO|f119cdef-0974-4d2c-8acd-8d7464640ca9: Claiming unknown
Dec 02 10:14:40 np0005541913.localdomain systemd-udevd[333681]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:14:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:40.709 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-8703a229-8c49-443e-95c6-aff62a358434', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8703a229-8c49-443e-95c6-aff62a358434', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd858413a9b01463f96545916d2abe5ab', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22d83034-71a8-46e9-a33a-f696e74c13f0, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=f119cdef-0974-4d2c-8acd-8d7464640ca9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:14:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:40.711 160221 INFO neutron.agent.ovn.metadata.agent [-] Port f119cdef-0974-4d2c-8acd-8d7464640ca9 in datapath 8703a229-8c49-443e-95c6-aff62a358434 bound to our chassis
Dec 02 10:14:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:40.713 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port eba8a1ff-9260-4962-baad-7ee950876ce0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:14:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:40.714 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8703a229-8c49-443e-95c6-aff62a358434, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:14:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:14:40.715 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[8e567293-9308-4275-af4a-f12ae6530c35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:14:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf119cdef-09: No such device
Dec 02 10:14:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:14:40Z|00605|binding|INFO|Setting lport f119cdef-0974-4d2c-8acd-8d7464640ca9 ovn-installed in OVS
Dec 02 10:14:40 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:14:40Z|00606|binding|INFO|Setting lport f119cdef-0974-4d2c-8acd-8d7464640ca9 up in Southbound
Dec 02 10:14:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:40.730 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf119cdef-09: No such device
Dec 02 10:14:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:40.732 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf119cdef-09: No such device
Dec 02 10:14:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf119cdef-09: No such device
Dec 02 10:14:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf119cdef-09: No such device
Dec 02 10:14:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf119cdef-09: No such device
Dec 02 10:14:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf119cdef-09: No such device
Dec 02 10:14:40 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tapf119cdef-09: No such device
Dec 02 10:14:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:40.777 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:40.810 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:41 np0005541913.localdomain ceph-mon[298296]: pgmap v592: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 28 KiB/s wr, 94 op/s
Dec 02 10:14:41 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve49", "tenant_id": "8f75117f8554499b9fbaa9c9062eeeef", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:14:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:41.295 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:41.299 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:41.571 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:41 np0005541913.localdomain podman[333752]: 2025-12-02 10:14:41.678072623 +0000 UTC m=+0.077612459 container create 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:14:41 np0005541913.localdomain systemd[1]: Started libpod-conmon-069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5.scope.
Dec 02 10:14:41 np0005541913.localdomain systemd[1]: tmp-crun.FXGEDv.mount: Deactivated successfully.
Dec 02 10:14:41 np0005541913.localdomain podman[333752]: 2025-12-02 10:14:41.630251678 +0000 UTC m=+0.029791574 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:14:41 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:14:41 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a0ee3792bce35a0d1020085525a61984484526151f073e1f67a4a8079d9209d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:14:41 np0005541913.localdomain podman[333752]: 2025-12-02 10:14:41.762782551 +0000 UTC m=+0.162322417 container init 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:14:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e245 e245: 6 total, 6 up, 6 in
Dec 02 10:14:41 np0005541913.localdomain podman[333752]: 2025-12-02 10:14:41.768600506 +0000 UTC m=+0.168140382 container start 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:14:41 np0005541913.localdomain dnsmasq[333770]: started, version 2.85 cachesize 150
Dec 02 10:14:41 np0005541913.localdomain dnsmasq[333770]: DNS service limited to local subnets
Dec 02 10:14:41 np0005541913.localdomain dnsmasq[333770]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:14:41 np0005541913.localdomain dnsmasq[333770]: warning: no upstream servers configured
Dec 02 10:14:41 np0005541913.localdomain dnsmasq-dhcp[333770]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:14:41 np0005541913.localdomain dnsmasq[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/addn_hosts - 0 addresses
Dec 02 10:14:41 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/host
Dec 02 10:14:41 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/opts
Dec 02 10:14:41 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:14:41.921 263406 INFO neutron.agent.dhcp.agent [None req-8ac12d5b-b1aa-4eb4-818a-48834904d48d - - - - - -] DHCP configuration for ports {'37cd0238-9054-48a1-8d6c-4a73284b3493'} is completed
Dec 02 10:14:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:14:42.554 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:14:42Z, description=, device_id=3692a4cb-56a0-4a89-90aa-c2a2654d3e13, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b58490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908b585e0>], id=7d96b16f-bbce-4bbd-b3cf-c85a927f8c04, ip_allocation=immediate, mac_address=fa:16:3e:6d:09:0a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:14:38Z, description=, dns_domain=, id=8703a229-8c49-443e-95c6-aff62a358434, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1306125232-network, port_security_enabled=True, project_id=d858413a9b01463f96545916d2abe5ab, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=770, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3448, status=ACTIVE, subnets=['9d626c62-851c-4a11-822f-bd4dadd5e8b1'], tags=[], tenant_id=d858413a9b01463f96545916d2abe5ab, updated_at=2025-12-02T10:14:38Z, vlan_transparent=None, network_id=8703a229-8c49-443e-95c6-aff62a358434, port_security_enabled=False, project_id=d858413a9b01463f96545916d2abe5ab, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3478, status=DOWN, tags=[], tenant_id=d858413a9b01463f96545916d2abe5ab, updated_at=2025-12-02T10:14:42Z on network 8703a229-8c49-443e-95c6-aff62a358434
Dec 02 10:14:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:14:42 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:14:42 np0005541913.localdomain podman[333772]: 2025-12-02 10:14:42.683635719 +0000 UTC m=+0.068001553 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 02 10:14:42 np0005541913.localdomain systemd[1]: tmp-crun.BBE2DB.mount: Deactivated successfully.
Dec 02 10:14:42 np0005541913.localdomain podman[333817]: 2025-12-02 10:14:42.753247014 +0000 UTC m=+0.056210038 container kill 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 02 10:14:42 np0005541913.localdomain dnsmasq[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/addn_hosts - 1 addresses
Dec 02 10:14:42 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/host
Dec 02 10:14:42 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/opts
Dec 02 10:14:42 np0005541913.localdomain ceph-mon[298296]: pgmap v593: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 23 KiB/s wr, 77 op/s
Dec 02 10:14:42 np0005541913.localdomain ceph-mon[298296]: osdmap e245: 6 total, 6 up, 6 in
Dec 02 10:14:42 np0005541913.localdomain podman[333772]: 2025-12-02 10:14:42.775044786 +0000 UTC m=+0.159410650 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:14:42 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:14:42 np0005541913.localdomain podman[333771]: 2025-12-02 10:14:42.81084427 +0000 UTC m=+0.195987774 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:14:42 np0005541913.localdomain podman[333771]: 2025-12-02 10:14:42.821959476 +0000 UTC m=+0.207102970 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:14:42 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:14:42 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:14:42.979 263406 INFO neutron.agent.dhcp.agent [None req-5ae50839-ca1e-474b-a69d-832c8f395340 - - - - - -] DHCP configuration for ports {'7d96b16f-bbce-4bbd-b3cf-c85a927f8c04'} is completed
Dec 02 10:14:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Dec 02 10:14:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:14:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:14:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:14:44 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:14:44.054 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:14:42Z, description=, device_id=3692a4cb-56a0-4a89-90aa-c2a2654d3e13, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089c4190>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99089c4d00>], id=7d96b16f-bbce-4bbd-b3cf-c85a927f8c04, ip_allocation=immediate, mac_address=fa:16:3e:6d:09:0a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:14:38Z, description=, dns_domain=, id=8703a229-8c49-443e-95c6-aff62a358434, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1306125232-network, port_security_enabled=True, project_id=d858413a9b01463f96545916d2abe5ab, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=770, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3448, status=ACTIVE, subnets=['9d626c62-851c-4a11-822f-bd4dadd5e8b1'], tags=[], tenant_id=d858413a9b01463f96545916d2abe5ab, updated_at=2025-12-02T10:14:38Z, vlan_transparent=None, network_id=8703a229-8c49-443e-95c6-aff62a358434, port_security_enabled=False, project_id=d858413a9b01463f96545916d2abe5ab, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3478, status=DOWN, tags=[], tenant_id=d858413a9b01463f96545916d2abe5ab, updated_at=2025-12-02T10:14:42Z on network 8703a229-8c49-443e-95c6-aff62a358434
Dec 02 10:14:44 np0005541913.localdomain podman[333874]: 2025-12-02 10:14:44.266535222 +0000 UTC m=+0.060425321 container kill 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:14:44 np0005541913.localdomain dnsmasq[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/addn_hosts - 1 addresses
Dec 02 10:14:44 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/host
Dec 02 10:14:44 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/opts
Dec 02 10:14:44 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:14:44.526 263406 INFO neutron.agent.dhcp.agent [None req-63405262-5246-4eec-a3b7-b293c01031be - - - - - -] DHCP configuration for ports {'7d96b16f-bbce-4bbd-b3cf-c85a927f8c04'} is completed
Dec 02 10:14:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve48", "tenant_id": "8f75117f8554499b9fbaa9c9062eeeef", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:14:44 np0005541913.localdomain ceph-mon[298296]: pgmap v595: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 64 KiB/s wr, 80 op/s
Dec 02 10:14:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:46.300 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:46 np0005541913.localdomain ceph-mon[298296]: pgmap v596: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 55 KiB/s wr, 69 op/s
Dec 02 10:14:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Dec 02 10:14:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Dec 02 10:14:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Dec 02 10:14:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Dec 02 10:14:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "334c2711-d0f6-419e-922d-408205cc4ec2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "334c2711-d0f6-419e-922d-408205cc4ec2", "format": "json"}]: dispatch
Dec 02 10:14:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve48", "format": "json"}]: dispatch
Dec 02 10:14:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve48", "format": "json"}]: dispatch
Dec 02 10:14:48 np0005541913.localdomain ceph-mon[298296]: pgmap v597: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 47 KiB/s wr, 58 op/s
Dec 02 10:14:48 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:49 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "25a2b72d-c9e0-4927-869c-054b3b3fd314", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:49 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "25a2b72d-c9e0-4927-869c-054b3b3fd314", "format": "json"}]: dispatch
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.146092) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670490146169, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2301, "num_deletes": 265, "total_data_size": 3072457, "memory_usage": 3119864, "flush_reason": "Manual Compaction"}
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670490160564, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1997238, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 32012, "largest_seqno": 34308, "table_properties": {"data_size": 1987914, "index_size": 5705, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 22490, "raw_average_key_size": 22, "raw_value_size": 1968271, "raw_average_value_size": 1948, "num_data_blocks": 246, "num_entries": 1010, "num_filter_entries": 1010, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670381, "oldest_key_time": 1764670381, "file_creation_time": 1764670490, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 14558 microseconds, and 6376 cpu microseconds.
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.160659) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1997238 bytes OK
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.160683) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.163466) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.163490) EVENT_LOG_v1 {"time_micros": 1764670490163483, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.163513) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3061557, prev total WAL file size 3061557, number of live WAL files 2.
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.164550) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1950KB)], [54(17MB)]
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670490164643, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 20451874, "oldest_snapshot_seqno": -1}
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 13998 keys, 18903857 bytes, temperature: kUnknown
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670490269047, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 18903857, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18821342, "index_size": 46446, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35013, "raw_key_size": 374706, "raw_average_key_size": 26, "raw_value_size": 18580746, "raw_average_value_size": 1327, "num_data_blocks": 1750, "num_entries": 13998, "num_filter_entries": 13998, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670490, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.269460) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 18903857 bytes
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.271358) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.7 rd, 180.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 17.6 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(19.7) write-amplify(9.5) OK, records in: 14541, records dropped: 543 output_compression: NoCompression
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.271389) EVENT_LOG_v1 {"time_micros": 1764670490271376, "job": 32, "event": "compaction_finished", "compaction_time_micros": 104516, "compaction_time_cpu_micros": 52126, "output_level": 6, "num_output_files": 1, "total_output_size": 18903857, "num_input_records": 14541, "num_output_records": 13998, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670490272095, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670490275033, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.164444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.275120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.275126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.275129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.275132) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:14:50.275135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: pgmap v598: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 90 KiB/s wr, 14 op/s
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "334c2711-d0f6-419e-922d-408205cc4ec2", "snap_name": "4297d647-9a4c-4f1f-9f4b-d5919a5d649a", "format": "json"}]: dispatch
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve47", "tenant_id": "8f75117f8554499b9fbaa9c9062eeeef", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:14:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f/0f93d180-183a-4fa4-8649-7ba3ef8441e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:14:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:51.304 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:51.308 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:52 np0005541913.localdomain ceph-mon[298296]: pgmap v599: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 90 KiB/s wr, 14 op/s
Dec 02 10:14:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "25a2b72d-c9e0-4927-869c-054b3b3fd314", "format": "json"}]: dispatch
Dec 02 10:14:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "25a2b72d-c9e0-4927-869c-054b3b3fd314", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2237002208' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e246 e246: 6 total, 6 up, 6 in
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ff6ef38f-1e0f-40c8-83c7-811b055e05a4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ff6ef38f-1e0f-40c8-83c7-811b055e05a4", "format": "json"}]: dispatch
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2237002208' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Dec 02 10:14:53 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Dec 02 10:14:54 np0005541913.localdomain ceph-mon[298296]: pgmap v600: 177 pgs: 177 active+clean; 252 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 2.0 MiB/s wr, 52 op/s
Dec 02 10:14:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "794bbe46-18dc-46bb-ae82-7c247e68f409", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "794bbe46-18dc-46bb-ae82-7c247e68f409", "format": "json"}]: dispatch
Dec 02 10:14:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve47", "format": "json"}]: dispatch
Dec 02 10:14:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Dec 02 10:14:54 np0005541913.localdomain ceph-mon[298296]: osdmap e246: 6 total, 6 up, 6 in
Dec 02 10:14:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve47", "format": "json"}]: dispatch
Dec 02 10:14:54 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e247 e247: 6 total, 6 up, 6 in
Dec 02 10:14:55 np0005541913.localdomain ceph-mon[298296]: osdmap e247: 6 total, 6 up, 6 in
Dec 02 10:14:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:14:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:56.307 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:14:56 np0005541913.localdomain ceph-mon[298296]: pgmap v603: 177 pgs: 177 active+clean; 252 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.8 MiB/s wr, 73 op/s
Dec 02 10:14:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ec95b4f7-9427-4d16-81f2-f3cade322496", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:14:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ec95b4f7-9427-4d16-81f2-f3cade322496", "format": "json"}]: dispatch
Dec 02 10:14:57 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:14:57 np0005541913.localdomain podman[333895]: 2025-12-02 10:14:57.444601735 +0000 UTC m=+0.084895614 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 10:14:57 np0005541913.localdomain podman[333895]: 2025-12-02 10:14:57.487275322 +0000 UTC m=+0.127569111 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:14:57 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:14:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "794bbe46-18dc-46bb-ae82-7c247e68f409", "format": "json"}]: dispatch
Dec 02 10:14:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "794bbe46-18dc-46bb-ae82-7c247e68f409", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:57 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Dec 02 10:14:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Dec 02 10:14:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Dec 02 10:14:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Dec 02 10:14:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1466419913' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:14:59 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e248 e248: 6 total, 6 up, 6 in
Dec 02 10:14:59 np0005541913.localdomain ceph-mon[298296]: pgmap v604: 177 pgs: 177 active+clean; 252 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 2.7 MiB/s wr, 57 op/s
Dec 02 10:14:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve49", "format": "json"}]: dispatch
Dec 02 10:14:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "auth_id": "eve49", "format": "json"}]: dispatch
Dec 02 10:14:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "format": "json"}]: dispatch
Dec 02 10:14:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "aa1ea9b3-cd6e-4bc7-a88f-b8893a4beb4f", "force": true, "format": "json"}]: dispatch
Dec 02 10:14:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:14:59.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ec95b4f7-9427-4d16-81f2-f3cade322496", "format": "json"}]: dispatch
Dec 02 10:15:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ec95b4f7-9427-4d16-81f2-f3cade322496", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:00 np0005541913.localdomain ceph-mon[298296]: osdmap e248: 6 total, 6 up, 6 in
Dec 02 10:15:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e249 e249: 6 total, 6 up, 6 in
Dec 02 10:15:01 np0005541913.localdomain ceph-mon[298296]: pgmap v606: 177 pgs: 177 active+clean; 299 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.7 MiB/s wr, 75 op/s
Dec 02 10:15:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:01.308 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:01.312 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:15:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:15:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:15:01 np0005541913.localdomain podman[333916]: 2025-12-02 10:15:01.454697669 +0000 UTC m=+0.087861922 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 10:15:01 np0005541913.localdomain podman[333917]: 2025-12-02 10:15:01.509457239 +0000 UTC m=+0.142604862 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:15:01 np0005541913.localdomain podman[333916]: 2025-12-02 10:15:01.517265896 +0000 UTC m=+0.150430129 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm)
Dec 02 10:15:01 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:15:01 np0005541913.localdomain podman[333915]: 2025-12-02 10:15:01.554843258 +0000 UTC m=+0.191577647 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:15:01 np0005541913.localdomain podman[333915]: 2025-12-02 10:15:01.563977581 +0000 UTC m=+0.200712010 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:15:01 np0005541913.localdomain podman[333917]: 2025-12-02 10:15:01.575319433 +0000 UTC m=+0.208467036 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:15:01 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:15:01 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:15:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:15:01 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2758567838' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:15:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:15:01 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2758567838' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:15:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "95ddc110-cc3c-4c61-8c87-bf390fb060a5", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "95ddc110-cc3c-4c61-8c87-bf390fb060a5", "format": "json"}]: dispatch
Dec 02 10:15:02 np0005541913.localdomain ceph-mon[298296]: osdmap e249: 6 total, 6 up, 6 in
Dec 02 10:15:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2758567838' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:15:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2758567838' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:15:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:02.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:02.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:15:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:15:03.058 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:15:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:15:03.058 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:15:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:15:03.059 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:15:03 np0005541913.localdomain ceph-mon[298296]: pgmap v608: 177 pgs: 177 active+clean; 299 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 3.5 MiB/s wr, 72 op/s
Dec 02 10:15:03 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ff6ef38f-1e0f-40c8-83c7-811b055e05a4", "format": "json"}]: dispatch
Dec 02 10:15:03 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ff6ef38f-1e0f-40c8-83c7-811b055e05a4", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e250 e250: 6 total, 6 up, 6 in
Dec 02 10:15:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:03.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:03.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:15:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:03.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:15:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:03.920 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:15:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:03.921 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:15:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:03.921 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:15:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:03.922 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:15:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:15:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:15:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:15:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:15:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: osdmap e250: 6 total, 6 up, 6 in
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: pgmap v610: 177 pgs: 177 active+clean; 299 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 7.2 MiB/s rd, 7.3 MiB/s wr, 200 op/s
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "95ddc110-cc3c-4c61-8c87-bf390fb060a5", "format": "json"}]: dispatch
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "95ddc110-cc3c-4c61-8c87-bf390fb060a5", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/878020292' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/878020292' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.443 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.459 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.460 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.460 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.476 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.476 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.477 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.477 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.478 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:15:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/154205066' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:04.981 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.055 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.056 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/878020292' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/878020292' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "format": "json"}]: dispatch
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/154205066' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "9fcb2cae-930f-42ab-bc64-d18acc6b4eec", "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e251 e251: 6 total, 6 up, 6 in
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.247 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.248 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11070MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.249 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.249 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.331 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.331 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.331 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.365 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:15:05 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/153256759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.808 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:15:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:05.815 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:15:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:15:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:15:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:06.051 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:15:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:06.054 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:15:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:06.054 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:15:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:15:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156107 "" "Go-http-client/1.1"
Dec 02 10:15:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:15:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19260 "" "Go-http-client/1.1"
Dec 02 10:15:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:06.312 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:06.315 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:06 np0005541913.localdomain ceph-mon[298296]: osdmap e251: 6 total, 6 up, 6 in
Dec 02 10:15:06 np0005541913.localdomain ceph-mon[298296]: pgmap v612: 177 pgs: 177 active+clean; 299 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.6 MiB/s wr, 124 op/s
Dec 02 10:15:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/153256759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e252 e252: 6 total, 6 up, 6 in
Dec 02 10:15:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3613759236' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:15:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3613759236' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:15:07 np0005541913.localdomain ceph-mon[298296]: osdmap e252: 6 total, 6 up, 6 in
Dec 02 10:15:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:07 np0005541913.localdomain ceph-mgr[288059]: client.0 ms_handle_reset on v2:172.18.0.108:6810/4212177170
Dec 02 10:15:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "874ecabd-a028-4aa4-9a5c-9d15f18fe0c8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "874ecabd-a028-4aa4-9a5c-9d15f18fe0c8", "format": "json"}]: dispatch
Dec 02 10:15:08 np0005541913.localdomain ceph-mon[298296]: pgmap v614: 177 pgs: 177 active+clean; 299 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.6 MiB/s wr, 124 op/s
Dec 02 10:15:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2291623585' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2041560746' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:08 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:15:08.864 2 INFO neutron.agent.securitygroups_rpc [None req-e4074800-d361-45b9-b812-e8981daf28f3 0e5c738ba752455b908099b234a743a2 d858413a9b01463f96545916d2abe5ab - - default default] Security group rule updated ['10785715-ddea-43bb-82fa-9f44a2fb1faa']
Dec 02 10:15:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:15:09 np0005541913.localdomain systemd[1]: tmp-crun.aho2ke.mount: Deactivated successfully.
Dec 02 10:15:09 np0005541913.localdomain podman[334019]: 2025-12-02 10:15:09.197896684 +0000 UTC m=+0.099332018 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:15:09 np0005541913.localdomain podman[334019]: 2025-12-02 10:15:09.212304498 +0000 UTC m=+0.113739822 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Dec 02 10:15:09 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:15:09 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:15:09.250 2 INFO neutron.agent.securitygroups_rpc [None req-ee552935-4da7-44ca-8e38-6eb6181199e8 0e5c738ba752455b908099b234a743a2 d858413a9b01463f96545916d2abe5ab - - default default] Security group rule updated ['10785715-ddea-43bb-82fa-9f44a2fb1faa']
Dec 02 10:15:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:09.422 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:09.422 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:09.423 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "9fcb2cae-930f-42ab-bc64-d18acc6b4eec", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d194b0f5-d0ac-4694-aaca-c67668af8e04", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d194b0f5-d0ac-4694-aaca-c67668af8e04", "format": "json"}]: dispatch
Dec 02 10:15:10 np0005541913.localdomain ceph-mon[298296]: pgmap v615: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 62 KiB/s wr, 80 op/s
Dec 02 10:15:10 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:10 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "format": "json"}]: dispatch
Dec 02 10:15:10 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:11.317 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:11.321 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e253 e253: 6 total, 6 up, 6 in
Dec 02 10:15:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:11.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "874ecabd-a028-4aa4-9a5c-9d15f18fe0c8", "format": "json"}]: dispatch
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "874ecabd-a028-4aa4-9a5c-9d15f18fe0c8", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: osdmap e253: 6 total, 6 up, 6 in
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/d194b0f5-d0ac-4694-aaca-c67668af8e04/f0230cb5-166a-4bc3-a680-7635315554d3", "osd", "allow rw pool=manila_data namespace=fsvolumens_d194b0f5-d0ac-4694-aaca-c67668af8e04", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/d194b0f5-d0ac-4694-aaca-c67668af8e04/f0230cb5-166a-4bc3-a680-7635315554d3", "osd", "allow rw pool=manila_data namespace=fsvolumens_d194b0f5-d0ac-4694-aaca-c67668af8e04", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:11 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/d194b0f5-d0ac-4694-aaca-c67668af8e04/f0230cb5-166a-4bc3-a680-7635315554d3", "osd", "allow rw pool=manila_data namespace=fsvolumens_d194b0f5-d0ac-4694-aaca-c67668af8e04", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:12.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:15:12 np0005541913.localdomain ceph-mon[298296]: pgmap v616: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 47 KiB/s wr, 61 op/s
Dec 02 10:15:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "20d14646-9b62-4b24-984f-6434ad453069", "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d194b0f5-d0ac-4694-aaca-c67668af8e04", "auth_id": "tempest-cephx-id-1696860369", "tenant_id": "82d5a09e66904b8ca3c7a7850f1e5c52", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:15:13 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:15:13 np0005541913.localdomain podman[334039]: 2025-12-02 10:15:13.445592451 +0000 UTC m=+0.078819401 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller)
Dec 02 10:15:13 np0005541913.localdomain podman[334038]: 2025-12-02 10:15:13.507402379 +0000 UTC m=+0.139623742 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:15:13 np0005541913.localdomain podman[334039]: 2025-12-02 10:15:13.51421115 +0000 UTC m=+0.147438130 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 02 10:15:13 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:15:13 np0005541913.localdomain podman[334038]: 2025-12-02 10:15:13.566595366 +0000 UTC m=+0.198816719 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 10:15:13 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:15:13 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:15:13.700 2 INFO neutron.agent.securitygroups_rpc [req-a541e13d-87f6-4580-832f-af5d7aef99a4 req-c15ffc3e-ba6d-409e-8103-3b4ea0d7e66e 0e5c738ba752455b908099b234a743a2 d858413a9b01463f96545916d2abe5ab - - default default] Security group member updated ['10785715-ddea-43bb-82fa-9f44a2fb1faa']
Dec 02 10:15:13 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:15:13.826 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:15:13Z, description=, device_id=e4135ac9-548a-4e8d-99d6-cde8dedb2c77, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d0430>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087d0d90>], id=5312b3e8-70f6-4e16-95ba-31b46130d41f, ip_allocation=immediate, mac_address=fa:16:3e:77:0c:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:14:38Z, description=, dns_domain=, id=8703a229-8c49-443e-95c6-aff62a358434, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1306125232-network, port_security_enabled=True, project_id=d858413a9b01463f96545916d2abe5ab, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=770, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3448, status=ACTIVE, subnets=['9d626c62-851c-4a11-822f-bd4dadd5e8b1'], tags=[], tenant_id=d858413a9b01463f96545916d2abe5ab, updated_at=2025-12-02T10:14:38Z, vlan_transparent=None, network_id=8703a229-8c49-443e-95c6-aff62a358434, port_security_enabled=True, project_id=d858413a9b01463f96545916d2abe5ab, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['10785715-ddea-43bb-82fa-9f44a2fb1faa'], standard_attr_id=3579, status=DOWN, tags=[], tenant_id=d858413a9b01463f96545916d2abe5ab, updated_at=2025-12-02T10:15:13Z on network 8703a229-8c49-443e-95c6-aff62a358434
Dec 02 10:15:13 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/703101738' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:13 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:13 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:14 np0005541913.localdomain dnsmasq[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/addn_hosts - 2 addresses
Dec 02 10:15:14 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/host
Dec 02 10:15:14 np0005541913.localdomain podman[334104]: 2025-12-02 10:15:14.062586574 +0000 UTC m=+0.068743363 container kill 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 02 10:15:14 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/opts
Dec 02 10:15:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:15:14.324 263406 INFO neutron.agent.dhcp.agent [None req-d128d369-b0b2-4225-a1b6-72f55a995efa - - - - - -] DHCP configuration for ports {'5312b3e8-70f6-4e16-95ba-31b46130d41f'} is completed
Dec 02 10:15:14 np0005541913.localdomain systemd[1]: tmp-crun.Q3Q78u.mount: Deactivated successfully.
Dec 02 10:15:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:15:14.567 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005541914.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:15:13Z, description=, device_id=e4135ac9-548a-4e8d-99d6-cde8dedb2c77, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087eebb0>], dns_domain=, dns_name=tempest-volumesbackupstest-instance-296444076, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99087ee040>], id=5312b3e8-70f6-4e16-95ba-31b46130d41f, ip_allocation=immediate, mac_address=fa:16:3e:77:0c:21, name=, network_id=8703a229-8c49-443e-95c6-aff62a358434, port_security_enabled=True, project_id=d858413a9b01463f96545916d2abe5ab, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['10785715-ddea-43bb-82fa-9f44a2fb1faa'], standard_attr_id=3579, status=DOWN, tags=[], tenant_id=d858413a9b01463f96545916d2abe5ab, updated_at=2025-12-02T10:15:14Z on network 8703a229-8c49-443e-95c6-aff62a358434
Dec 02 10:15:14 np0005541913.localdomain dnsmasq[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/addn_hosts - 2 addresses
Dec 02 10:15:14 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/host
Dec 02 10:15:14 np0005541913.localdomain podman[334141]: 2025-12-02 10:15:14.756494145 +0000 UTC m=+0.049591932 container kill 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 02 10:15:14 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/opts
Dec 02 10:15:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "07b7e455-1272-48fc-92f9-fd54c3fafcb0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:14 np0005541913.localdomain ceph-mon[298296]: pgmap v618: 177 pgs: 177 active+clean; 208 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 138 KiB/s wr, 67 op/s
Dec 02 10:15:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "07b7e455-1272-48fc-92f9-fd54c3fafcb0", "format": "json"}]: dispatch
Dec 02 10:15:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f9911edf-08c0-404a-9b15-1750f599217e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f9911edf-08c0-404a-9b15-1750f599217e", "format": "json"}]: dispatch
Dec 02 10:15:14 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:15:14.947 263406 INFO neutron.agent.dhcp.agent [None req-93515900-132c-4ee3-b665-942fd11f9c32 - - - - - -] DHCP configuration for ports {'5312b3e8-70f6-4e16-95ba-31b46130d41f'} is completed
Dec 02 10:15:15 np0005541913.localdomain sshd[334163]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:15:15 np0005541913.localdomain sshd[334163]: Received disconnect from 193.46.255.103 port 15278:11:  [preauth]
Dec 02 10:15:15 np0005541913.localdomain sshd[334163]: Disconnected from authenticating user root 193.46.255.103 port 15278 [preauth]
Dec 02 10:15:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "20d14646-9b62-4b24-984f-6434ad453069", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d194b0f5-d0ac-4694-aaca-c67668af8e04", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:15 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"}]': finished
Dec 02 10:15:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2486108379' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:16.322 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:15:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:16.324 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:16.325 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:15:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:16.325 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:15:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:16.325 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:15:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:16.327 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: pgmap v619: 177 pgs: 177 active+clean; 208 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 131 KiB/s wr, 63 op/s
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d194b0f5-d0ac-4694-aaca-c67668af8e04", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d194b0f5-d0ac-4694-aaca-c67668af8e04", "format": "json"}]: dispatch
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d194b0f5-d0ac-4694-aaca-c67668af8e04", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3419331710' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/07b7e455-1272-48fc-92f9-fd54c3fafcb0/1c901b4c-031b-4c12-b1cb-8ac5e6296378", "osd", "allow rw pool=manila_data namespace=fsvolumens_07b7e455-1272-48fc-92f9-fd54c3fafcb0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/07b7e455-1272-48fc-92f9-fd54c3fafcb0/1c901b4c-031b-4c12-b1cb-8ac5e6296378", "osd", "allow rw pool=manila_data namespace=fsvolumens_07b7e455-1272-48fc-92f9-fd54c3fafcb0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:16 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/07b7e455-1272-48fc-92f9-fd54c3fafcb0/1c901b4c-031b-4c12-b1cb-8ac5e6296378", "osd", "allow rw pool=manila_data namespace=fsvolumens_07b7e455-1272-48fc-92f9-fd54c3fafcb0", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:17 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "07b7e455-1272-48fc-92f9-fd54c3fafcb0", "auth_id": "Joe", "tenant_id": "0fe90f11d3f64e12b3591732792a929e", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:17 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f9911edf-08c0-404a-9b15-1750f599217e", "format": "json"}]: dispatch
Dec 02 10:15:17 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f9911edf-08c0-404a-9b15-1750f599217e", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:19 np0005541913.localdomain ceph-mon[298296]: pgmap v620: 177 pgs: 177 active+clean; 208 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 111 KiB/s wr, 53 op/s
Dec 02 10:15:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f6f1c661-5fb8-4466-9254-d282f758f450", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f6f1c661-5fb8-4466-9254-d282f758f450", "format": "json"}]: dispatch
Dec 02 10:15:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2971877282' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:20 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4b34f061-715a-44a3-9eab-41d055e085ea", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:20 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4b34f061-715a-44a3-9eab-41d055e085ea", "format": "json"}]: dispatch
Dec 02 10:15:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3740406768' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:21 np0005541913.localdomain ceph-mon[298296]: pgmap v621: 177 pgs: 177 active+clean; 255 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 2.3 MiB/s wr, 57 op/s
Dec 02 10:15:21 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:21 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "format": "json"}]: dispatch
Dec 02 10:15:21 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:21 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:21.326 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ff1ce34f-180e-4cd7-80c2-e7cc0a1e2489", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ff1ce34f-180e-4cd7-80c2-e7cc0a1e2489", "format": "json"}]: dispatch
Dec 02 10:15:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/4b34f061-715a-44a3-9eab-41d055e085ea/952959d8-8df4-478f-98b8-ef136b3959a9", "osd", "allow rw pool=manila_data namespace=fsvolumens_4b34f061-715a-44a3-9eab-41d055e085ea", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/4b34f061-715a-44a3-9eab-41d055e085ea/952959d8-8df4-478f-98b8-ef136b3959a9", "osd", "allow rw pool=manila_data namespace=fsvolumens_4b34f061-715a-44a3-9eab-41d055e085ea", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:22 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/4b34f061-715a-44a3-9eab-41d055e085ea/952959d8-8df4-478f-98b8-ef136b3959a9", "osd", "allow rw pool=manila_data namespace=fsvolumens_4b34f061-715a-44a3-9eab-41d055e085ea", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:23 np0005541913.localdomain ceph-mon[298296]: pgmap v622: 177 pgs: 177 active+clean; 255 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 2.3 MiB/s wr, 57 op/s
Dec 02 10:15:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f6f1c661-5fb8-4466-9254-d282f758f450", "format": "json"}]: dispatch
Dec 02 10:15:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f6f1c661-5fb8-4466-9254-d282f758f450", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4b34f061-715a-44a3-9eab-41d055e085ea", "auth_id": "tempest-cephx-id-1696860369", "tenant_id": "82d5a09e66904b8ca3c7a7850f1e5c52", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "auth_id": "Joe", "tenant_id": "3212fac1e026474b9022ee93e4d925a9", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:24 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 02 10:15:25 np0005541913.localdomain ceph-mon[298296]: pgmap v623: 177 pgs: 177 active+clean; 256 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 2.1 MiB/s wr, 119 op/s
Dec 02 10:15:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ff1ce34f-180e-4cd7-80c2-e7cc0a1e2489", "format": "json"}]: dispatch
Dec 02 10:15:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ff1ce34f-180e-4cd7-80c2-e7cc0a1e2489", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:25 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"}]': finished
Dec 02 10:15:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "816532a3-40a4-4c5f-a808-14898d84932f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "816532a3-40a4-4c5f-a808-14898d84932f", "format": "json"}]: dispatch
Dec 02 10:15:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4b34f061-715a-44a3-9eab-41d055e085ea", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4b34f061-715a-44a3-9eab-41d055e085ea", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4b34f061-715a-44a3-9eab-41d055e085ea", "format": "json"}]: dispatch
Dec 02 10:15:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4b34f061-715a-44a3-9eab-41d055e085ea", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:26.328 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:15:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:27 np0005541913.localdomain ceph-mon[298296]: pgmap v624: 177 pgs: 177 active+clean; 256 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 1.9 MiB/s wr, 110 op/s
Dec 02 10:15:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2071519372", "format": "json"} : dispatch
Dec 02 10:15:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2071519372", "caps": ["mds", "allow rw path=/volumes/_nogroup/a1ba20ee-ed37-461f-8a6b-289e0637343e/44bd8d01-8657-4e23-ba40-e9561a6ed94b", "osd", "allow rw pool=manila_data namespace=fsvolumens_a1ba20ee-ed37-461f-8a6b-289e0637343e", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2071519372", "caps": ["mds", "allow rw path=/volumes/_nogroup/a1ba20ee-ed37-461f-8a6b-289e0637343e/44bd8d01-8657-4e23-ba40-e9561a6ed94b", "osd", "allow rw pool=manila_data namespace=fsvolumens_a1ba20ee-ed37-461f-8a6b-289e0637343e", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:27 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2071519372", "caps": ["mds", "allow rw path=/volumes/_nogroup/a1ba20ee-ed37-461f-8a6b-289e0637343e/44bd8d01-8657-4e23-ba40-e9561a6ed94b", "osd", "allow rw pool=manila_data namespace=fsvolumens_a1ba20ee-ed37-461f-8a6b-289e0637343e", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "auth_id": "tempest-cephx-id-2071519372", "tenant_id": "3212fac1e026474b9022ee93e4d925a9", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fb9dc736-c0fd-42af-8ddc-944e8a1e50c5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fb9dc736-c0fd-42af-8ddc-944e8a1e50c5", "format": "json"}]: dispatch
Dec 02 10:15:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "334c2711-d0f6-419e-922d-408205cc4ec2", "snap_name": "4297d647-9a4c-4f1f-9f4b-d5919a5d649a_de240f12-a8e2-4a29-90c2-24d0d5497a6c", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "334c2711-d0f6-419e-922d-408205cc4ec2", "snap_name": "4297d647-9a4c-4f1f-9f4b-d5919a5d649a", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:28 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:15:28 np0005541913.localdomain podman[334165]: 2025-12-02 10:15:28.458685744 +0000 UTC m=+0.086848296 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 10:15:28 np0005541913.localdomain podman[334165]: 2025-12-02 10:15:28.474120804 +0000 UTC m=+0.102283356 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 02 10:15:28 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:15:29 np0005541913.localdomain ceph-mon[298296]: pgmap v625: 177 pgs: 177 active+clean; 256 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 1.9 MiB/s wr, 110 op/s
Dec 02 10:15:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "format": "json"}]: dispatch
Dec 02 10:15:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "816532a3-40a4-4c5f-a808-14898d84932f", "format": "json"}]: dispatch
Dec 02 10:15:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "816532a3-40a4-4c5f-a808-14898d84932f", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:30 np0005541913.localdomain ceph-mon[298296]: pgmap v626: 177 pgs: 177 active+clean; 257 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 2.0 MiB/s wr, 117 op/s
Dec 02 10:15:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 02 10:15:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 02 10:15:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fb9dc736-c0fd-42af-8ddc-944e8a1e50c5", "snap_name": "fd340733-2ac3-47a8-9e18-7daf7e9911c9", "format": "json"}]: dispatch
Dec 02 10:15:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "334c2711-d0f6-419e-922d-408205cc4ec2", "format": "json"}]: dispatch
Dec 02 10:15:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "334c2711-d0f6-419e-922d-408205cc4ec2", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:31 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:31.333 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:15:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:32 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "auth_id": "tempest-cephx-id-1696860369", "tenant_id": "82d5a09e66904b8ca3c7a7850f1e5c52", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2/97908fdd-14b6-443f-bfcc-d98424d8ba49", "osd", "allow rw pool=manila_data namespace=fsvolumens_76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2/97908fdd-14b6-443f-bfcc-d98424d8ba49", "osd", "allow rw pool=manila_data namespace=fsvolumens_76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:32 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2/97908fdd-14b6-443f-bfcc-d98424d8ba49", "osd", "allow rw pool=manila_data namespace=fsvolumens_76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:32 np0005541913.localdomain ceph-mon[298296]: pgmap v627: 177 pgs: 177 active+clean; 257 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 150 KiB/s wr, 73 op/s
Dec 02 10:15:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:15:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:15:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:15:32 np0005541913.localdomain podman[334185]: 2025-12-02 10:15:32.450262756 +0000 UTC m=+0.088265243 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, distribution-scope=public)
Dec 02 10:15:32 np0005541913.localdomain podman[334185]: 2025-12-02 10:15:32.465982395 +0000 UTC m=+0.103984872 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:15:32 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:15:32 np0005541913.localdomain systemd[1]: tmp-crun.sGSVw0.mount: Deactivated successfully.
Dec 02 10:15:32 np0005541913.localdomain podman[334186]: 2025-12-02 10:15:32.553906907 +0000 UTC m=+0.185134764 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:15:32 np0005541913.localdomain podman[334186]: 2025-12-02 10:15:32.563946995 +0000 UTC m=+0.195174862 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:15:32 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:15:32 np0005541913.localdomain podman[334184]: 2025-12-02 10:15:32.607866466 +0000 UTC m=+0.244418005 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:15:32 np0005541913.localdomain podman[334184]: 2025-12-02 10:15:32.641126333 +0000 UTC m=+0.277677852 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 02 10:15:32 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:15:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e254 e254: 6 total, 6 up, 6 in
Dec 02 10:15:33 np0005541913.localdomain ceph-mon[298296]: osdmap e254: 6 total, 6 up, 6 in
Dec 02 10:15:33 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "auth_id": "tempest-cephx-id-2071519372", "format": "json"}]: dispatch
Dec 02 10:15:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2071519372", "format": "json"} : dispatch
Dec 02 10:15:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2071519372"} : dispatch
Dec 02 10:15:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2071519372"} : dispatch
Dec 02 10:15:33 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2071519372"}]': finished
Dec 02 10:15:33 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "auth_id": "tempest-cephx-id-2071519372", "format": "json"}]: dispatch
Dec 02 10:15:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:15:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:15:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:15:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:15:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:15:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:15:34 np0005541913.localdomain sudo[334244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:15:34 np0005541913.localdomain sudo[334244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:15:34 np0005541913.localdomain sudo[334244]: pam_unix(sudo:session): session closed for user root
Dec 02 10:15:34 np0005541913.localdomain sudo[334262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:15:34 np0005541913.localdomain sudo[334262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:15:35 np0005541913.localdomain ceph-mon[298296]: pgmap v629: 177 pgs: 2 active+clean+snaptrim, 2 active+clean+snaptrim_wait, 173 active+clean; 290 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 KiB/s rd, 2.7 MiB/s wr, 92 op/s
Dec 02 10:15:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fb9dc736-c0fd-42af-8ddc-944e8a1e50c5", "snap_name": "fd340733-2ac3-47a8-9e18-7daf7e9911c9_af7ac55c-f3f5-4ae4-aac4-245996ebb306", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:35 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fb9dc736-c0fd-42af-8ddc-944e8a1e50c5", "snap_name": "fd340733-2ac3-47a8-9e18-7daf7e9911c9", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:35 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:35 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:35 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:35 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"}]': finished
Dec 02 10:15:35 np0005541913.localdomain sudo[334262]: pam_unix(sudo:session): session closed for user root
Dec 02 10:15:35 np0005541913.localdomain sudo[334312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:15:35 np0005541913.localdomain sudo[334312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:15:35 np0005541913.localdomain sudo[334312]: pam_unix(sudo:session): session closed for user root
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "format": "json"}]: dispatch
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "76f373b7-a3c0-41f8-a1fb-77eeaafdd9b2", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:15:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:15:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:15:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:15:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156107 "" "Go-http-client/1.1"
Dec 02 10:15:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:15:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19268 "" "Go-http-client/1.1"
Dec 02 10:15:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:36.336 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:37 np0005541913.localdomain ceph-mon[298296]: pgmap v630: 177 pgs: 2 active+clean+snaptrim, 2 active+clean+snaptrim_wait, 173 active+clean; 290 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 KiB/s rd, 2.7 MiB/s wr, 92 op/s
Dec 02 10:15:37 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "07b7e455-1272-48fc-92f9-fd54c3fafcb0", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 02 10:15:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 02 10:15:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Dec 02 10:15:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Dec 02 10:15:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Dec 02 10:15:37 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e255 e255: 6 total, 6 up, 6 in
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "07b7e455-1272-48fc-92f9-fd54c3fafcb0", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fb9dc736-c0fd-42af-8ddc-944e8a1e50c5", "format": "json"}]: dispatch
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fb9dc736-c0fd-42af-8ddc-944e8a1e50c5", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: osdmap e255: 6 total, 6 up, 6 in
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:15:38 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1723398924' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:39 np0005541913.localdomain ceph-mon[298296]: pgmap v631: 177 pgs: 2 active+clean+snaptrim, 2 active+clean+snaptrim_wait, 173 active+clean; 290 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 KiB/s rd, 2.7 MiB/s wr, 92 op/s
Dec 02 10:15:39 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "tenant_id": "82d5a09e66904b8ca3c7a7850f1e5c52", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:39 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1723398924' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:15:39 np0005541913.localdomain podman[334330]: 2025-12-02 10:15:39.456374101 +0000 UTC m=+0.098473025 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 02 10:15:39 np0005541913.localdomain podman[334330]: 2025-12-02 10:15:39.468973227 +0000 UTC m=+0.111072131 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 10:15:39 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:15:40 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e256 e256: 6 total, 6 up, 6 in
Dec 02 10:15:40 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Dec 02 10:15:40 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3360648141' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:40 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 02 10:15:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:15:40.338 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:15:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:15:40.339 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:15:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:40.340 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: pgmap v633: 177 pgs: 177 active+clean; 291 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 533 KiB/s rd, 3.4 MiB/s wr, 116 op/s
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "auth_id": "admin", "tenant_id": "0fe90f11d3f64e12b3591732792a929e", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: osdmap e256: 6 total, 6 up, 6 in
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"}]': finished
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e257 e257: 6 total, 6 up, 6 in
Dec 02 10:15:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:41.342 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e258 e258: 6 total, 6 up, 6 in
Dec 02 10:15:42 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:42 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:42 np0005541913.localdomain ceph-mon[298296]: osdmap e257: 6 total, 6 up, 6 in
Dec 02 10:15:42 np0005541913.localdomain ceph-mon[298296]: osdmap e258: 6 total, 6 up, 6 in
Dec 02 10:15:43 np0005541913.localdomain ceph-mon[298296]: pgmap v636: 177 pgs: 177 active+clean; 291 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.2 KiB/s rd, 205 KiB/s wr, 15 op/s
Dec 02 10:15:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 02 10:15:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/738f4ca9-41a9-48cc-8ca1-8d9ae9041202/ae52ead4-7b68-47be-8dae-42ce82602ac7", "osd", "allow rw pool=manila_data namespace=fsvolumens_738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/738f4ca9-41a9-48cc-8ca1-8d9ae9041202/ae52ead4-7b68-47be-8dae-42ce82602ac7", "osd", "allow rw pool=manila_data namespace=fsvolumens_738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/738f4ca9-41a9-48cc-8ca1-8d9ae9041202/ae52ead4-7b68-47be-8dae-42ce82602ac7", "osd", "allow rw pool=manila_data namespace=fsvolumens_738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:43 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/257232119' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:15:43.340 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:15:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "auth_id": "david", "tenant_id": "0fe90f11d3f64e12b3591732792a929e", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:44 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e259 e259: 6 total, 6 up, 6 in
Dec 02 10:15:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:15:44 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:15:44 np0005541913.localdomain podman[334350]: 2025-12-02 10:15:44.507951751 +0000 UTC m=+0.144563864 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:15:44 np0005541913.localdomain podman[334349]: 2025-12-02 10:15:44.487443834 +0000 UTC m=+0.129230525 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:15:44 np0005541913.localdomain podman[334349]: 2025-12-02 10:15:44.573042185 +0000 UTC m=+0.214828836 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:15:44 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:15:44 np0005541913.localdomain podman[334350]: 2025-12-02 10:15:44.590287235 +0000 UTC m=+0.226899358 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:15:44 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:15:45 np0005541913.localdomain ceph-mon[298296]: pgmap v638: 177 pgs: 177 active+clean; 292 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 367 KiB/s wr, 64 op/s
Dec 02 10:15:45 np0005541913.localdomain ceph-mon[298296]: osdmap e259: 6 total, 6 up, 6 in
Dec 02 10:15:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:45 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e260 e260: 6 total, 6 up, 6 in
Dec 02 10:15:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "tenant_id": "82d5a09e66904b8ca3c7a7850f1e5c52", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:46 np0005541913.localdomain ceph-mon[298296]: pgmap v640: 177 pgs: 177 active+clean; 292 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 150 KiB/s wr, 50 op/s
Dec 02 10:15:46 np0005541913.localdomain ceph-mon[298296]: osdmap e260: 6 total, 6 up, 6 in
Dec 02 10:15:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5cdc679c-4ca6-4876-b423-0e54f450bff3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5cdc679c-4ca6-4876-b423-0e54f450bff3", "format": "json"}]: dispatch
Dec 02 10:15:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:46.344 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:15:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:46.345 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:46.345 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:15:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:46.345 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:15:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:46.346 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:15:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:47 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e261 e261: 6 total, 6 up, 6 in
Dec 02 10:15:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ffd086bb-506f-4c57-a27d-657caefc8485", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:15:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ffd086bb-506f-4c57-a27d-657caefc8485", "format": "json"}]: dispatch
Dec 02 10:15:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:47 np0005541913.localdomain ceph-mon[298296]: osdmap e261: 6 total, 6 up, 6 in
Dec 02 10:15:47 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3835753396' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: pgmap v643: 177 pgs: 177 active+clean; 292 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 141 KiB/s wr, 47 op/s
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"}]': finished
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:48 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3835753396' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:15:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:48.930 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:49 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e262 e262: 6 total, 6 up, 6 in
Dec 02 10:15:50 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e263 e263: 6 total, 6 up, 6 in
Dec 02 10:15:50 np0005541913.localdomain ceph-mon[298296]: osdmap e262: 6 total, 6 up, 6 in
Dec 02 10:15:50 np0005541913.localdomain ceph-mon[298296]: pgmap v645: 177 pgs: 177 active+clean; 293 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 138 KiB/s wr, 75 op/s
Dec 02 10:15:50 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "5cdc679c-4ca6-4876-b423-0e54f450bff3", "auth_id": "david", "tenant_id": "3212fac1e026474b9022ee93e4d925a9", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:50 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 02 10:15:51 np0005541913.localdomain ceph-mon[298296]: osdmap e263: 6 total, 6 up, 6 in
Dec 02 10:15:51 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "ffd086bb-506f-4c57-a27d-657caefc8485", "snap_name": "79e50957-8d03-44cb-99af-cee54fecf7f3", "format": "json"}]: dispatch
Dec 02 10:15:51 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e264 e264: 6 total, 6 up, 6 in
Dec 02 10:15:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:51.347 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:51.352 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e265 e265: 6 total, 6 up, 6 in
Dec 02 10:15:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "tenant_id": "82d5a09e66904b8ca3c7a7850f1e5c52", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:52 np0005541913.localdomain ceph-mon[298296]: pgmap v647: 177 pgs: 177 active+clean; 293 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 121 KiB/s wr, 66 op/s
Dec 02 10:15:52 np0005541913.localdomain ceph-mon[298296]: osdmap e264: 6 total, 6 up, 6 in
Dec 02 10:15:52 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:52 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:52 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:52 np0005541913.localdomain ceph-mon[298296]: osdmap e265: 6 total, 6 up, 6 in
Dec 02 10:15:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:52.588 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:52 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e266 e266: 6 total, 6 up, 6 in
Dec 02 10:15:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "5cdc679c-4ca6-4876-b423-0e54f450bff3", "auth_id": "david", "format": "json"}]: dispatch
Dec 02 10:15:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "5cdc679c-4ca6-4876-b423-0e54f450bff3", "auth_id": "david", "format": "json"}]: dispatch
Dec 02 10:15:53 np0005541913.localdomain ceph-mon[298296]: osdmap e266: 6 total, 6 up, 6 in
Dec 02 10:15:54 np0005541913.localdomain ceph-mon[298296]: pgmap v651: 177 pgs: 177 active+clean; 293 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 157 KiB/s wr, 139 op/s
Dec 02 10:15:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ffd086bb-506f-4c57-a27d-657caefc8485", "snap_name": "79e50957-8d03-44cb-99af-cee54fecf7f3_2ca4c212-9f32-4ac2-b7ea-e7caebf48841", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ffd086bb-506f-4c57-a27d-657caefc8485", "snap_name": "79e50957-8d03-44cb-99af-cee54fecf7f3", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:15:54 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"}]': finished
Dec 02 10:15:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:15:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:56.350 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:56.355 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e267 e267: 6 total, 6 up, 6 in
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: pgmap v652: 177 pgs: 177 active+clean; 293 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 125 KiB/s wr, 110 op/s
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "auth_id": "david", "format": "json"}]: dispatch
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Dec 02 10:15:56 np0005541913.localdomain ceph-mon[298296]: osdmap e267: 6 total, 6 up, 6 in
Dec 02 10:15:57 np0005541913.localdomain neutron_sriov_agent[256494]: 2025-12-02 10:15:57.570 2 INFO neutron.agent.securitygroups_rpc [req-3475e8cc-5e11-46e8-9664-ecb90f3bf921 req-bbad3521-a7cd-468f-9368-bc82a5a5c437 0e5c738ba752455b908099b234a743a2 d858413a9b01463f96545916d2abe5ab - - default default] Security group member updated ['10785715-ddea-43bb-82fa-9f44a2fb1faa']
Dec 02 10:15:57 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e268 e268: 6 total, 6 up, 6 in
Dec 02 10:15:57 np0005541913.localdomain dnsmasq[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/addn_hosts - 1 addresses
Dec 02 10:15:57 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/host
Dec 02 10:15:57 np0005541913.localdomain podman[334415]: 2025-12-02 10:15:57.915243633 +0000 UTC m=+0.061391377 container kill 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:15:57 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/opts
Dec 02 10:15:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "auth_id": "david", "format": "json"}]: dispatch
Dec 02 10:15:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ffd086bb-506f-4c57-a27d-657caefc8485", "format": "json"}]: dispatch
Dec 02 10:15:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ffd086bb-506f-4c57-a27d-657caefc8485", "force": true, "format": "json"}]: dispatch
Dec 02 10:15:57 np0005541913.localdomain ceph-mon[298296]: osdmap e268: 6 total, 6 up, 6 in
Dec 02 10:15:58 np0005541913.localdomain ceph-mon[298296]: pgmap v654: 177 pgs: 177 active+clean; 293 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 105 KiB/s wr, 93 op/s
Dec 02 10:15:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "tenant_id": "82d5a09e66904b8ca3c7a7850f1e5c52", "access_level": "rw", "format": "json"}]: dispatch
Dec 02 10:15:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:15:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"} : dispatch
Dec 02 10:15:58 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1696860369", "caps": ["mds", "allow rw path=/volumes/_nogroup/5aafe356-dc3f-4e86-bea5-6655303e90b0/5ba23353-e45c-4844-8c4d-be87f063ddd4", "osd", "allow rw pool=manila_data namespace=fsvolumens_5aafe356-dc3f-4e86-bea5-6655303e90b0", "mon", "allow r"], "format": "json"}]': finished
Dec 02 10:15:58 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2709406224' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:15:59 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:15:59 np0005541913.localdomain podman[334435]: 2025-12-02 10:15:59.453137496 +0000 UTC m=+0.091360626 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:15:59 np0005541913.localdomain podman[334435]: 2025-12-02 10:15:59.469117801 +0000 UTC m=+0.107340961 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:15:59 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:15:59 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:15:59Z|00607|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:15:59 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:15:59.963 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:00 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:00.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:00 np0005541913.localdomain ceph-mon[298296]: pgmap v656: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 103 KiB/s rd, 243 KiB/s wr, 156 op/s
Dec 02 10:16:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5cdc679c-4ca6-4876-b423-0e54f450bff3", "format": "json"}]: dispatch
Dec 02 10:16:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5cdc679c-4ca6-4876-b423-0e54f450bff3", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:01.354 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:01.359 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e269 e269: 6 total, 6 up, 6 in
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "718c525a-962f-42e3-9573-9fc3919d4aa7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "718c525a-962f-42e3-9573-9fc3919d4aa7", "format": "json"}]: dispatch
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1696860369", "format": "json"} : dispatch
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"} : dispatch
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1696860369"}]': finished
Dec 02 10:16:01 np0005541913.localdomain ceph-mon[298296]: osdmap e269: 6 total, 6 up, 6 in
Dec 02 10:16:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e270 e270: 6 total, 6 up, 6 in
Dec 02 10:16:02 np0005541913.localdomain ceph-mon[298296]: pgmap v657: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 117 KiB/s wr, 56 op/s
Dec 02 10:16:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:16:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "auth_id": "tempest-cephx-id-1696860369", "format": "json"}]: dispatch
Dec 02 10:16:02 np0005541913.localdomain ceph-mon[298296]: osdmap e270: 6 total, 6 up, 6 in
Dec 02 10:16:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:03.059 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:16:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:03.060 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:16:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:03.060 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:16:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:16:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:16:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:16:03 np0005541913.localdomain podman[334455]: 2025-12-02 10:16:03.452574758 +0000 UTC m=+0.091055817 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:16:03 np0005541913.localdomain podman[334457]: 2025-12-02 10:16:03.51345838 +0000 UTC m=+0.145038386 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:16:03 np0005541913.localdomain podman[334457]: 2025-12-02 10:16:03.523194079 +0000 UTC m=+0.154774095 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:16:03 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:16:03 np0005541913.localdomain podman[334455]: 2025-12-02 10:16:03.53857538 +0000 UTC m=+0.177056499 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:16:03 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:16:03 np0005541913.localdomain podman[334456]: 2025-12-02 10:16:03.616605559 +0000 UTC m=+0.249269933 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, name=ubi9-minimal)
Dec 02 10:16:03 np0005541913.localdomain podman[334456]: 2025-12-02 10:16:03.634171067 +0000 UTC m=+0.266835491 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 02 10:16:03 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:16:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:03.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:03.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:16:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:03.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:16:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e271 e271: 6 total, 6 up, 6 in
Dec 02 10:16:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "format": "json"}]: dispatch
Dec 02 10:16:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a1ba20ee-ed37-461f-8a6b-289e0637343e", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:04 np0005541913.localdomain ceph-mon[298296]: osdmap e271: 6 total, 6 up, 6 in
Dec 02 10:16:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:16:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:16:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:16:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:16:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:16:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:16:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.099 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.099 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.100 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.100 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.700 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.743 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.743 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.744 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.745 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.901 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.902 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.902 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.902 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:16:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:04.903 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:16:05 np0005541913.localdomain ceph-mon[298296]: pgmap v660: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 325 KiB/s wr, 113 op/s
Dec 02 10:16:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "718c525a-962f-42e3-9573-9fc3919d4aa7", "format": "json"}]: dispatch
Dec 02 10:16:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "718c525a-962f-42e3-9573-9fc3919d4aa7", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2684550647' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:16:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2684550647' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:16:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:16:05 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3357420336' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:16:05 np0005541913.localdomain dnsmasq[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/addn_hosts - 0 addresses
Dec 02 10:16:05 np0005541913.localdomain podman[334553]: 2025-12-02 10:16:05.400369215 +0000 UTC m=+0.040155772 container kill 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:16:05 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/host
Dec 02 10:16:05 np0005541913.localdomain dnsmasq-dhcp[333770]: read /var/lib/neutron/dhcp/8703a229-8c49-443e-95c6-aff62a358434/opts
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.405 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:16:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:16:05Z|00608|binding|INFO|Releasing lport f119cdef-0974-4d2c-8acd-8d7464640ca9 from this chassis (sb_readonly=0)
Dec 02 10:16:05 np0005541913.localdomain kernel: device tapf119cdef-09 left promiscuous mode
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.612 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:16:05Z|00609|binding|INFO|Setting lport f119cdef-0974-4d2c-8acd-8d7464640ca9 down in Southbound
Dec 02 10:16:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:05.636 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-8703a229-8c49-443e-95c6-aff62a358434', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8703a229-8c49-443e-95c6-aff62a358434', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd858413a9b01463f96545916d2abe5ab', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22d83034-71a8-46e9-a33a-f696e74c13f0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=f119cdef-0974-4d2c-8acd-8d7464640ca9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.638 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.639 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:05.640 160221 INFO neutron.agent.ovn.metadata.agent [-] Port f119cdef-0974-4d2c-8acd-8d7464640ca9 in datapath 8703a229-8c49-443e-95c6-aff62a358434 unbound from our chassis
Dec 02 10:16:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:05.642 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8703a229-8c49-443e-95c6-aff62a358434, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:16:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:05.645 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[57e8f51b-7bc4-4eb8-beea-2b23991f8753]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.646 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.647 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.831 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.832 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11051MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.833 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.833 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.934 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.935 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.936 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:16:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:05.989 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:16:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3357420336' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:16:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:16:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:16:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:16:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156107 "" "Go-http-client/1.1"
Dec 02 10:16:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:16:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19268 "" "Go-http-client/1.1"
Dec 02 10:16:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:06.356 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:06.368 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:16:06 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2602515044' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:16:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:06.464 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:16:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:06.472 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:16:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:06.533 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:16:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:06.535 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:16:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:06.535 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:16:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e272 e272: 6 total, 6 up, 6 in
Dec 02 10:16:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "format": "json"}]: dispatch
Dec 02 10:16:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5aafe356-dc3f-4e86-bea5-6655303e90b0", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:07 np0005541913.localdomain ceph-mon[298296]: pgmap v662: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 168 KiB/s wr, 38 op/s
Dec 02 10:16:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "07b7e455-1272-48fc-92f9-fd54c3fafcb0", "format": "json"}]: dispatch
Dec 02 10:16:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "07b7e455-1272-48fc-92f9-fd54c3fafcb0", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2602515044' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:16:07 np0005541913.localdomain ceph-mon[298296]: osdmap e272: 6 total, 6 up, 6 in
Dec 02 10:16:07 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:16:07Z|00610|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:16:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:07.208 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:07 np0005541913.localdomain podman[334618]: 2025-12-02 10:16:07.798153422 +0000 UTC m=+0.057346328 container kill 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 02 10:16:07 np0005541913.localdomain dnsmasq[333770]: exiting on receipt of SIGTERM
Dec 02 10:16:07 np0005541913.localdomain systemd[1]: libpod-069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5.scope: Deactivated successfully.
Dec 02 10:16:07 np0005541913.localdomain podman[334632]: 2025-12-02 10:16:07.872864934 +0000 UTC m=+0.055456139 container died 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:16:07 np0005541913.localdomain podman[334632]: 2025-12-02 10:16:07.911342819 +0000 UTC m=+0.093933984 container cleanup 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 02 10:16:07 np0005541913.localdomain systemd[1]: libpod-conmon-069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5.scope: Deactivated successfully.
Dec 02 10:16:07 np0005541913.localdomain podman[334633]: 2025-12-02 10:16:07.99766578 +0000 UTC m=+0.177203604 container remove 069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8703a229-8c49-443e-95c6-aff62a358434, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:16:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d6f9e4a2-dde5-48d3-9ade-72d59a880bf2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:16:08.125 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:16:08 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:16:08.373 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:16:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-6a0ee3792bce35a0d1020085525a61984484526151f073e1f67a4a8079d9209d-merged.mount: Deactivated successfully.
Dec 02 10:16:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-069cefe05362f43be01d929639e0342a01388cc1f055e1f2cef05f17f8de56a5-userdata-shm.mount: Deactivated successfully.
Dec 02 10:16:08 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d8703a229\x2d8c49\x2d443e\x2d95c6\x2daff62a358434.mount: Deactivated successfully.
Dec 02 10:16:09 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d6f9e4a2-dde5-48d3-9ade-72d59a880bf2", "format": "json"}]: dispatch
Dec 02 10:16:09 np0005541913.localdomain ceph-mon[298296]: pgmap v664: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 185 KiB/s wr, 41 op/s
Dec 02 10:16:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:09.537 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1781133973' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:16:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:16:10 np0005541913.localdomain systemd[1]: tmp-crun.Bizifj.mount: Deactivated successfully.
Dec 02 10:16:10 np0005541913.localdomain podman[334661]: 2025-12-02 10:16:10.442194833 +0000 UTC m=+0.081344528 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 02 10:16:10 np0005541913.localdomain podman[334661]: 2025-12-02 10:16:10.480282739 +0000 UTC m=+0.119432494 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Dec 02 10:16:10 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:16:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:10.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:10.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:11 np0005541913.localdomain ceph-mon[298296]: pgmap v665: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 233 KiB/s wr, 67 op/s
Dec 02 10:16:11 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "auth_id": "admin", "format": "json"}]: dispatch
Dec 02 10:16:11 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "format": "json"}]: dispatch
Dec 02 10:16:11 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "738f4ca9-41a9-48cc-8ca1-8d9ae9041202", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2839302655' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:16:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:11.360 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:11.365 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e273 e273: 6 total, 6 up, 6 in
Dec 02 10:16:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d6f9e4a2-dde5-48d3-9ade-72d59a880bf2", "snap_name": "395e084c-5f31-4d0b-b40b-8a631da3af09", "format": "json"}]: dispatch
Dec 02 10:16:12 np0005541913.localdomain ceph-mon[298296]: osdmap e273: 6 total, 6 up, 6 in
Dec 02 10:16:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:12.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e274 e274: 6 total, 6 up, 6 in
Dec 02 10:16:13 np0005541913.localdomain ceph-mon[298296]: pgmap v666: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 61 KiB/s wr, 25 op/s
Dec 02 10:16:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:13.920 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:14 np0005541913.localdomain ceph-mon[298296]: osdmap e274: 6 total, 6 up, 6 in
Dec 02 10:16:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:14.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:16:15 np0005541913.localdomain ceph-mon[298296]: pgmap v669: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 127 KiB/s wr, 38 op/s
Dec 02 10:16:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "d6f9e4a2-dde5-48d3-9ade-72d59a880bf2", "snap_name": "395e084c-5f31-4d0b-b40b-8a631da3af09", "target_sub_name": "c4aff2e0-53b6-4b58-8317-036e112a5bcd", "format": "json"}]: dispatch
Dec 02 10:16:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4aff2e0-53b6-4b58-8317-036e112a5bcd", "format": "json"}]: dispatch
Dec 02 10:16:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e275 e275: 6 total, 6 up, 6 in
Dec 02 10:16:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:16:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:16:15 np0005541913.localdomain podman[334680]: 2025-12-02 10:16:15.450981255 +0000 UTC m=+0.093185444 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:16:15 np0005541913.localdomain podman[334680]: 2025-12-02 10:16:15.459344798 +0000 UTC m=+0.101548997 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:16:15 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:16:15 np0005541913.localdomain podman[334681]: 2025-12-02 10:16:15.546301875 +0000 UTC m=+0.185017182 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller)
Dec 02 10:16:15 np0005541913.localdomain podman[334681]: 2025-12-02 10:16:15.610027873 +0000 UTC m=+0.248743150 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 10:16:15 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.109 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.122 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.123 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '384c6f7b-8d6b-4d27-a0b9-5763798a973d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.110220', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efdbe84a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.329293654, 'message_signature': 'da23f97963522e22b76e0fc36e4c7024e4b8b92f9a25905b80ab2364fa57f085'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.110220', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efdbfcf4-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.329293654, 'message_signature': '8ca65ce0ea7f80085edf03d1c1a7c026d99bbf930b8e62f6fd92be56fdf48663'}]}, 'timestamp': '2025-12-02 10:16:16.124050', '_unique_id': '811abed902d14843b1a2225e2ddde55c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.125 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.131 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aafa79b0-20ec-49b8-ac11-c48861938f26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.127512', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efdd3556-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': '79b499edfd2810cab9381e0b08e3148894980a81023c70f29f4052d9b79f01d8'}]}, 'timestamp': '2025-12-02 10:16:16.132072', '_unique_id': 'c78aa8d947a04894bb9fa64c4858a2b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.163 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.164 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45c4b23d-fac3-4cdf-8e05-f57dcf1379da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.134657', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efe215f8-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': '38f76dbe30c0b80738e4be4a3d6cc7a410479922a2113f769c59df432cb7149b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.134657', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efe2276e-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': '87cc43201fa40cf57585b4cd198bc85a9f86d0fd5f0979311b59046d1e240535'}]}, 'timestamp': '2025-12-02 10:16:16.164507', '_unique_id': '95668f3ec59a4ad7af61f8dc7863e4c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.165 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.166 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.183 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f8b4f2a-c978-4bdd-a8d5-2c57318f55bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:16:16.166861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'efe52d74-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.402767421, 'message_signature': '5d1a2ff275b30f4a18ad6956150165a25c90224870a96e0f84a5aab0e4476087'}]}, 'timestamp': '2025-12-02 10:16:16.184270', '_unique_id': '4863cdb3e67b4d6288402be555ca31cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.185 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.186 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce7e66f9-043a-499c-9dcb-2364ff0d3f96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.186462', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efe596b0-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': '1847052e504a67c27440598c222fef98c734ce084cd427e7d23ab58da4b42f19'}]}, 'timestamp': '2025-12-02 10:16:16.186985', '_unique_id': '7bc979690889437aa88caaeed0863f8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.188 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.190 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51d15e19-739d-4c28-a6f8-05d36ba3209d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.190010', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efe61f36-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': '6969806eb63c6cd95e9cc01956ceb79e92c6f02aa458a3b0157fdee546ffc247'}]}, 'timestamp': '2025-12-02 10:16:16.190469', '_unique_id': '1ee88e6823104cc1823451612825f3e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.191 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.192 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.192 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 20280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dc50bf0-7364-45a9-8126-7e059a8dac09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20280000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:16:16.192589', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'efe68502-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.402767421, 'message_signature': '0d2eefe7d5bcb8ef6efface0b99b8509c5f8ca80d1abce7497cbe0266f403004'}]}, 'timestamp': '2025-12-02 10:16:16.193061', '_unique_id': '88f2c79615004ca4ac224cb0102f65d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.195 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd5e8859-9003-4fc6-81df-3abb51f11464', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.195146', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efe6e768-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': 'b3a9cbc57f64a0bf6da429c5f8e24338ab87f0d0848c08d9561312b23368428c'}]}, 'timestamp': '2025-12-02 10:16:16.195591', '_unique_id': 'ba4a41dce4ea464ebe7fa417dbae916c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.197 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.197 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36f9a57d-7aeb-4891-9a70-512bfefec90c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.197894', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efe75324-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': 'b3b1ed57b873574ee573fc15f3d434cefa5f68aa86ed904cc86dc5861a432b8c'}]}, 'timestamp': '2025-12-02 10:16:16.198353', '_unique_id': '21a4dcf685c5429a86712d029231781c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45e17246-7dbf-4e6d-987f-f54fdc532fa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.200450', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efe7b7d8-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': 'eb34a09fb8a3bbe12aeb65f50ea2a8034f41fbc9075a63a3d044a0a1edabe9cb'}]}, 'timestamp': '2025-12-02 10:16:16.200930', '_unique_id': '571b02040beb46b8af26bc2c9ff5f785'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '685b3a15-7521-43c7-bd16-709d2ee48664', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.203419', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efe82e0c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': '4cafdf1b9c617bd313368f9faca267ab30185c7f2fd15081e7115dd03957d7c8'}]}, 'timestamp': '2025-12-02 10:16:16.203962', '_unique_id': 'a76258976aa648e7bd43f8d375b41443'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2d00319-44e4-4688-952f-bb25449c9f6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.206143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efe894e6-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': '5eefbdf6f393195d64391f03b801d79dda1f5201138ca89e85b0d7aeb30377bc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.206143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efe8a634-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': '19f1dd4811754367317b1b1ac451769e2bd462af8702ff9540e28d2f4bb4d29c'}]}, 'timestamp': '2025-12-02 10:16:16.207004', '_unique_id': 'd99195e5a4fe49bca882cc5864728855'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.209 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb2dcc13-b7ce-47f1-a8f8-88165d7596cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.209162', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efe90b42-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': '4b274c2fa3c77937cbbcb8cd525fbca3649941b597c47b07b82d93b3fac71398'}]}, 'timestamp': '2025-12-02 10:16:16.209768', '_unique_id': 'e9476f4e79f2484c937c5bcb63c4d476'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.211 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec5736f8-1045-4c45-9208-b633685476e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.211931', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efe97730-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': '74f442e2d211bed5233d85972d2aced14c95535970b6d99a9665dd16c9354864'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.211931', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efe98a4a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': '221d9d623041a7427a4e6587e55e9c1b28b851a61d3a6a098d097f6767281d7f'}]}, 'timestamp': '2025-12-02 10:16:16.212850', '_unique_id': 'd8c3daa17340452bac210156ec255414'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55bc436e-da75-4641-86ca-7f574dea5888', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.214988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efe9ee9a-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': 'f0d15cc03f50e3f35eefe9fbfbb82f84f736fecc3efa8d20aef47e695c93f60b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.214988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efe9ffc0-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': '27eee6abdb1a90903d04ab24d26902fbbf244faa4f814ff6054974decfff9777'}]}, 'timestamp': '2025-12-02 10:16:16.215853', '_unique_id': '6c1f224ed1554704b481dbdde6db971a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.218 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acc41f9a-34d7-4ae8-a912-cf0045b3e391', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.218108', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efea6898-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': '748ae6c16a3ec738c907d2b3dfa72a2d685326c294cc245b6a7ef9781a5b7c1e'}]}, 'timestamp': '2025-12-02 10:16:16.218590', '_unique_id': '494f4265192a45a0b888701bc9e20fc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.221 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff6c4df2-1847-4e45-a836-b2df467b4aee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:16:16.221036', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': 'efeadb02-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.346596115, 'message_signature': 'e46dfcb4b9d755d0b02ad9e71e048bafcb544b7077262443c7965dcc10f2c5b6'}]}, 'timestamp': '2025-12-02 10:16:16.221492', '_unique_id': '36f2a5cabf704cb5b8b3afac3a7901d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.224 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '801d4154-d05b-4c59-87fa-1321b5cbda0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.223753', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efeb454c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.329293654, 'message_signature': 'ec8dd40f05aef34a17e002aa55fe10610bbe9c37fddec32a0d46c4c6ed558018'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.223753', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efeb56c2-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.329293654, 'message_signature': '4e2e0682f4f650743bad3303b3d34c45f6fd4744577d84bf709d4cddbb1616e1'}]}, 'timestamp': '2025-12-02 10:16:16.224602', '_unique_id': 'f8bee13cb9dd4865b6f12e23f268802c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.226 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.226 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e150967-5638-4182-81c7-482221daf55b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.226159', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efeba3c0-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.329293654, 'message_signature': 'f59ddec9aa546b8fbc845d9eee34699841a7238e7b92331743fe76ed31da73e8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.226159', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efebaf82-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.329293654, 'message_signature': 'b445f1e065b196feba06f7a228a2fcce8d66281d1740d5ac0e77a60f15514596'}]}, 'timestamp': '2025-12-02 10:16:16.226830', '_unique_id': 'ea119117ca404d388a22f1f281d5ec71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.227 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.228 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.228 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.228 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.228 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e29952dc-92a4-4fb4-899e-002938535753', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.228561', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efebff8c-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': 'fa3a0dc8e08996646b0eea3db39bb212c8a5a5de7ae64933f1ba3fbdbbe1ecf2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.228561', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efec0a0e-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': 'e594fc089842fec66f905830a8b08fd8623c218fef707d9018b91bad91aab92c'}]}, 'timestamp': '2025-12-02 10:16:16.229146', '_unique_id': '8a70c2e6bbfb4571a51d1a8c6b3a2cc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.230 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.230 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c571a32-4bc9-49ae-ad8d-cd79205d7c4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:16:16.230473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'efec492e-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': 'd359e676d9291ceac028f00c1a423884785aec31883c80ceba6ab9d7dcfb4d1b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:16:16.230473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'efec532e-cf67-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12738.353735384, 'message_signature': '862e08512396c44825869055de50d3e8c99adc63338db03ebe64d8a3cdd8c1b4'}]}, 'timestamp': '2025-12-02 10:16:16.231017', '_unique_id': '3481efa4a43f41798712622702176463'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:16:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:16:16.231 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:16:16 np0005541913.localdomain ceph-mon[298296]: osdmap e275: 6 total, 6 up, 6 in
Dec 02 10:16:16 np0005541913.localdomain ceph-mon[298296]: mgrmap e51: np0005541914.lljzmk(active, since 16m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:16:16 np0005541913.localdomain ceph-mon[298296]: pgmap v671: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 56 KiB/s wr, 6 op/s
Dec 02 10:16:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:16.362 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:16.368 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2273007360' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:16:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2273007360' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:16:18 np0005541913.localdomain ceph-mon[298296]: pgmap v672: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 56 KiB/s wr, 6 op/s
Dec 02 10:16:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3d3f9f10-b22d-485e-b12b-97dbef75415d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3d3f9f10-b22d-485e-b12b-97dbef75415d", "format": "json"}]: dispatch
Dec 02 10:16:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3867755322' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:16:20 np0005541913.localdomain ceph-mon[298296]: pgmap v673: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 98 KiB/s wr, 57 op/s
Dec 02 10:16:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1826029349' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:16:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:21.364 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:21.370 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e276 e276: 6 total, 6 up, 6 in
Dec 02 10:16:21 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:21 np0005541913.localdomain ceph-mon[298296]: osdmap e276: 6 total, 6 up, 6 in
Dec 02 10:16:22 np0005541913.localdomain ceph-mon[298296]: pgmap v674: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 49 KiB/s wr, 47 op/s
Dec 02 10:16:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "75b01de1-fd46-4d42-88bc-75b04e569dcb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "75b01de1-fd46-4d42-88bc-75b04e569dcb", "format": "json"}]: dispatch
Dec 02 10:16:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3d3f9f10-b22d-485e-b12b-97dbef75415d", "snap_name": "5212963f-950b-468a-8f66-9155c3dfc1c6", "format": "json"}]: dispatch
Dec 02 10:16:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8710895b-fa91-4f50-bc6f-341cddce5e76", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8710895b-fa91-4f50-bc6f-341cddce5e76", "format": "json"}]: dispatch
Dec 02 10:16:24 np0005541913.localdomain ceph-mon[298296]: pgmap v676: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 98 KiB/s wr, 52 op/s
Dec 02 10:16:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:16:25 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 4681 writes, 36K keys, 4681 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s
                                                           Cumulative WAL: 4681 writes, 4681 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2537 writes, 13K keys, 2537 commit groups, 1.0 writes per commit group, ingest: 19.08 MB, 0.03 MB/s
                                                           Interval WAL: 2537 writes, 2537 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    117.0      0.35              0.11        16    0.022       0      0       0.0       0.0
                                                             L6      1/0   18.03 MB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   6.2    181.7    166.7      1.50              0.65        15    0.100    193K   7757       0.0       0.0
                                                            Sum      1/0   18.03 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   7.2    147.6    157.4      1.85              0.76        31    0.060    193K   7757       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0  11.8    163.0    165.8      0.77              0.35        14    0.055     95K   3783       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   0.0    181.7    166.7      1.50              0.65        15    0.100    193K   7757       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    124.0      0.33              0.11        15    0.022       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.019       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.040, interval 0.011
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.28 GB write, 0.24 MB/s write, 0.27 GB read, 0.23 MB/s read, 1.8 seconds
                                                           Interval compaction: 0.13 GB write, 0.21 MB/s write, 0.12 GB read, 0.21 MB/s read, 0.8 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x563183c47350#2 capacity: 304.00 MB usage: 38.19 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000264 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(2537,36.92 MB,12.1445%) FilterBlock(31,565.42 KB,0.181635%) IndexBlock(31,738.52 KB,0.237239%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 02 10:16:25 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:26.366 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:26.372 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:26 np0005541913.localdomain ceph-mon[298296]: pgmap v677: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 79 KiB/s wr, 42 op/s
Dec 02 10:16:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d7d186aa-42ff-406e-975a-236ed40d3d49", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:26 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d7d186aa-42ff-406e-975a-236ed40d3d49", "format": "json"}]: dispatch
Dec 02 10:16:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3d3f9f10-b22d-485e-b12b-97dbef75415d", "snap_name": "5212963f-950b-468a-8f66-9155c3dfc1c6_f8a18cca-a1e3-4a3b-ab79-627d72357f7e", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3d3f9f10-b22d-485e-b12b-97dbef75415d", "snap_name": "5212963f-950b-468a-8f66-9155c3dfc1c6", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:29 np0005541913.localdomain ceph-mon[298296]: pgmap v678: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 79 KiB/s wr, 42 op/s
Dec 02 10:16:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8710895b-fa91-4f50-bc6f-341cddce5e76", "format": "json"}]: dispatch
Dec 02 10:16:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8710895b-fa91-4f50-bc6f-341cddce5e76", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d07c53ab-0584-4998-92b9-1d7bd9006b39", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:30 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d07c53ab-0584-4998-92b9-1d7bd9006b39", "format": "json"}]: dispatch
Dec 02 10:16:30 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:16:30 np0005541913.localdomain podman[334728]: 2025-12-02 10:16:30.438806675 +0000 UTC m=+0.082365696 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=edpm)
Dec 02 10:16:30 np0005541913.localdomain podman[334728]: 2025-12-02 10:16:30.449742656 +0000 UTC m=+0.093301667 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:16:30 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:16:31 np0005541913.localdomain ceph-mon[298296]: pgmap v679: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 88 KiB/s wr, 6 op/s
Dec 02 10:16:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3d3f9f10-b22d-485e-b12b-97dbef75415d", "format": "json"}]: dispatch
Dec 02 10:16:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3d3f9f10-b22d-485e-b12b-97dbef75415d", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:31.368 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:31.375 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:32 np0005541913.localdomain ceph-mon[298296]: pgmap v680: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 88 KiB/s wr, 6 op/s
Dec 02 10:16:32 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:32 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e277 e277: 6 total, 6 up, 6 in
Dec 02 10:16:33 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "997ff79f-92e7-4de5-90ed-58387671be8e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:33 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "997ff79f-92e7-4de5-90ed-58387671be8e", "format": "json"}]: dispatch
Dec 02 10:16:33 np0005541913.localdomain ceph-mon[298296]: osdmap e277: 6 total, 6 up, 6 in
Dec 02 10:16:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:16:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:16:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:16:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:16:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:16:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:16:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:16:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:16:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:16:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:16:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:16:34 np0005541913.localdomain podman[334746]: 2025-12-02 10:16:34.440363723 +0000 UTC m=+0.078000489 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 02 10:16:34 np0005541913.localdomain podman[334747]: 2025-12-02 10:16:34.507336688 +0000 UTC m=+0.139438277 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm)
Dec 02 10:16:34 np0005541913.localdomain podman[334747]: 2025-12-02 10:16:34.546193904 +0000 UTC m=+0.178295503 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public)
Dec 02 10:16:34 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:16:34 np0005541913.localdomain podman[334748]: 2025-12-02 10:16:34.560414513 +0000 UTC m=+0.189442640 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:16:34 np0005541913.localdomain podman[334748]: 2025-12-02 10:16:34.572160826 +0000 UTC m=+0.201188963 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:16:34 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:16:34 np0005541913.localdomain podman[334746]: 2025-12-02 10:16:34.628923938 +0000 UTC m=+0.266560704 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 10:16:34 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:16:34 np0005541913.localdomain ceph-mon[298296]: pgmap v682: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 103 KiB/s wr, 7 op/s
Dec 02 10:16:35 np0005541913.localdomain systemd[1]: tmp-crun.dcQR2B.mount: Deactivated successfully.
Dec 02 10:16:35 np0005541913.localdomain sudo[334803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:16:35 np0005541913.localdomain sudo[334803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:16:35 np0005541913.localdomain sudo[334803]: pam_unix(sudo:session): session closed for user root
Dec 02 10:16:35 np0005541913.localdomain sudo[334821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:16:35 np0005541913.localdomain sudo[334821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:16:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:16:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:16:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:16:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:16:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:16:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18776 "" "Go-http-client/1.1"
Dec 02 10:16:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:36.371 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:36.376 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:36 np0005541913.localdomain sudo[334821]: pam_unix(sudo:session): session closed for user root
Dec 02 10:16:36 np0005541913.localdomain sudo[334870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:16:36 np0005541913.localdomain sudo[334870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:16:36 np0005541913.localdomain sudo[334870]: pam_unix(sudo:session): session closed for user root
Dec 02 10:16:36 np0005541913.localdomain ceph-mon[298296]: pgmap v683: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 103 KiB/s wr, 7 op/s
Dec 02 10:16:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:16:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:16:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:16:36 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:16:37 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "997ff79f-92e7-4de5-90ed-58387671be8e", "format": "json"}]: dispatch
Dec 02 10:16:37 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "997ff79f-92e7-4de5-90ed-58387671be8e", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:16:38 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:16:38Z|00611|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec 02 10:16:38 np0005541913.localdomain ceph-mon[298296]: pgmap v684: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 103 KiB/s wr, 7 op/s
Dec 02 10:16:38 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:39 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7e7db66c-701d-40e6-a69b-fbd4e0d8a416", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:39 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7e7db66c-701d-40e6-a69b-fbd4e0d8a416", "format": "json"}]: dispatch
Dec 02 10:16:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:40.683 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:16:40 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:40.684 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:16:40 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:40.724 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:40 np0005541913.localdomain ceph-mon[298296]: pgmap v685: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 86 KiB/s wr, 5 op/s
Dec 02 10:16:40 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d07c53ab-0584-4998-92b9-1d7bd9006b39", "format": "json"}]: dispatch
Dec 02 10:16:40 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d07c53ab-0584-4998-92b9-1d7bd9006b39", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:16:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:41.374 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:41.379 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:41 np0005541913.localdomain podman[334888]: 2025-12-02 10:16:41.454223216 +0000 UTC m=+0.092207728 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:16:41 np0005541913.localdomain podman[334888]: 2025-12-02 10:16:41.466791011 +0000 UTC m=+0.104775503 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:16:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:41 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:16:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e278 e278: 6 total, 6 up, 6 in
Dec 02 10:16:42 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:16:42.685 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:16:42 np0005541913.localdomain ceph-mon[298296]: pgmap v686: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 86 KiB/s wr, 5 op/s
Dec 02 10:16:42 np0005541913.localdomain ceph-mon[298296]: osdmap e278: 6 total, 6 up, 6 in
Dec 02 10:16:43 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d7d186aa-42ff-406e-975a-236ed40d3d49", "format": "json"}]: dispatch
Dec 02 10:16:43 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d7d186aa-42ff-406e-975a-236ed40d3d49", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:44 np0005541913.localdomain ceph-mon[298296]: pgmap v688: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 66 KiB/s wr, 3 op/s
Dec 02 10:16:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:16:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:16:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:46.378 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:46.381 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:46 np0005541913.localdomain podman[334908]: 2025-12-02 10:16:46.456075028 +0000 UTC m=+0.090541047 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 02 10:16:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:46 np0005541913.localdomain podman[334908]: 2025-12-02 10:16:46.495009201 +0000 UTC m=+0.129475180 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:16:46 np0005541913.localdomain systemd[1]: tmp-crun.oIbzSX.mount: Deactivated successfully.
Dec 02 10:16:46 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:16:46 np0005541913.localdomain podman[334907]: 2025-12-02 10:16:46.516775475 +0000 UTC m=+0.152294763 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:16:46 np0005541913.localdomain podman[334907]: 2025-12-02 10:16:46.551255969 +0000 UTC m=+0.186775167 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:16:46 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:16:46 np0005541913.localdomain ceph-mon[298296]: pgmap v689: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 66 KiB/s wr, 3 op/s
Dec 02 10:16:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7e7db66c-701d-40e6-a69b-fbd4e0d8a416", "format": "json"}]: dispatch
Dec 02 10:16:46 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7e7db66c-701d-40e6-a69b-fbd4e0d8a416", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "75b01de1-fd46-4d42-88bc-75b04e569dcb", "format": "json"}]: dispatch
Dec 02 10:16:47 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "75b01de1-fd46-4d42-88bc-75b04e569dcb", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:48 np0005541913.localdomain ceph-mon[298296]: pgmap v690: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 66 KiB/s wr, 3 op/s
Dec 02 10:16:50 np0005541913.localdomain sshd[334954]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:16:50 np0005541913.localdomain ceph-mon[298296]: pgmap v691: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 87 KiB/s wr, 5 op/s
Dec 02 10:16:50 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4aff2e0-53b6-4b58-8317-036e112a5bcd", "format": "json"}]: dispatch
Dec 02 10:16:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:51.380 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:51.384 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.864079) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670611864147, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2830, "num_deletes": 263, "total_data_size": 4018992, "memory_usage": 4084312, "flush_reason": "Manual Compaction"}
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670611878676, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2166306, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34313, "largest_seqno": 37138, "table_properties": {"data_size": 2156116, "index_size": 6055, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27564, "raw_average_key_size": 22, "raw_value_size": 2133473, "raw_average_value_size": 1769, "num_data_blocks": 260, "num_entries": 1206, "num_filter_entries": 1206, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670490, "oldest_key_time": 1764670490, "file_creation_time": 1764670611, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 14638 microseconds, and 6277 cpu microseconds.
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.878724) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2166306 bytes OK
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.878749) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.880838) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.880859) EVENT_LOG_v1 {"time_micros": 1764670611880853, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.880877) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 4005507, prev total WAL file size 4006256, number of live WAL files 2.
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.881992) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323537' seq:72057594037927935, type:22 .. '6D6772737461740034353038' seq:0, type:0; will stop at (end)
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(2115KB)], [57(18MB)]
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670611882034, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 21070163, "oldest_snapshot_seqno": -1}
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 14714 keys, 19405484 bytes, temperature: kUnknown
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670611998895, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 19405484, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19319569, "index_size": 48054, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36805, "raw_key_size": 391841, "raw_average_key_size": 26, "raw_value_size": 19067858, "raw_average_value_size": 1295, "num_data_blocks": 1812, "num_entries": 14714, "num_filter_entries": 14714, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670611, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:16:51 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.999270) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 19405484 bytes
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.001805) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.1 rd, 165.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 18.0 +0.0 blob) out(18.5 +0.0 blob), read-write-amplify(18.7) write-amplify(9.0) OK, records in: 15204, records dropped: 490 output_compression: NoCompression
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.001833) EVENT_LOG_v1 {"time_micros": 1764670612001821, "job": 34, "event": "compaction_finished", "compaction_time_micros": 116983, "compaction_time_cpu_micros": 53611, "output_level": 6, "num_output_files": 1, "total_output_size": 19405484, "num_input_records": 15204, "num_output_records": 14714, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670612002262, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670612004849, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:51.881886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.004950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.004956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.004959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.004962) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.004965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.005370) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670612005459, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 267, "num_deletes": 251, "total_data_size": 23050, "memory_usage": 28352, "flush_reason": "Manual Compaction"}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670612008136, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 14027, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37143, "largest_seqno": 37405, "table_properties": {"data_size": 12221, "index_size": 50, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5023, "raw_average_key_size": 19, "raw_value_size": 8710, "raw_average_value_size": 33, "num_data_blocks": 2, "num_entries": 262, "num_filter_entries": 262, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670611, "oldest_key_time": 1764670611, "file_creation_time": 1764670612, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 2803 microseconds, and 1106 cpu microseconds.
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.008180) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 14027 bytes OK
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.008200) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.010696) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.010717) EVENT_LOG_v1 {"time_micros": 1764670612010710, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.010738) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 20978, prev total WAL file size 29535, number of live WAL files 2.
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.011150) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(13KB)], [60(18MB)]
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670612011194, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 19419511, "oldest_snapshot_seqno": -1}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 14465 keys, 17859584 bytes, temperature: kUnknown
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670612125928, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 17859584, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17777368, "index_size": 44931, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36229, "raw_key_size": 387110, "raw_average_key_size": 26, "raw_value_size": 17532042, "raw_average_value_size": 1212, "num_data_blocks": 1676, "num_entries": 14465, "num_filter_entries": 14465, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670612, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.126222) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 17859584 bytes
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.128700) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 169.1 rd, 155.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 18.5 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(2657.7) write-amplify(1273.2) OK, records in: 14976, records dropped: 511 output_compression: NoCompression
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.128730) EVENT_LOG_v1 {"time_micros": 1764670612128717, "job": 36, "event": "compaction_finished", "compaction_time_micros": 114822, "compaction_time_cpu_micros": 48423, "output_level": 6, "num_output_files": 1, "total_output_size": 17859584, "num_input_records": 14976, "num_output_records": 14465, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670612128891, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670612131969, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.011086) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.132046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.132052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.132053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.132054) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:52 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:16:52.132055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:16:53 np0005541913.localdomain ceph-mon[298296]: pgmap v692: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 87 KiB/s wr, 5 op/s
Dec 02 10:16:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c4aff2e0-53b6-4b58-8317-036e112a5bcd", "format": "json"}]: dispatch
Dec 02 10:16:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:53 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2188bdca-035f-45be-90c0-127aae7698b7", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:54 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2188bdca-035f-45be-90c0-127aae7698b7", "format": "json"}]: dispatch
Dec 02 10:16:54 np0005541913.localdomain sshd[334954]: Connection reset by 205.210.31.196 port 58208 [preauth]
Dec 02 10:16:55 np0005541913.localdomain ceph-mon[298296]: pgmap v693: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 894 B/s rd, 101 KiB/s wr, 6 op/s
Dec 02 10:16:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4aff2e0-53b6-4b58-8317-036e112a5bcd", "format": "json"}]: dispatch
Dec 02 10:16:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c4aff2e0-53b6-4b58-8317-036e112a5bcd", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "2188bdca-035f-45be-90c0-127aae7698b7", "snap_name": "db7eb361-3904-47e6-9098-d364d08f2cbb", "format": "json"}]: dispatch
Dec 02 10:16:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:56.384 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:16:56.386 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:16:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:16:57 np0005541913.localdomain ceph-mon[298296]: pgmap v694: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 67 KiB/s wr, 3 op/s
Dec 02 10:16:57 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:16:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ce71e0bd-fac0-489e-baae-8568840b81a1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:16:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ce71e0bd-fac0-489e-baae-8568840b81a1", "format": "json"}]: dispatch
Dec 02 10:16:59 np0005541913.localdomain ceph-mon[298296]: pgmap v695: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 67 KiB/s wr, 3 op/s
Dec 02 10:16:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d6f9e4a2-dde5-48d3-9ade-72d59a880bf2", "snap_name": "395e084c-5f31-4d0b-b40b-8a631da3af09_e203339e-4ade-455c-bcd8-fd80834b9e84", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d6f9e4a2-dde5-48d3-9ade-72d59a880bf2", "snap_name": "395e084c-5f31-4d0b-b40b-8a631da3af09", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2188bdca-035f-45be-90c0-127aae7698b7", "snap_name": "db7eb361-3904-47e6-9098-d364d08f2cbb_f3eeec94-da59-4afe-97b3-e80e98f55085", "force": true, "format": "json"}]: dispatch
Dec 02 10:16:59 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2188bdca-035f-45be-90c0-127aae7698b7", "snap_name": "db7eb361-3904-47e6-9098-d364d08f2cbb", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:01 np0005541913.localdomain ceph-mon[298296]: pgmap v696: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 102 KiB/s wr, 6 op/s
Dec 02 10:17:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "ce71e0bd-fac0-489e-baae-8568840b81a1", "snap_name": "7af3a8b2-5504-4261-9144-956137288f3e", "format": "json"}]: dispatch
Dec 02 10:17:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5b0a592a-aac6-453e-a44c-9563c7dadce2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:01 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5b0a592a-aac6-453e-a44c-9563c7dadce2", "format": "json"}]: dispatch
Dec 02 10:17:01 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:17:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:01.386 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:01.392 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:01 np0005541913.localdomain podman[334956]: 2025-12-02 10:17:01.449375136 +0000 UTC m=+0.092217542 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:17:01 np0005541913.localdomain podman[334956]: 2025-12-02 10:17:01.465078827 +0000 UTC m=+0.107921223 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:17:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:01 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:17:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d6f9e4a2-dde5-48d3-9ade-72d59a880bf2", "format": "json"}]: dispatch
Dec 02 10:17:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d6f9e4a2-dde5-48d3-9ade-72d59a880bf2", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2188bdca-035f-45be-90c0-127aae7698b7", "format": "json"}]: dispatch
Dec 02 10:17:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:02.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e279 e279: 6 total, 6 up, 6 in
Dec 02 10:17:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:17:03.060 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:17:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:17:03.061 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:17:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:17:03.062 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:17:03 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2188bdca-035f-45be-90c0-127aae7698b7", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:03 np0005541913.localdomain ceph-mon[298296]: pgmap v697: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 59 KiB/s wr, 4 op/s
Dec 02 10:17:03 np0005541913.localdomain ceph-mon[298296]: osdmap e279: 6 total, 6 up, 6 in
Dec 02 10:17:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:03.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:03.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:17:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:17:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:17:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:17:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:17:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:17:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:17:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:17:04 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8db7e7dd-0638-45f8-969b-0cba743185e7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:05 np0005541913.localdomain ceph-mon[298296]: pgmap v699: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 119 KiB/s wr, 7 op/s
Dec 02 10:17:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8db7e7dd-0638-45f8-969b-0cba743185e7", "format": "json"}]: dispatch
Dec 02 10:17:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "ce71e0bd-fac0-489e-baae-8568840b81a1", "snap_name": "7af3a8b2-5504-4261-9144-956137288f3e", "target_sub_name": "a33ca4d0-df57-473d-9fc9-9e83431eec70", "format": "json"}]: dispatch
Dec 02 10:17:05 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a33ca4d0-df57-473d-9fc9-9e83431eec70", "format": "json"}]: dispatch
Dec 02 10:17:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2864597683' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:17:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2864597683' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:17:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:17:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:17:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:17:05 np0005541913.localdomain systemd[1]: tmp-crun.3Ti2hZ.mount: Deactivated successfully.
Dec 02 10:17:05 np0005541913.localdomain podman[334976]: 2025-12-02 10:17:05.717361147 +0000 UTC m=+0.352302942 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Dec 02 10:17:05 np0005541913.localdomain podman[334977]: 2025-12-02 10:17:05.752587611 +0000 UTC m=+0.383380265 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:17:05 np0005541913.localdomain podman[334977]: 2025-12-02 10:17:05.784359473 +0000 UTC m=+0.415152107 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:17:05 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:17:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:05.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:05.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:17:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:05.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:17:05 np0005541913.localdomain podman[334976]: 2025-12-02 10:17:05.85368415 +0000 UTC m=+0.488625985 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:17:05 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:17:05 np0005541913.localdomain podman[334975]: 2025-12-02 10:17:05.862790275 +0000 UTC m=+0.501805609 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:17:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:05.930 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:17:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:05.930 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:17:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:05.930 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:17:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:05.931 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:17:05 np0005541913.localdomain podman[334975]: 2025-12-02 10:17:05.942470949 +0000 UTC m=+0.581486264 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:17:05 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:17:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:17:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:17:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:17:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:17:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:17:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18784 "" "Go-http-client/1.1"
Dec 02 10:17:06 np0005541913.localdomain ceph-mon[298296]: pgmap v700: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 119 KiB/s wr, 7 op/s
Dec 02 10:17:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:06.397 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:06 np0005541913.localdomain systemd[1]: tmp-crun.5jA9vy.mount: Deactivated successfully.
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.083 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.114 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.115 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.116 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.205 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.206 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.206 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.207 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.207 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:17:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:17:07 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3859412889' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.667 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.770 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:17:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:07.771 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:17:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3859412889' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.006 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.007 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11051MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.007 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.008 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.175 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.176 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.176 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.210 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:17:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:17:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/477887898' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.671 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.677 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.744 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.747 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:17:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:08.748 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.740s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:17:08 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "421f49de-9caa-4e96-8ed7-c70fac5c9582", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:08 np0005541913.localdomain ceph-mon[298296]: pgmap v701: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 119 KiB/s wr, 7 op/s
Dec 02 10:17:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/477887898' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:17:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:09.460 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:17:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 20K writes, 78K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
                                                          Cumulative WAL: 20K writes, 7167 syncs, 2.83 writes per sync, written: 0.06 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 12K writes, 44K keys, 12K commit groups, 1.0 writes per commit group, ingest: 33.24 MB, 0.06 MB/s
                                                          Interval WAL: 12K writes, 5261 syncs, 2.33 writes per sync, written: 0.03 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 10:17:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:10.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:10 np0005541913.localdomain ceph-mon[298296]: pgmap v702: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 120 KiB/s wr, 7 op/s
Dec 02 10:17:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:11.396 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:11.401 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e280 e280: 6 total, 6 up, 6 in
Dec 02 10:17:12 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:17:12 np0005541913.localdomain podman[335077]: 2025-12-02 10:17:12.434110881 +0000 UTC m=+0.074614720 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:17:12 np0005541913.localdomain podman[335077]: 2025-12-02 10:17:12.446099913 +0000 UTC m=+0.086603792 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:17:12 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:17:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:12.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:12 np0005541913.localdomain ceph-mon[298296]: pgmap v703: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 120 KiB/s wr, 7 op/s
Dec 02 10:17:12 np0005541913.localdomain ceph-mon[298296]: osdmap e280: 6 total, 6 up, 6 in
Dec 02 10:17:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3832367767' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:17:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "421f49de-9caa-4e96-8ed7-c70fac5c9582", "format": "json"}]: dispatch
Dec 02 10:17:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:13 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8db7e7dd-0638-45f8-969b-0cba743185e7", "format": "json"}]: dispatch
Dec 02 10:17:13 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8db7e7dd-0638-45f8-969b-0cba743185e7", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:13 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/4112839142' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:17:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:14.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:14 np0005541913.localdomain ceph-mon[298296]: pgmap v705: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 107 KiB/s wr, 6 op/s
Dec 02 10:17:14 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:17:15 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.2 total, 600.0 interval
                                                          Cumulative writes: 24K writes, 93K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 24K writes, 8890 syncs, 2.79 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 14K writes, 53K keys, 14K commit groups, 1.0 writes per commit group, ingest: 39.13 MB, 0.07 MB/s
                                                          Interval WAL: 14K writes, 6185 syncs, 2.36 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 10:17:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:15.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:17:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "421f49de-9caa-4e96-8ed7-c70fac5c9582", "snap_name": "887f67e9-2bf7-45b5-84dd-6cbee4d7656b", "format": "json"}]: dispatch
Dec 02 10:17:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ee9ab43f-a3e3-4447-9084-6b663c27a445", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ee9ab43f-a3e3-4447-9084-6b663c27a445", "format": "json"}]: dispatch
Dec 02 10:17:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bf788ca1-8e50-4d58-9737-f8c6482ff48b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bf788ca1-8e50-4d58-9737-f8c6482ff48b", "format": "json"}]: dispatch
Dec 02 10:17:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:16.399 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:16.401 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:16 np0005541913.localdomain ceph-mon[298296]: pgmap v706: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 107 KiB/s wr, 6 op/s
Dec 02 10:17:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:17:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:17:17 np0005541913.localdomain systemd[1]: tmp-crun.jZaVWy.mount: Deactivated successfully.
Dec 02 10:17:17 np0005541913.localdomain podman[335098]: 2025-12-02 10:17:17.462164581 +0000 UTC m=+0.099290141 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:17:17 np0005541913.localdomain podman[335097]: 2025-12-02 10:17:17.54832231 +0000 UTC m=+0.189459028 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:17:17 np0005541913.localdomain podman[335098]: 2025-12-02 10:17:17.579953677 +0000 UTC m=+0.217079247 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 02 10:17:17 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:17:17 np0005541913.localdomain podman[335097]: 2025-12-02 10:17:17.635905797 +0000 UTC m=+0.277042515 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:17:17 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:17:18 np0005541913.localdomain ceph-mon[298296]: pgmap v707: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 107 KiB/s wr, 6 op/s
Dec 02 10:17:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "421f49de-9caa-4e96-8ed7-c70fac5c9582", "snap_name": "887f67e9-2bf7-45b5-84dd-6cbee4d7656b_adf6c68d-e84b-4410-ac7f-adf3a353b05d", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "421f49de-9caa-4e96-8ed7-c70fac5c9582", "snap_name": "887f67e9-2bf7-45b5-84dd-6cbee4d7656b", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ee9ab43f-a3e3-4447-9084-6b663c27a445", "format": "json"}]: dispatch
Dec 02 10:17:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ee9ab43f-a3e3-4447-9084-6b663c27a445", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:19 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "bf788ca1-8e50-4d58-9737-f8c6482ff48b", "snap_name": "04e237e6-bdd1-4932-bf28-2abac8ca1d29", "format": "json"}]: dispatch
Dec 02 10:17:21 np0005541913.localdomain ceph-mon[298296]: pgmap v708: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 103 KiB/s wr, 6 op/s
Dec 02 10:17:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:21.402 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:17:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:21.404 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:21.404 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:17:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:21.404 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:17:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:21.405 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:17:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:21.407 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "421f49de-9caa-4e96-8ed7-c70fac5c9582", "format": "json"}]: dispatch
Dec 02 10:17:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "421f49de-9caa-4e96-8ed7-c70fac5c9582", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:22 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2488539184' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:17:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:23 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e281 e281: 6 total, 6 up, 6 in
Dec 02 10:17:23 np0005541913.localdomain ceph-mon[298296]: pgmap v709: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 103 KiB/s wr, 6 op/s
Dec 02 10:17:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b8feab34-30e1-4504-93e1-fee137b334fd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:23 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b8feab34-30e1-4504-93e1-fee137b334fd", "format": "json"}]: dispatch
Dec 02 10:17:23 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/852769459' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:17:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "bf788ca1-8e50-4d58-9737-f8c6482ff48b", "snap_name": "04e237e6-bdd1-4932-bf28-2abac8ca1d29_15204c8c-795d-4343-829f-29f32d779260", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:24 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "bf788ca1-8e50-4d58-9737-f8c6482ff48b", "snap_name": "04e237e6-bdd1-4932-bf28-2abac8ca1d29", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:24 np0005541913.localdomain ceph-mon[298296]: osdmap e281: 6 total, 6 up, 6 in
Dec 02 10:17:25 np0005541913.localdomain ceph-mon[298296]: pgmap v711: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 123 KiB/s wr, 7 op/s
Dec 02 10:17:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:26.405 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b8feab34-30e1-4504-93e1-fee137b334fd", "format": "json"}]: dispatch
Dec 02 10:17:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b8feab34-30e1-4504-93e1-fee137b334fd", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:27 np0005541913.localdomain ceph-mon[298296]: pgmap v712: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 123 KiB/s wr, 7 op/s
Dec 02 10:17:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bf788ca1-8e50-4d58-9737-f8c6482ff48b", "format": "json"}]: dispatch
Dec 02 10:17:27 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bf788ca1-8e50-4d58-9737-f8c6482ff48b", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:27 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e282 e282: 6 total, 6 up, 6 in
Dec 02 10:17:28 np0005541913.localdomain ceph-mon[298296]: pgmap v713: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 123 KiB/s wr, 7 op/s
Dec 02 10:17:28 np0005541913.localdomain ceph-mon[298296]: osdmap e282: 6 total, 6 up, 6 in
Dec 02 10:17:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "37ec89f0-b485-493a-a6e2-4d54629ab0d1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:29 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "37ec89f0-b485-493a-a6e2-4d54629ab0d1", "format": "json"}]: dispatch
Dec 02 10:17:30 np0005541913.localdomain ceph-mon[298296]: pgmap v715: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 150 KiB/s wr, 9 op/s
Dec 02 10:17:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:31.409 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:17:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:31.411 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:17:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:31.412 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:17:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:31.412 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:17:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:31.453 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:31.454 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:17:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e283 e283: 6 total, 6 up, 6 in
Dec 02 10:17:32 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:17:32 np0005541913.localdomain systemd[1]: tmp-crun.ofbFxu.mount: Deactivated successfully.
Dec 02 10:17:32 np0005541913.localdomain podman[335147]: 2025-12-02 10:17:32.4632906 +0000 UTC m=+0.095621273 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 02 10:17:32 np0005541913.localdomain podman[335147]: 2025-12-02 10:17:32.476872725 +0000 UTC m=+0.109203388 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 02 10:17:32 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:17:32 np0005541913.localdomain ceph-mon[298296]: pgmap v716: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 45 KiB/s wr, 4 op/s
Dec 02 10:17:32 np0005541913.localdomain ceph-mon[298296]: osdmap e283: 6 total, 6 up, 6 in
Dec 02 10:17:33 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "37ec89f0-b485-493a-a6e2-4d54629ab0d1", "format": "json"}]: dispatch
Dec 02 10:17:33 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "37ec89f0-b485-493a-a6e2-4d54629ab0d1", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:17:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:17:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:17:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:17:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:17:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:17:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:17:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:17:34 np0005541913.localdomain ceph-mon[298296]: pgmap v718: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 96 KiB/s wr, 6 op/s
Dec 02 10:17:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:17:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:17:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:17:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:17:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:17:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18786 "" "Go-http-client/1.1"
Dec 02 10:17:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:17:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:17:36 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:17:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:36.456 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:36.460 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:36 np0005541913.localdomain systemd[1]: tmp-crun.JeBZfQ.mount: Deactivated successfully.
Dec 02 10:17:36 np0005541913.localdomain podman[335167]: 2025-12-02 10:17:36.468208666 +0000 UTC m=+0.096510108 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Dec 02 10:17:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:36 np0005541913.localdomain podman[335166]: 2025-12-02 10:17:36.508239688 +0000 UTC m=+0.137217458 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 02 10:17:36 np0005541913.localdomain podman[335166]: 2025-12-02 10:17:36.542220498 +0000 UTC m=+0.171198308 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 02 10:17:36 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:17:36 np0005541913.localdomain podman[335168]: 2025-12-02 10:17:36.565396779 +0000 UTC m=+0.192685673 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:17:36 np0005541913.localdomain podman[335167]: 2025-12-02 10:17:36.579723774 +0000 UTC m=+0.208025156 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc.)
Dec 02 10:17:36 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:17:36 np0005541913.localdomain podman[335168]: 2025-12-02 10:17:36.600035108 +0000 UTC m=+0.227323982 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:17:36 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:17:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 e284: 6 total, 6 up, 6 in
Dec 02 10:17:36 np0005541913.localdomain ceph-mon[298296]: pgmap v719: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 96 KiB/s wr, 6 op/s
Dec 02 10:17:36 np0005541913.localdomain ceph-mon[298296]: osdmap e284: 6 total, 6 up, 6 in
Dec 02 10:17:37 np0005541913.localdomain sudo[335224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:17:37 np0005541913.localdomain sudo[335224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:17:37 np0005541913.localdomain sudo[335224]: pam_unix(sudo:session): session closed for user root
Dec 02 10:17:37 np0005541913.localdomain sudo[335242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:17:37 np0005541913.localdomain sudo[335242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:17:37 np0005541913.localdomain sudo[335242]: pam_unix(sudo:session): session closed for user root
Dec 02 10:17:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:17:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:17:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:17:37 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:37 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:17:38 np0005541913.localdomain sudo[335291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:17:38 np0005541913.localdomain sudo[335291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:17:38 np0005541913.localdomain sudo[335291]: pam_unix(sudo:session): session closed for user root
Dec 02 10:17:39 np0005541913.localdomain ceph-mon[298296]: pgmap v721: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 51 KiB/s wr, 2 op/s
Dec 02 10:17:39 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2826f8b0-e859-462c-8596-fb04c439e342", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:39 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2826f8b0-e859-462c-8596-fb04c439e342", "format": "json"}]: dispatch
Dec 02 10:17:41 np0005541913.localdomain ceph-mon[298296]: pgmap v722: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 91 KiB/s wr, 5 op/s
Dec 02 10:17:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:41.462 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:17:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:41.465 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:17:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:41.465 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:17:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:41.465 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:17:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:41.493 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:41.495 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:17:43 np0005541913.localdomain ceph-mon[298296]: pgmap v723: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 434 B/s rd, 78 KiB/s wr, 4 op/s
Dec 02 10:17:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:17:43 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:17:43 np0005541913.localdomain podman[335309]: 2025-12-02 10:17:43.430873708 +0000 UTC m=+0.071032904 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:17:43 np0005541913.localdomain podman[335309]: 2025-12-02 10:17:43.438988585 +0000 UTC m=+0.079147751 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible)
Dec 02 10:17:43 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:17:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2826f8b0-e859-462c-8596-fb04c439e342", "format": "json"}]: dispatch
Dec 02 10:17:44 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2826f8b0-e859-462c-8596-fb04c439e342", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:45 np0005541913.localdomain ceph-mon[298296]: pgmap v724: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 70 KiB/s wr, 3 op/s
Dec 02 10:17:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:46.494 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:46.496 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:47 np0005541913.localdomain ceph-mon[298296]: pgmap v725: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 70 KiB/s wr, 3 op/s
Dec 02 10:17:48 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5b0a592a-aac6-453e-a44c-9563c7dadce2", "format": "json"}]: dispatch
Dec 02 10:17:48 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5b0a592a-aac6-453e-a44c-9563c7dadce2", "force": true, "format": "json"}]: dispatch
Dec 02 10:17:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:17:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:17:48 np0005541913.localdomain podman[335330]: 2025-12-02 10:17:48.447496893 +0000 UTC m=+0.083441298 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 02 10:17:48 np0005541913.localdomain podman[335329]: 2025-12-02 10:17:48.488701697 +0000 UTC m=+0.128671909 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:17:48 np0005541913.localdomain podman[335329]: 2025-12-02 10:17:48.503815022 +0000 UTC m=+0.143785224 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:17:48 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:17:48 np0005541913.localdomain podman[335330]: 2025-12-02 10:17:48.541185833 +0000 UTC m=+0.177130298 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 10:17:48 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:17:49 np0005541913.localdomain ceph-mon[298296]: pgmap v726: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 196 B/s rd, 67 KiB/s wr, 2 op/s
Dec 02 10:17:51 np0005541913.localdomain ceph-mon[298296]: pgmap v727: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 78 KiB/s wr, 3 op/s
Dec 02 10:17:51 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a33ca4d0-df57-473d-9fc9-9e83431eec70", "format": "json"}]: dispatch
Dec 02 10:17:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:51.497 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:17:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:51.499 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:17:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:51.499 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:17:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:51.499 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:17:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:51.536 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:51.537 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:17:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:51.712 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:17:51.714 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:17:51 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:17:51.715 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:17:52 np0005541913.localdomain ceph-mon[298296]: pgmap v728: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 51 KiB/s wr, 2 op/s
Dec 02 10:17:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a33ca4d0-df57-473d-9fc9-9e83431eec70", "format": "json"}]: dispatch
Dec 02 10:17:52 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:53 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:17:53.718 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:17:54 np0005541913.localdomain ceph-mon[298296]: pgmap v729: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 74 KiB/s wr, 3 op/s
Dec 02 10:17:55 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:17:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:17:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:56.538 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:17:56.539 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:17:56 np0005541913.localdomain ceph-mon[298296]: pgmap v730: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 43 KiB/s wr, 2 op/s
Dec 02 10:17:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e32efa73-156a-46e5-a7b8-279ab8d48b0b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:56 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e32efa73-156a-46e5-a7b8-279ab8d48b0b", "format": "json"}]: dispatch
Dec 02 10:17:58 np0005541913.localdomain ceph-mon[298296]: pgmap v731: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 43 KiB/s wr, 2 op/s
Dec 02 10:17:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "79a413b5-c28c-47ee-83ea-fa37bb286785", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:17:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "79a413b5-c28c-47ee-83ea-fa37bb286785", "format": "json"}]: dispatch
Dec 02 10:17:58 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:18:00 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "e32efa73-156a-46e5-a7b8-279ab8d48b0b", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:01.541 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:18:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:01.542 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:18:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:01.542 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:18:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:01.543 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:18:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:01.577 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:01.578 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: pgmap v732: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 65 KiB/s wr, 3 op/s
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:01.927411) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670681927487, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1295, "num_deletes": 259, "total_data_size": 1454330, "memory_usage": 1481008, "flush_reason": "Manual Compaction"}
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670681944795, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 950051, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37410, "largest_seqno": 38700, "table_properties": {"data_size": 944687, "index_size": 2707, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13018, "raw_average_key_size": 20, "raw_value_size": 933281, "raw_average_value_size": 1472, "num_data_blocks": 118, "num_entries": 634, "num_filter_entries": 634, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670612, "oldest_key_time": 1764670612, "file_creation_time": 1764670681, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 17441 microseconds, and 4221 cpu microseconds.
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:01.944859) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 950051 bytes OK
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:01.944882) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:01.946902) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:01.946923) EVENT_LOG_v1 {"time_micros": 1764670681946916, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:01.946945) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 1447898, prev total WAL file size 1448222, number of live WAL files 2.
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:01.947685) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353233' seq:72057594037927935, type:22 .. '6C6F676D0034373735' seq:0, type:0; will stop at (end)
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(927KB)], [63(17MB)]
Dec 02 10:18:01 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670681947743, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 18809635, "oldest_snapshot_seqno": -1}
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 14561 keys, 18691478 bytes, temperature: kUnknown
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670682050305, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 18691478, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18606751, "index_size": 47250, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36421, "raw_key_size": 390557, "raw_average_key_size": 26, "raw_value_size": 18357911, "raw_average_value_size": 1260, "num_data_blocks": 1769, "num_entries": 14561, "num_filter_entries": 14561, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670681, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:02.050734) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 18691478 bytes
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:02.052274) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.2 rd, 182.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 17.0 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(39.5) write-amplify(19.7) OK, records in: 15099, records dropped: 538 output_compression: NoCompression
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:02.052304) EVENT_LOG_v1 {"time_micros": 1764670682052291, "job": 38, "event": "compaction_finished", "compaction_time_micros": 102658, "compaction_time_cpu_micros": 54729, "output_level": 6, "num_output_files": 1, "total_output_size": 18691478, "num_input_records": 15099, "num_output_records": 14561, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670682052577, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670682055127, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:01.947510) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:02.055203) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:02.055209) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:02.055212) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:02.055215) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:18:02.055217) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:18:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:02.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: pgmap v733: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 45 KiB/s wr, 2 op/s
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e32efa73-156a-46e5-a7b8-279ab8d48b0b", "format": "json"}]: dispatch
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e32efa73-156a-46e5-a7b8-279ab8d48b0b", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "79a413b5-c28c-47ee-83ea-fa37bb286785", "format": "json"}]: dispatch
Dec 02 10:18:02 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "79a413b5-c28c-47ee-83ea-fa37bb286785", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:03.061 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:18:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:03.061 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:18:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:03.062 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:18:03 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:18:03 np0005541913.localdomain podman[335375]: 2025-12-02 10:18:03.437924543 +0000 UTC m=+0.075113884 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:18:03 np0005541913.localdomain podman[335375]: 2025-12-02 10:18:03.452087662 +0000 UTC m=+0.089277033 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Dec 02 10:18:03 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:18:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:18:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:18:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:18:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:18:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:18:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:18:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:18:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:04.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:04.827 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:18:04 np0005541913.localdomain ceph-mon[298296]: pgmap v734: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 89 KiB/s wr, 3 op/s
Dec 02 10:18:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/815784916' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:18:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/815784916' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:18:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:18:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:18:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:18:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:18:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:18:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18789 "" "Go-http-client/1.1"
Dec 02 10:18:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:06.578 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:06.580 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:06.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:06.852 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:18:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:06.853 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:18:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:06.853 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:18:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:06.854 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:18:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:06.854 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:18:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:18:07 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2957551482' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.290 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:18:07 np0005541913.localdomain ceph-mon[298296]: pgmap v735: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s wr, 2 op/s
Dec 02 10:18:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3c3c7fb5-6d8f-4a5a-ae27-22b86bd4eddc", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:18:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3c3c7fb5-6d8f-4a5a-ae27-22b86bd4eddc", "format": "json"}]: dispatch
Dec 02 10:18:07 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:18:07 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2957551482' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.351 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.352 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:18:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:18:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:18:07 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:18:07 np0005541913.localdomain podman[335415]: 2025-12-02 10:18:07.459458502 +0000 UTC m=+0.088078422 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 10:18:07 np0005541913.localdomain podman[335417]: 2025-12-02 10:18:07.502564547 +0000 UTC m=+0.119200156 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:18:07 np0005541913.localdomain podman[335417]: 2025-12-02 10:18:07.516537051 +0000 UTC m=+0.133172670 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:18:07 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:18:07 np0005541913.localdomain podman[335415]: 2025-12-02 10:18:07.590533274 +0000 UTC m=+0.219153134 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:18:07 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.620 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.622 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11026MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:18:07 np0005541913.localdomain podman[335416]: 2025-12-02 10:18:07.570861507 +0000 UTC m=+0.196833296 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.622 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.622 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:18:07 np0005541913.localdomain podman[335416]: 2025-12-02 10:18:07.651076647 +0000 UTC m=+0.277048396 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, version=9.6, vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Dec 02 10:18:07 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.967 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.968 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:18:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:07.968 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:18:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:08.142 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:18:08 np0005541913.localdomain systemd[1]: tmp-crun.84wnyh.mount: Deactivated successfully.
Dec 02 10:18:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:18:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/893749530' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:18:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:08.571 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:18:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:08.579 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:18:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:08.594 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:18:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:08.597 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:18:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:08.598 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:18:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:08.599 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:08.600 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 02 10:18:08 np0005541913.localdomain ceph-mon[298296]: pgmap v736: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s wr, 2 op/s
Dec 02 10:18:08 np0005541913.localdomain ceph-mon[298296]: mgrmap e52: np0005541914.lljzmk(active, since 18m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:18:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/893749530' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:18:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:09.614 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:09.614 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:18:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:09.615 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:18:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:10.169 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:18:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:10.170 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:18:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:10.170 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:18:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:10.170 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:18:10 np0005541913.localdomain ceph-mon[298296]: pgmap v737: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 82 KiB/s wr, 4 op/s
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.325 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.366 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.367 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.367 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.368 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.368 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.369 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.391 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 02 10:18:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.581 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:18:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:11.584 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:12 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "3c3c7fb5-6d8f-4a5a-ae27-22b86bd4eddc", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Dec 02 10:18:13 np0005541913.localdomain ceph-mon[298296]: pgmap v738: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 60 KiB/s wr, 3 op/s
Dec 02 10:18:14 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3070259623' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:18:14 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:18:14 np0005541913.localdomain podman[335495]: 2025-12-02 10:18:14.439676688 +0000 UTC m=+0.080855578 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:18:14 np0005541913.localdomain podman[335495]: 2025-12-02 10:18:14.45505575 +0000 UTC m=+0.096234620 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 10:18:14 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:18:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:14.850 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:15 np0005541913.localdomain ceph-mon[298296]: pgmap v739: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 84 KiB/s wr, 4 op/s
Dec 02 10:18:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3c3c7fb5-6d8f-4a5a-ae27-22b86bd4eddc", "format": "json"}]: dispatch
Dec 02 10:18:15 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3c3c7fb5-6d8f-4a5a-ae27-22b86bd4eddc", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:15.822 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:15.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3788442662' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.108 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.113 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d5af964-28d1-433f-916a-a31b8791db9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.109677', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '3760f804-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '268c340db948d721a5d3a62dd0b5031f1ac3ac9658dec4892500d2ed722a2d33'}]}, 'timestamp': '2025-12-02 10:18:16.113885', '_unique_id': '85848d0cfe2a427abd9127e84640232f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.115 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.116 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2126dc2-4a02-46d2-80e9-b74d45c0b9b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.116915', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '376184d6-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '223e1273451caa5ee240838c986b36d2d247b9f8cd14976b4bc1cf482a1cb1b6'}]}, 'timestamp': '2025-12-02 10:18:16.117407', '_unique_id': 'ab8a2d9e03f24736bf050272fa2d991e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.118 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.119 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31ed3d5c-68b4-4846-b945-f654bc2ccda0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.119596', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '3761ef7a-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '67ad3078d92a3ee0d11e1bd7f53c65227eae86a5af8c6540da1570b4dc78c04c'}]}, 'timestamp': '2025-12-02 10:18:16.120134', '_unique_id': '6758d981b6514c07ac6f18a56f83ec0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.121 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.130 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.131 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e161097-4641-424c-b211-2d050ef56910', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.122355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '37639fd2-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.341444862, 'message_signature': 'd9b14cb06cb97952d96b37d8c72e6fd1a96c5bd3f2791958880b6679cb7cc2ed'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.122355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3763b2c4-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.341444862, 'message_signature': 'deb56e796fc87f3c74b04c7cb3f515ec8ef0c947004464a16dd34734daf1c56b'}]}, 'timestamp': '2025-12-02 10:18:16.131697', '_unique_id': 'cf7202307f6e4eb79fbd49a26c1ff6aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.133 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.134 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f279187-c2fe-461f-b963-4a11b1853004', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.134458', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '37643280-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '758ab3c85345f63d5e8079b5b5a2936a4f92ae84dbd776710c4764243baf6cf9'}]}, 'timestamp': '2025-12-02 10:18:16.134957', '_unique_id': '90dcde9a63194c7c8c0b9fc220414a49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.135 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.137 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.137 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c0f0a78-42ab-4b18-93d4-d2b4d1d46c6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.137382', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '3764a486-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '4dba3d4871f9ecccae8cfba09d9930bb4ab8d69fe74986b40478fe188c48f01d'}]}, 'timestamp': '2025-12-02 10:18:16.137915', '_unique_id': '77f493025ed4438fbb126500b09d505d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.138 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.140 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edb46fe6-95aa-4fad-bf99-3546779087de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.140023', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '37650a16-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '9a6d2309a33d515f00f80bf8c6eeeb133df124c92cf245b0f8e37e85338df8c4'}]}, 'timestamp': '2025-12-02 10:18:16.140468', '_unique_id': '0ac4685da7994131a26b9a3f8b0887ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.141 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.142 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.142 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.159 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 20910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '823d50b8-54d9-4f7a-b2dd-1981959a1109', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20910000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:18:16.142872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '376803e2-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.378402722, 'message_signature': '63626773f9c24823e083ab680b26b3eac2a69e27893bb6b12744a6c1f7ad5ec3'}]}, 'timestamp': '2025-12-02 10:18:16.160042', '_unique_id': 'ccffb23b0c5347dc9695710f84d86705'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.161 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.188 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ceedc82-275c-4ebb-aa62-1436c6115e36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.162372', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '376c6d9c-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': 'bba7b5e4ebb33c53cbbc6f3bb0da9c148562a698928cb2be5cfaf291d6aa154a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.162372', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '376c7fc6-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '8c898dfc08d3d68458b9d7ccbcb1f91955c70d8eb11dc2be6e2e4d277433ab49'}]}, 'timestamp': '2025-12-02 10:18:16.189338', '_unique_id': '26beb9ff54aa442e9e01330395e11aa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.190 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.191 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e92d28c7-56a8-49dc-b505-4f442be437a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.191758', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '376cefec-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '864efdc807605adbf0caef4394a693735a42c6910f87340cc30eece309d72115'}]}, 'timestamp': '2025-12-02 10:18:16.192242', '_unique_id': '7b0e5cdb9d524320a74677e0e991b34d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.193 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.194 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'deccbb8f-c0ab-4e61-9de0-c565ed508f20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.194429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '376d5bd0-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '0c3cda493dac94d7f950024ce8887a06c4d20d78dbc45f6bc7272ea5f5367acf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.194429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '376d6b8e-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '883aa2cdfb95837d17a9747e35efc4343cfa76284815ddd033ce5616ef0efca5'}]}, 'timestamp': '2025-12-02 10:18:16.195361', '_unique_id': 'a402e34047da45a79670504cd4239b03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.196 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.197 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.198 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e458cbd6-fbc4-45c9-8c29-cd5f6dfe6940', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.197548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '376dd358-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.341444862, 'message_signature': 'fa4ae2bbc5e780a02c85e761d4e7cd56323d3ce2207958f5749decb1a4dd2434'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.197548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '376de35c-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.341444862, 'message_signature': '5ec30a121133fba7c436ac12851168e07eb0fdd633684f9da2f92e737f983f2a'}]}, 'timestamp': '2025-12-02 10:18:16.198433', '_unique_id': '57954d7d28364ad1975f40b1e56689cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.199 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.200 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.200 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.201 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c723b71-6e6a-4469-a5e4-07692e7ff67a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.200719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '376e4d7e-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '8f3647a950d7d16e6c2a19e8e48ff7cbf718dc213d384dc091d06db4bcd26980'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.200719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '376e5d50-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '45b0f380df6f615d58bcae7dfa303143149c29ab11d8e44708d85a530a33373c'}]}, 'timestamp': '2025-12-02 10:18:16.201551', '_unique_id': 'a86cddf03e2b4dae8d623cc18b5355f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.202 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.203 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfb9911c-3b8f-439f-a068-6a7a91da409c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:18:16.203779', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '376ec4de-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.378402722, 'message_signature': '2810913b054a313d5d43263b7b94f5e9474752f033da71e1f020e47aa72103b6'}]}, 'timestamp': '2025-12-02 10:18:16.204218', '_unique_id': '624ed98cc35e42f9aa7fc44daaf6e4b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.205 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.206 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6654bae5-dde0-4357-8687-a2505192fab8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.206490', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '376f3040-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '55b49840f84bad892cec6dc4a08e42485387d590b14ac13b4848cfcb6877e460'}]}, 'timestamp': '2025-12-02 10:18:16.206985', '_unique_id': '01fdfd73cf164c6c8d97b056e85487f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ff61568-8176-4168-b406-656dfe002591', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.209096', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '376f9472-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': '80bd2985cd16f20de0b50f677f7ce59f1f1320b2f0f398a6bcdaa67e133dede5'}]}, 'timestamp': '2025-12-02 10:18:16.209547', '_unique_id': '3a6cc3f2bba649f1baa5dbd674e3a32f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.211 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d27014a-ccae-47e3-b5a8-4df83b3badfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.211959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3770040c-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': 'd354eb6b3fc739baaed936a94e5d14d4650140842f333ae75064b7024d33d503'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.211959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '377013b6-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '49cd15f6e2d04df3a6760d1703de2a8113ac35f81e8e380ceb89ef30410365ab'}]}, 'timestamp': '2025-12-02 10:18:16.212831', '_unique_id': 'f0a26cab99a94f1987ea986327da12be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.214 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd86fb09d-1be2-4bac-86d5-9f2c75e07648', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.214934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3770784c-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': 'f55b946588ce1c12b940a7fd7bb4b1dc5d564653ff474020f92eb565b1ad9158'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.214934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3770881e-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '981c50451e0055de9794df84c407cf37447998035baab0f2d0fcd584717b1b09'}]}, 'timestamp': '2025-12-02 10:18:16.215786', '_unique_id': '540d4863256549d79a746746c3649795'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.216 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.217 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.218 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd2dc2ea-27ff-40a5-85ea-8258423541c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.217906', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3770ec3c-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '9b9a1a72759803ef5289d612b348ee7a5881153dd583e67f4c8b55b262dcf81f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.217906', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3770fbe6-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.381449873, 'message_signature': '34dfd98638bc83176d06d837e668f5e56997fe29fdfb3d12450502120a30386c'}]}, 'timestamp': '2025-12-02 10:18:16.218747', '_unique_id': 'b90b25f23526484da888d48acc5b8eed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.219 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.220 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ed24b17-dd61-4d1a-b74e-244e28882882', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:18:16.220867', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '37716040-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.328757291, 'message_signature': 'e26f1be064159ff383e0b26bea1d915d56d027c6b6f77b42a046111877229b35'}]}, 'timestamp': '2025-12-02 10:18:16.221318', '_unique_id': '8b858fe706e147efa024f71b8535cb97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.222 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.223 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92ba1599-b072-4834-a915-4a4595c9c594', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:18:16.223373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3771c1ca-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.341444862, 'message_signature': '7d21e960c6408d4941010b4712c9a48654a4c7677b12e7fc6c8c1c72b48700d7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:18:16.223373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3771d2a0-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12858.341444862, 'message_signature': 'f7e58f091a7878d2181c631115cd67682abed6fa825754cff3017003f9e9f6ca'}]}, 'timestamp': '2025-12-02 10:18:16.224219', '_unique_id': '9d2f29eb8ff64e26993fac56386f8508'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:18:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:18:16.225 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:18:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:16.585 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:16.587 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:17 np0005541913.localdomain ceph-mon[298296]: pgmap v740: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 40 KiB/s wr, 2 op/s
Dec 02 10:18:17 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:18:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:17.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:17.899 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "20f292f4-5867-4407-9e49-afe0674f9a28", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:18:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "20f292f4-5867-4407-9e49-afe0674f9a28", "format": "json"}]: dispatch
Dec 02 10:18:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b888758a-b516-4f6f-a2a7-c3912230af77", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 02 10:18:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b888758a-b516-4f6f-a2a7-c3912230af77", "format": "json"}]: dispatch
Dec 02 10:18:18 np0005541913.localdomain ceph-mon[298296]: from='client.15678 172.18.0.34:0/1767085859' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 02 10:18:19 np0005541913.localdomain ceph-mon[298296]: pgmap v741: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 40 KiB/s wr, 2 op/s
Dec 02 10:18:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:18:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:18:19 np0005541913.localdomain podman[335514]: 2025-12-02 10:18:19.439929883 +0000 UTC m=+0.077663242 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:18:19 np0005541913.localdomain podman[335515]: 2025-12-02 10:18:19.500512277 +0000 UTC m=+0.134714561 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Dec 02 10:18:19 np0005541913.localdomain podman[335514]: 2025-12-02 10:18:19.520756039 +0000 UTC m=+0.158489408 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:18:19 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:18:19 np0005541913.localdomain podman[335515]: 2025-12-02 10:18:19.54203498 +0000 UTC m=+0.176237284 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:18:19 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:18:21 np0005541913.localdomain ceph-mon[298296]: pgmap v742: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 72 KiB/s wr, 4 op/s
Dec 02 10:18:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:21.587 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:21.590 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "20f292f4-5867-4407-9e49-afe0674f9a28", "format": "json"}]: dispatch
Dec 02 10:18:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "20f292f4-5867-4407-9e49-afe0674f9a28", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b888758a-b516-4f6f-a2a7-c3912230af77", "format": "json"}]: dispatch
Dec 02 10:18:22 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b888758a-b516-4f6f-a2a7-c3912230af77", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:23 np0005541913.localdomain ceph-mon[298296]: pgmap v743: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 56 KiB/s wr, 2 op/s
Dec 02 10:18:24 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/59618546' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:18:25 np0005541913.localdomain ceph-mon[298296]: pgmap v744: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 98 KiB/s wr, 4 op/s
Dec 02 10:18:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:26.589 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:26.593 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:27 np0005541913.localdomain ceph-mon[298296]: pgmap v745: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 74 KiB/s wr, 3 op/s
Dec 02 10:18:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a33ca4d0-df57-473d-9fc9-9e83431eec70", "format": "json"}]: dispatch
Dec 02 10:18:28 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a33ca4d0-df57-473d-9fc9-9e83431eec70", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:28 np0005541913.localdomain ceph-mon[298296]: pgmap v746: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 74 KiB/s wr, 3 op/s
Dec 02 10:18:29 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3641618941' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:18:30 np0005541913.localdomain ceph-mon[298296]: pgmap v747: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 85 KiB/s wr, 4 op/s
Dec 02 10:18:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ce71e0bd-fac0-489e-baae-8568840b81a1", "snap_name": "7af3a8b2-5504-4261-9144-956137288f3e_fb2677f6-3453-4240-a85b-11d96bc9c80e", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:31 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ce71e0bd-fac0-489e-baae-8568840b81a1", "snap_name": "7af3a8b2-5504-4261-9144-956137288f3e", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:31.592 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:31.596 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:32.920 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:32.946 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Triggering sync for uuid b254bb7f-2891-4b37-9c44-9700e301ce16 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 02 10:18:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:32.947 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "b254bb7f-2891-4b37-9c44-9700e301ce16" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:18:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:32.947 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:18:32 np0005541913.localdomain ceph-mon[298296]: pgmap v748: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 53 KiB/s wr, 2 op/s
Dec 02 10:18:32 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:32.979 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "b254bb7f-2891-4b37-9c44-9700e301ce16" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:18:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:18:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:18:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:18:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:18:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:18:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:18:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:18:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:18:34 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:18:34 np0005541913.localdomain podman[335562]: 2025-12-02 10:18:34.444672046 +0000 UTC m=+0.083556759 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:18:34 np0005541913.localdomain podman[335562]: 2025-12-02 10:18:34.485340987 +0000 UTC m=+0.124225630 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:18:34 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:18:34 np0005541913.localdomain ceph-mon[298296]: pgmap v749: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 71 KiB/s wr, 5 op/s
Dec 02 10:18:34 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ce71e0bd-fac0-489e-baae-8568840b81a1", "format": "json"}]: dispatch
Dec 02 10:18:34 np0005541913.localdomain ceph-mon[298296]: from='client.15678 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ce71e0bd-fac0-489e-baae-8568840b81a1", "force": true, "format": "json"}]: dispatch
Dec 02 10:18:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:18:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:18:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:18:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:18:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:18:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18778 "" "Go-http-client/1.1"
Dec 02 10:18:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:36.595 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:36.598 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:36.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:18:36 np0005541913.localdomain ceph-mon[298296]: pgmap v750: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 28 KiB/s wr, 3 op/s
Dec 02 10:18:38 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e285 e285: 6 total, 6 up, 6 in
Dec 02 10:18:38 np0005541913.localdomain sudo[335581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:18:38 np0005541913.localdomain sudo[335581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:18:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:18:38 np0005541913.localdomain sudo[335581]: pam_unix(sudo:session): session closed for user root
Dec 02 10:18:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:18:38 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:18:38 np0005541913.localdomain sudo[335624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:18:38 np0005541913.localdomain podman[335600]: 2025-12-02 10:18:38.380976349 +0000 UTC m=+0.093017223 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Dec 02 10:18:38 np0005541913.localdomain sudo[335624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:18:38 np0005541913.localdomain podman[335600]: 2025-12-02 10:18:38.394786229 +0000 UTC m=+0.106827113 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64)
Dec 02 10:18:38 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:18:38 np0005541913.localdomain systemd[1]: tmp-crun.NwLMHE.mount: Deactivated successfully.
Dec 02 10:18:38 np0005541913.localdomain podman[335599]: 2025-12-02 10:18:38.506086412 +0000 UTC m=+0.219694768 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 02 10:18:38 np0005541913.localdomain podman[335601]: 2025-12-02 10:18:38.463673876 +0000 UTC m=+0.171887607 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:18:38 np0005541913.localdomain podman[335599]: 2025-12-02 10:18:38.540082323 +0000 UTC m=+0.253690669 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 02 10:18:38 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:18:38 np0005541913.localdomain podman[335601]: 2025-12-02 10:18:38.597712468 +0000 UTC m=+0.305926199 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:18:38 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:18:38 np0005541913.localdomain sudo[335624]: pam_unix(sudo:session): session closed for user root
Dec 02 10:18:39 np0005541913.localdomain ceph-mon[298296]: pgmap v751: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 28 KiB/s wr, 3 op/s
Dec 02 10:18:39 np0005541913.localdomain ceph-mon[298296]: mgrmap e53: np0005541914.lljzmk(active, since 18m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:18:39 np0005541913.localdomain ceph-mon[298296]: osdmap e285: 6 total, 6 up, 6 in
Dec 02 10:18:39 np0005541913.localdomain sudo[335709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:18:39 np0005541913.localdomain sudo[335709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:18:39 np0005541913.localdomain sudo[335709]: pam_unix(sudo:session): session closed for user root
Dec 02 10:18:39 np0005541913.localdomain sudo[335727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid c7c8e171-a193-56fb-95fa-8879fcfa7074 -- inventory --format=json-pretty --filter-for-batch
Dec 02 10:18:39 np0005541913.localdomain sudo[335727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:18:39 np0005541913.localdomain systemd[1]: tmp-crun.Q11UaI.mount: Deactivated successfully.
Dec 02 10:18:39 np0005541913.localdomain podman[335786]: 
Dec 02 10:18:39 np0005541913.localdomain podman[335786]: 2025-12-02 10:18:39.910996292 +0000 UTC m=+0.068114177 container create 60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_wescoff, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, name=rhceph, io.buildah.version=1.41.4, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 02 10:18:39 np0005541913.localdomain systemd[1]: Started libpod-conmon-60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215.scope.
Dec 02 10:18:39 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:18:39 np0005541913.localdomain podman[335786]: 2025-12-02 10:18:39.879995121 +0000 UTC m=+0.037113026 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 10:18:39 np0005541913.localdomain podman[335786]: 2025-12-02 10:18:39.979430945 +0000 UTC m=+0.136548840 container init 60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_wescoff, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 10:18:39 np0005541913.localdomain podman[335786]: 2025-12-02 10:18:39.98629305 +0000 UTC m=+0.143410935 container start 60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_wescoff, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, vcs-type=git, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., version=7, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 02 10:18:39 np0005541913.localdomain podman[335786]: 2025-12-02 10:18:39.986583367 +0000 UTC m=+0.143701292 container attach 60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_wescoff, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, GIT_CLEAN=True, release=1763362218, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64)
Dec 02 10:18:39 np0005541913.localdomain nifty_wescoff[335801]: 167 167
Dec 02 10:18:39 np0005541913.localdomain systemd[1]: libpod-60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215.scope: Deactivated successfully.
Dec 02 10:18:39 np0005541913.localdomain podman[335786]: 2025-12-02 10:18:39.992623699 +0000 UTC m=+0.149741584 container died 60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_wescoff, vendor=Red Hat, Inc., name=rhceph, version=7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, release=1763362218, RELEASE=main)
Dec 02 10:18:40 np0005541913.localdomain podman[335806]: 2025-12-02 10:18:40.097360485 +0000 UTC m=+0.094970366 container remove 60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_wescoff, version=7, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.41.4)
Dec 02 10:18:40 np0005541913.localdomain systemd[1]: libpod-conmon-60290afae581d8c2806cc1b5a9949ecd350d690dd7bd82b153af1b8d4d5c3215.scope: Deactivated successfully.
Dec 02 10:18:40 np0005541913.localdomain podman[335826]: 
Dec 02 10:18:40 np0005541913.localdomain podman[335826]: 2025-12-02 10:18:40.30871559 +0000 UTC m=+0.054185964 container create 8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_rosalind, architecture=x86_64, vendor=Red Hat, Inc., release=1763362218, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True)
Dec 02 10:18:40 np0005541913.localdomain systemd[1]: Started libpod-conmon-8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a.scope.
Dec 02 10:18:40 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:18:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40e11162b740091b70afc3f2008db766c0c671f0cee215768b2b325d296b5a12/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 02 10:18:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40e11162b740091b70afc3f2008db766c0c671f0cee215768b2b325d296b5a12/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 02 10:18:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40e11162b740091b70afc3f2008db766c0c671f0cee215768b2b325d296b5a12/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 02 10:18:40 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40e11162b740091b70afc3f2008db766c0c671f0cee215768b2b325d296b5a12/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 02 10:18:40 np0005541913.localdomain podman[335826]: 2025-12-02 10:18:40.371555373 +0000 UTC m=+0.117025727 container init 8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_rosalind, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_CLEAN=True, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7, ceph=True, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 02 10:18:40 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-090cac4657a350058074a7d962a1af2293d1abf09bc428da65503f358ddf5151-merged.mount: Deactivated successfully.
Dec 02 10:18:40 np0005541913.localdomain podman[335826]: 2025-12-02 10:18:40.382135367 +0000 UTC m=+0.127605731 container start 8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_rosalind, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., release=1763362218, io.openshift.tags=rhceph ceph, version=7, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4)
Dec 02 10:18:40 np0005541913.localdomain podman[335826]: 2025-12-02 10:18:40.283047091 +0000 UTC m=+0.028517465 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 02 10:18:40 np0005541913.localdomain podman[335826]: 2025-12-02 10:18:40.382354763 +0000 UTC m=+0.127825117 container attach 8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_rosalind, build-date=2025-11-26T19:44:28Z, architecture=x86_64, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=)
Dec 02 10:18:41 np0005541913.localdomain ceph-mon[298296]: pgmap v753: 177 pgs: 3 active+clean+snaptrim, 12 active+clean+snaptrim_wait, 162 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 59 KiB/s wr, 4 op/s
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]: [
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:     {
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         "available": false,
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         "ceph_device": false,
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         "lsm_data": {},
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         "lvs": [],
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         "path": "/dev/sr0",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         "rejected_reasons": [
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "Insufficient space (<5GB)",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "Has a FileSystem"
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         ],
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         "sys_api": {
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "actuators": null,
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "device_nodes": "sr0",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "human_readable_size": "482.00 KB",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "id_bus": "ata",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "model": "QEMU DVD-ROM",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "nr_requests": "2",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "partitions": {},
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "path": "/dev/sr0",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "removable": "1",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "rev": "2.5+",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "ro": "0",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "rotational": "1",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "sas_address": "",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "sas_device_handle": "",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "scheduler_mode": "mq-deadline",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "sectors": 0,
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "sectorsize": "2048",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "size": 493568.0,
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "support_discard": "0",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "type": "disk",
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:             "vendor": "QEMU"
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:         }
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]:     }
Dec 02 10:18:41 np0005541913.localdomain pedantic_rosalind[335840]: ]
Dec 02 10:18:41 np0005541913.localdomain systemd[1]: libpod-8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a.scope: Deactivated successfully.
Dec 02 10:18:41 np0005541913.localdomain podman[335826]: 2025-12-02 10:18:41.410764472 +0000 UTC m=+1.156234876 container died 8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_rosalind, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=)
Dec 02 10:18:41 np0005541913.localdomain systemd[1]: libpod-8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a.scope: Consumed 1.052s CPU time.
Dec 02 10:18:41 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-40e11162b740091b70afc3f2008db766c0c671f0cee215768b2b325d296b5a12-merged.mount: Deactivated successfully.
Dec 02 10:18:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:41 np0005541913.localdomain podman[337882]: 2025-12-02 10:18:41.498560714 +0000 UTC m=+0.077508777 container remove 8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_rosalind, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Dec 02 10:18:41 np0005541913.localdomain systemd[1]: libpod-conmon-8952ea8364804d5c266aba65cd0416f5ad97ea75ffd3cb624bf71cf8254d848a.scope: Deactivated successfully.
Dec 02 10:18:41 np0005541913.localdomain sudo[335727]: pam_unix(sudo:session): session closed for user root
Dec 02 10:18:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:41.598 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:41.600 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:41 np0005541913.localdomain sudo[337897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:18:41 np0005541913.localdomain sudo[337897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:18:41 np0005541913.localdomain sudo[337897]: pam_unix(sudo:session): session closed for user root
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: pgmap v754: 177 pgs: 3 active+clean+snaptrim, 12 active+clean+snaptrim_wait, 162 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 59 KiB/s wr, 4 op/s
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:18:42 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:18:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:43.558 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '0a:ed:9b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '6e:ce:d1:dc:83:80'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:18:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:43.559 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:43 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:43.560 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 02 10:18:43 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:43.971 263406 INFO neutron.agent.linux.ip_lib [None req-ab550fef-c5d8-4d2f-bcfb-6fe2f68b2dae - - - - - -] Device tap9140f735-04 cannot be used as it has no MAC address
Dec 02 10:18:43 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:43.990 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:43 np0005541913.localdomain kernel: device tap9140f735-04 entered promiscuous mode
Dec 02 10:18:43 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670723.9979] manager: (tap9140f735-04): new Generic device (/org/freedesktop/NetworkManager/Devices/94)
Dec 02 10:18:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:43Z|00612|binding|INFO|Claiming lport 9140f735-04c6-485f-9881-2f09b5b9f68f for this chassis.
Dec 02 10:18:43 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:43Z|00613|binding|INFO|9140f735-04c6-485f-9881-2f09b5b9f68f: Claiming unknown
Dec 02 10:18:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:44.003 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:44 np0005541913.localdomain systemd-udevd[337925]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:18:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:44.010 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-0cdd3535-8c4d-40c0-93c7-242e2392e8fd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0cdd3535-8c4d-40c0-93c7-242e2392e8fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4851cba82e304f60b99cf343fa7fcf33', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=138f5691-7516-45ed-9230-1f42c25749a2, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=9140f735-04c6-485f-9881-2f09b5b9f68f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:18:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:44.012 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 9140f735-04c6-485f-9881-2f09b5b9f68f in datapath 0cdd3535-8c4d-40c0-93c7-242e2392e8fd bound to our chassis
Dec 02 10:18:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:44.014 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port e185d5d5-dabb-4b86-9834-d2cce80c75a1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:18:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:44.014 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0cdd3535-8c4d-40c0-93c7-242e2392e8fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:18:44 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:44.015 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[74b48429-8c8a-4522-acd6-2456ab7aae20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:18:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9140f735-04: No such device
Dec 02 10:18:44 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:44Z|00614|binding|INFO|Setting lport 9140f735-04c6-485f-9881-2f09b5b9f68f ovn-installed in OVS
Dec 02 10:18:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9140f735-04: No such device
Dec 02 10:18:44 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:44Z|00615|binding|INFO|Setting lport 9140f735-04c6-485f-9881-2f09b5b9f68f up in Southbound
Dec 02 10:18:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:44.041 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9140f735-04: No such device
Dec 02 10:18:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9140f735-04: No such device
Dec 02 10:18:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9140f735-04: No such device
Dec 02 10:18:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9140f735-04: No such device
Dec 02 10:18:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9140f735-04: No such device
Dec 02 10:18:44 np0005541913.localdomain virtnodedevd[230136]: ethtool ioctl error on tap9140f735-04: No such device
Dec 02 10:18:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:44.083 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:44 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:44.116 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:44 np0005541913.localdomain ceph-mon[298296]: pgmap v755: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 40 KiB/s wr, 2 op/s
Dec 02 10:18:45 np0005541913.localdomain podman[337997]: 
Dec 02 10:18:45 np0005541913.localdomain podman[337997]: 2025-12-02 10:18:45.076757413 +0000 UTC m=+0.088332708 container create ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 02 10:18:45 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:18:45 np0005541913.localdomain systemd[1]: Started libpod-conmon-ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960.scope.
Dec 02 10:18:45 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:18:45 np0005541913.localdomain podman[337997]: 2025-12-02 10:18:45.033703309 +0000 UTC m=+0.045278614 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:18:45 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f9db6d3e7a07fe8276e0bd707c870a01daa8479ca21632c914dfb215bb86c31/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:18:45 np0005541913.localdomain podman[337997]: 2025-12-02 10:18:45.144058557 +0000 UTC m=+0.155633852 container init ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 02 10:18:45 np0005541913.localdomain podman[337997]: 2025-12-02 10:18:45.158271978 +0000 UTC m=+0.169847243 container start ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 02 10:18:45 np0005541913.localdomain dnsmasq[338026]: started, version 2.85 cachesize 150
Dec 02 10:18:45 np0005541913.localdomain dnsmasq[338026]: DNS service limited to local subnets
Dec 02 10:18:45 np0005541913.localdomain dnsmasq[338026]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:18:45 np0005541913.localdomain dnsmasq[338026]: warning: no upstream servers configured
Dec 02 10:18:45 np0005541913.localdomain dnsmasq-dhcp[338026]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:18:45 np0005541913.localdomain dnsmasq[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/addn_hosts - 0 addresses
Dec 02 10:18:45 np0005541913.localdomain dnsmasq-dhcp[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/host
Dec 02 10:18:45 np0005541913.localdomain dnsmasq-dhcp[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/opts
Dec 02 10:18:45 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:45.196 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:45 np0005541913.localdomain podman[338011]: 2025-12-02 10:18:45.20503137 +0000 UTC m=+0.091684177 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:18:45 np0005541913.localdomain podman[338011]: 2025-12-02 10:18:45.219062466 +0000 UTC m=+0.105715333 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd)
Dec 02 10:18:45 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:18:45 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:45.524 263406 INFO neutron.agent.dhcp.agent [None req-008bc975-25ef-4225-9ffa-d75471c11a31 - - - - - -] DHCP configuration for ports {'abfe3dd7-1bf6-404b-9299-6e689403d360'} is completed
Dec 02 10:18:46 np0005541913.localdomain systemd[1]: tmp-crun.L2Z3T9.mount: Deactivated successfully.
Dec 02 10:18:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:46 np0005541913.localdomain ceph-mon[298296]: pgmap v756: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 40 KiB/s wr, 2 op/s
Dec 02 10:18:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:46.627 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:46 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:46.850 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:18:46Z, description=, device_id=4ee86722-429a-4a47-a912-a41ad8c5f9ac, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00f10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908a00880>], id=6b7859fa-a1af-428d-a1af-07ce1483f5d6, ip_allocation=immediate, mac_address=fa:16:3e:07:2b:ea, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:18:40Z, description=, dns_domain=, id=0cdd3535-8c4d-40c0-93c7-242e2392e8fd, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-917464313-network, port_security_enabled=True, project_id=4851cba82e304f60b99cf343fa7fcf33, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13608, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3934, status=ACTIVE, subnets=['8bc1674a-6256-4eeb-bfe9-12248911570d'], tags=[], tenant_id=4851cba82e304f60b99cf343fa7fcf33, updated_at=2025-12-02T10:18:42Z, vlan_transparent=None, network_id=0cdd3535-8c4d-40c0-93c7-242e2392e8fd, port_security_enabled=False, project_id=4851cba82e304f60b99cf343fa7fcf33, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3952, status=DOWN, tags=[], tenant_id=4851cba82e304f60b99cf343fa7fcf33, updated_at=2025-12-02T10:18:46Z on network 0cdd3535-8c4d-40c0-93c7-242e2392e8fd
Dec 02 10:18:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 e286: 6 total, 6 up, 6 in
Dec 02 10:18:47 np0005541913.localdomain dnsmasq[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/addn_hosts - 1 addresses
Dec 02 10:18:47 np0005541913.localdomain podman[338052]: 2025-12-02 10:18:47.048789899 +0000 UTC m=+0.059426494 container kill ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 02 10:18:47 np0005541913.localdomain dnsmasq-dhcp[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/host
Dec 02 10:18:47 np0005541913.localdomain dnsmasq-dhcp[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/opts
Dec 02 10:18:47 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:47.603 263406 INFO neutron.agent.dhcp.agent [None req-3cabd811-42e4-4a49-b992-42a470152028 - - - - - -] DHCP configuration for ports {'6b7859fa-a1af-428d-a1af-07ce1483f5d6'} is completed
Dec 02 10:18:47 np0005541913.localdomain ceph-mon[298296]: osdmap e286: 6 total, 6 up, 6 in
Dec 02 10:18:48 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:48.164 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:48.333 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:18:46Z, description=, device_id=4ee86722-429a-4a47-a912-a41ad8c5f9ac, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908833e80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f99088338b0>], id=6b7859fa-a1af-428d-a1af-07ce1483f5d6, ip_allocation=immediate, mac_address=fa:16:3e:07:2b:ea, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:18:40Z, description=, dns_domain=, id=0cdd3535-8c4d-40c0-93c7-242e2392e8fd, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-917464313-network, port_security_enabled=True, project_id=4851cba82e304f60b99cf343fa7fcf33, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13608, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3934, status=ACTIVE, subnets=['8bc1674a-6256-4eeb-bfe9-12248911570d'], tags=[], tenant_id=4851cba82e304f60b99cf343fa7fcf33, updated_at=2025-12-02T10:18:42Z, vlan_transparent=None, network_id=0cdd3535-8c4d-40c0-93c7-242e2392e8fd, port_security_enabled=False, project_id=4851cba82e304f60b99cf343fa7fcf33, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3952, status=DOWN, tags=[], tenant_id=4851cba82e304f60b99cf343fa7fcf33, updated_at=2025-12-02T10:18:46Z on network 0cdd3535-8c4d-40c0-93c7-242e2392e8fd
Dec 02 10:18:48 np0005541913.localdomain systemd[1]: tmp-crun.Yqhsha.mount: Deactivated successfully.
Dec 02 10:18:48 np0005541913.localdomain dnsmasq[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/addn_hosts - 1 addresses
Dec 02 10:18:48 np0005541913.localdomain podman[338089]: 2025-12-02 10:18:48.548764375 +0000 UTC m=+0.066827462 container kill ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:18:48 np0005541913.localdomain dnsmasq-dhcp[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/host
Dec 02 10:18:48 np0005541913.localdomain dnsmasq-dhcp[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/opts
Dec 02 10:18:48 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:48.775 263406 INFO neutron.agent.dhcp.agent [None req-769510e8-c3f6-40b8-ae06-321b62bac232 - - - - - -] DHCP configuration for ports {'6b7859fa-a1af-428d-a1af-07ce1483f5d6'} is completed
Dec 02 10:18:48 np0005541913.localdomain ceph-mon[298296]: pgmap v758: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 219 B/s rd, 3.3 KiB/s wr, 1 op/s
Dec 02 10:18:49 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:49.563 160221 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cd2e60f3-a677-4ac1-88e4-9a23beb0fcdd, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 10:18:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:18:50 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:18:50 np0005541913.localdomain systemd[1]: tmp-crun.ud3qdb.mount: Deactivated successfully.
Dec 02 10:18:50 np0005541913.localdomain podman[338109]: 2025-12-02 10:18:50.450105738 +0000 UTC m=+0.092369786 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 10:18:50 np0005541913.localdomain podman[338110]: 2025-12-02 10:18:50.493293725 +0000 UTC m=+0.130206060 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 10:18:50 np0005541913.localdomain podman[338109]: 2025-12-02 10:18:50.515248294 +0000 UTC m=+0.157512322 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 10:18:50 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:18:50 np0005541913.localdomain podman[338110]: 2025-12-02 10:18:50.56324074 +0000 UTC m=+0.200153075 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:18:50 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:18:51 np0005541913.localdomain ceph-mon[298296]: pgmap v759: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.7 KiB/s wr, 0 op/s
Dec 02 10:18:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:51.630 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:52.255 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:53 np0005541913.localdomain ceph-mon[298296]: pgmap v760: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.7 KiB/s wr, 0 op/s
Dec 02 10:18:54 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:54.019 263406 INFO neutron.agent.linux.ip_lib [None req-480099a4-b06a-407c-b1d0-f50c3ff6b5ad - - - - - -] Device tap28b38d56-a5 cannot be used as it has no MAC address
Dec 02 10:18:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:54.045 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:54 np0005541913.localdomain kernel: device tap28b38d56-a5 entered promiscuous mode
Dec 02 10:18:54 np0005541913.localdomain NetworkManager[5965]: <info>  [1764670734.0540] manager: (tap28b38d56-a5): new Generic device (/org/freedesktop/NetworkManager/Devices/95)
Dec 02 10:18:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:54Z|00616|binding|INFO|Claiming lport 28b38d56-a5a9-424b-abbe-f0fde1476653 for this chassis.
Dec 02 10:18:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:54Z|00617|binding|INFO|28b38d56-a5a9-424b-abbe-f0fde1476653: Claiming unknown
Dec 02 10:18:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:54.055 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:54 np0005541913.localdomain systemd-udevd[338168]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 10:18:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:54.064 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6279c547e4d448d29e2a37d1c9d24474', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8508df3b-5b28-4906-91b4-ce2a2aa7f0ab, chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=28b38d56-a5a9-424b-abbe-f0fde1476653) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:18:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:54.066 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 28b38d56-a5a9-424b-abbe-f0fde1476653 in datapath fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3 bound to our chassis
Dec 02 10:18:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:54.068 160221 DEBUG neutron.agent.ovn.metadata.agent [-] Port c89bb200-796c-4c7a-b886-ca81a9ccf118 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 02 10:18:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:54.071 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:18:54 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:18:54.073 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2fe1f7-4e27-4aac-8ae4-ace9576d140c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:18:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:54Z|00618|binding|INFO|Setting lport 28b38d56-a5a9-424b-abbe-f0fde1476653 ovn-installed in OVS
Dec 02 10:18:54 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:54Z|00619|binding|INFO|Setting lport 28b38d56-a5a9-424b-abbe-f0fde1476653 up in Southbound
Dec 02 10:18:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:54.080 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:54.106 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:54.145 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:54 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:54.182 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:55 np0005541913.localdomain podman[338224]: 
Dec 02 10:18:55 np0005541913.localdomain podman[338224]: 2025-12-02 10:18:55.088716603 +0000 UTC m=+0.091653548 container create b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 02 10:18:55 np0005541913.localdomain ceph-mon[298296]: pgmap v761: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:18:55 np0005541913.localdomain systemd[1]: Started libpod-conmon-b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a.scope.
Dec 02 10:18:55 np0005541913.localdomain podman[338224]: 2025-12-02 10:18:55.044846207 +0000 UTC m=+0.047783162 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 02 10:18:55 np0005541913.localdomain systemd[1]: tmp-crun.2UdlBg.mount: Deactivated successfully.
Dec 02 10:18:55 np0005541913.localdomain systemd[1]: Started libcrun container.
Dec 02 10:18:55 np0005541913.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/99b42545da9579ee6141692819c2092ba10b87d18c3395d989ce008cca75dda0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 10:18:55 np0005541913.localdomain podman[338224]: 2025-12-02 10:18:55.190292984 +0000 UTC m=+0.193229949 container init b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:18:55 np0005541913.localdomain podman[338224]: 2025-12-02 10:18:55.203272462 +0000 UTC m=+0.206209417 container start b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 10:18:55 np0005541913.localdomain dnsmasq[338242]: started, version 2.85 cachesize 150
Dec 02 10:18:55 np0005541913.localdomain dnsmasq[338242]: DNS service limited to local subnets
Dec 02 10:18:55 np0005541913.localdomain dnsmasq[338242]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 02 10:18:55 np0005541913.localdomain dnsmasq[338242]: warning: no upstream servers configured
Dec 02 10:18:55 np0005541913.localdomain dnsmasq-dhcp[338242]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 02 10:18:55 np0005541913.localdomain dnsmasq[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/addn_hosts - 0 addresses
Dec 02 10:18:55 np0005541913.localdomain dnsmasq-dhcp[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/host
Dec 02 10:18:55 np0005541913.localdomain dnsmasq-dhcp[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/opts
Dec 02 10:18:55 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:55.514 263406 INFO neutron.agent.dhcp.agent [None req-d6ab4103-db52-4cd5-ab0e-c9cbd42626e1 - - - - - -] DHCP configuration for ports {'e4cd5106-3442-4e72-80e4-d3955ed089d9'} is completed
Dec 02 10:18:55 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:55.902 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:56 np0005541913.localdomain systemd[1]: tmp-crun.YxELvL.mount: Deactivated successfully.
Dec 02 10:18:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:18:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:56.632 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:57 np0005541913.localdomain ceph-mon[298296]: pgmap v762: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:18:57 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:57.172 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:18:56Z, description=, device_id=146339f0-3e49-419f-a49c-241664c75695, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990881faf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f990881f400>], id=426df494-1891-4082-a33e-37b61ae617b3, ip_allocation=immediate, mac_address=fa:16:3e:70:cc:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:18:51Z, description=, dns_domain=, id=fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1848591313-network, port_security_enabled=True, project_id=6279c547e4d448d29e2a37d1c9d24474, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13957, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3960, status=ACTIVE, subnets=['b5ae3d95-a8a3-4b1f-84fe-fbd5b83bc30c'], tags=[], tenant_id=6279c547e4d448d29e2a37d1c9d24474, updated_at=2025-12-02T10:18:52Z, vlan_transparent=None, network_id=fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, port_security_enabled=False, project_id=6279c547e4d448d29e2a37d1c9d24474, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3968, status=DOWN, tags=[], tenant_id=6279c547e4d448d29e2a37d1c9d24474, updated_at=2025-12-02T10:18:56Z on network fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3
Dec 02 10:18:57 np0005541913.localdomain dnsmasq[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/addn_hosts - 1 addresses
Dec 02 10:18:57 np0005541913.localdomain dnsmasq-dhcp[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/host
Dec 02 10:18:57 np0005541913.localdomain dnsmasq-dhcp[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/opts
Dec 02 10:18:57 np0005541913.localdomain podman[338260]: 2025-12-02 10:18:57.36569564 +0000 UTC m=+0.044661298 container kill b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:18:57 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:57.662 263406 INFO neutron.agent.dhcp.agent [None req-4aa9b142-ab2f-4653-bd32-338fd27a0360 - - - - - -] DHCP configuration for ports {'426df494-1891-4082-a33e-37b61ae617b3'} is completed
Dec 02 10:18:57 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:18:57Z|00620|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:18:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:18:57.773 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:18:58 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:58.743 263406 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-02T10:18:56Z, description=, device_id=146339f0-3e49-419f-a49c-241664c75695, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908870850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f9908870070>], id=426df494-1891-4082-a33e-37b61ae617b3, ip_allocation=immediate, mac_address=fa:16:3e:70:cc:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-02T10:18:51Z, description=, dns_domain=, id=fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1848591313-network, port_security_enabled=True, project_id=6279c547e4d448d29e2a37d1c9d24474, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13957, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3960, status=ACTIVE, subnets=['b5ae3d95-a8a3-4b1f-84fe-fbd5b83bc30c'], tags=[], tenant_id=6279c547e4d448d29e2a37d1c9d24474, updated_at=2025-12-02T10:18:52Z, vlan_transparent=None, network_id=fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, port_security_enabled=False, project_id=6279c547e4d448d29e2a37d1c9d24474, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3968, status=DOWN, tags=[], tenant_id=6279c547e4d448d29e2a37d1c9d24474, updated_at=2025-12-02T10:18:56Z on network fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3
Dec 02 10:18:58 np0005541913.localdomain dnsmasq[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/addn_hosts - 1 addresses
Dec 02 10:18:58 np0005541913.localdomain dnsmasq-dhcp[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/host
Dec 02 10:18:58 np0005541913.localdomain podman[338297]: 2025-12-02 10:18:58.982126797 +0000 UTC m=+0.059231178 container kill b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:18:58 np0005541913.localdomain dnsmasq-dhcp[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/opts
Dec 02 10:18:59 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:18:59.261 263406 INFO neutron.agent.dhcp.agent [None req-add556c9-0271-44b2-ab4d-946b2f4e74f3 - - - - - -] DHCP configuration for ports {'426df494-1891-4082-a33e-37b61ae617b3'} is completed
Dec 02 10:18:59 np0005541913.localdomain ceph-mon[298296]: pgmap v763: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:00 np0005541913.localdomain ceph-mon[298296]: pgmap v764: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:01.683 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:01 np0005541913.localdomain dnsmasq[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/addn_hosts - 0 addresses
Dec 02 10:19:01 np0005541913.localdomain dnsmasq-dhcp[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/host
Dec 02 10:19:01 np0005541913.localdomain podman[338334]: 2025-12-02 10:19:01.857666096 +0000 UTC m=+0.037897367 container kill ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 10:19:01 np0005541913.localdomain dnsmasq-dhcp[338026]: read /var/lib/neutron/dhcp/0cdd3535-8c4d-40c0-93c7-242e2392e8fd/opts
Dec 02 10:19:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:19:02Z|00621|binding|INFO|Releasing lport 9140f735-04c6-485f-9881-2f09b5b9f68f from this chassis (sb_readonly=0)
Dec 02 10:19:02 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:19:02Z|00622|binding|INFO|Setting lport 9140f735-04c6-485f-9881-2f09b5b9f68f down in Southbound
Dec 02 10:19:02 np0005541913.localdomain kernel: device tap9140f735-04 left promiscuous mode
Dec 02 10:19:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:02.004 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:02.013 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-0cdd3535-8c4d-40c0-93c7-242e2392e8fd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0cdd3535-8c4d-40c0-93c7-242e2392e8fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4851cba82e304f60b99cf343fa7fcf33', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=138f5691-7516-45ed-9230-1f42c25749a2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=9140f735-04c6-485f-9881-2f09b5b9f68f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:19:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:02.015 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 9140f735-04c6-485f-9881-2f09b5b9f68f in datapath 0cdd3535-8c4d-40c0-93c7-242e2392e8fd unbound from our chassis
Dec 02 10:19:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:02.017 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0cdd3535-8c4d-40c0-93c7-242e2392e8fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:19:02 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:02.018 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[341e0de9-de03-4b36-b03d-8e929dd9a3ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:19:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:02.035 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:02 np0005541913.localdomain ceph-mon[298296]: pgmap v765: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:03.062 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:19:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:03.062 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:19:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:03.063 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:19:03 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:19:03Z|00623|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:19:03 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:03.916 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:19:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:19:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:19:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:19:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:19:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:19:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:19:04 np0005541913.localdomain dnsmasq[338026]: exiting on receipt of SIGTERM
Dec 02 10:19:04 np0005541913.localdomain systemd[1]: libpod-ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960.scope: Deactivated successfully.
Dec 02 10:19:04 np0005541913.localdomain podman[338374]: 2025-12-02 10:19:04.307713231 +0000 UTC m=+0.055716853 container kill ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:19:04 np0005541913.localdomain podman[338388]: 2025-12-02 10:19:04.388852676 +0000 UTC m=+0.067422508 container died ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:19:04 np0005541913.localdomain systemd[1]: tmp-crun.I5tcTS.mount: Deactivated successfully.
Dec 02 10:19:04 np0005541913.localdomain podman[338388]: 2025-12-02 10:19:04.434810388 +0000 UTC m=+0.113380190 container cleanup ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 10:19:04 np0005541913.localdomain systemd[1]: libpod-conmon-ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960.scope: Deactivated successfully.
Dec 02 10:19:04 np0005541913.localdomain podman[338390]: 2025-12-02 10:19:04.516808475 +0000 UTC m=+0.186409086 container remove ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cdd3535-8c4d-40c0-93c7-242e2392e8fd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 02 10:19:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:19:04.538 263406 INFO neutron.agent.dhcp.agent [None req-f176ccad-6058-4814-876f-c7ca3d01f249 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:19:04 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:19:04.644 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:19:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:04.837 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:04.838 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:04.839 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:19:05 np0005541913.localdomain ceph-mon[298296]: pgmap v766: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2544020483' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:19:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/2544020483' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:19:05 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:19:05 np0005541913.localdomain podman[338416]: 2025-12-02 10:19:05.182680919 +0000 UTC m=+0.070786408 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 02 10:19:05 np0005541913.localdomain podman[338416]: 2025-12-02 10:19:05.193968691 +0000 UTC m=+0.082074160 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:19:05 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:19:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-8f9db6d3e7a07fe8276e0bd707c870a01daa8479ca21632c914dfb215bb86c31-merged.mount: Deactivated successfully.
Dec 02 10:19:05 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab0b2512ad46534a946df6c0a9cadbaf00d75a289804d3b3815d3b3b0f2c9960-userdata-shm.mount: Deactivated successfully.
Dec 02 10:19:05 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2d0cdd3535\x2d8c4d\x2d40c0\x2d93c7\x2d242e2392e8fd.mount: Deactivated successfully.
Dec 02 10:19:05 np0005541913.localdomain dnsmasq[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/addn_hosts - 0 addresses
Dec 02 10:19:05 np0005541913.localdomain dnsmasq-dhcp[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/host
Dec 02 10:19:05 np0005541913.localdomain dnsmasq-dhcp[338242]: read /var/lib/neutron/dhcp/fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3/opts
Dec 02 10:19:05 np0005541913.localdomain podman[338452]: 2025-12-02 10:19:05.376721038 +0000 UTC m=+0.064888799 container kill b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 02 10:19:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:19:05Z|00624|binding|INFO|Releasing lport 28b38d56-a5a9-424b-abbe-f0fde1476653 from this chassis (sb_readonly=0)
Dec 02 10:19:05 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:19:05Z|00625|binding|INFO|Setting lport 28b38d56-a5a9-424b-abbe-f0fde1476653 down in Southbound
Dec 02 10:19:05 np0005541913.localdomain kernel: device tap28b38d56-a5 left promiscuous mode
Dec 02 10:19:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:05.847 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:05.855 160221 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005541913.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8b822d43-1849-5d2e-aa2f-5f185b27d539-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6279c547e4d448d29e2a37d1c9d24474', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005541913.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8508df3b-5b28-4906-91b4-ce2a2aa7f0ab, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>], logical_port=28b38d56-a5a9-424b-abbe-f0fde1476653) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f40ecfcc6d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 02 10:19:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:05.857 160221 INFO neutron.agent.ovn.metadata.agent [-] Port 28b38d56-a5a9-424b-abbe-f0fde1476653 in datapath fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3 unbound from our chassis
Dec 02 10:19:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:05.859 160221 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 02 10:19:05 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:19:05.860 160340 DEBUG oslo.privsep.daemon [-] privsep: reply[539a82a6-88e8-4b37-95df-dcc9cb41645a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 02 10:19:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:05.863 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:19:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:19:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:19:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156107 "" "Go-http-client/1.1"
Dec 02 10:19:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:19:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19256 "" "Go-http-client/1.1"
Dec 02 10:19:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.682 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.688 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.842 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.842 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.842 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.842 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.842 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:19:06 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:19:06Z|00626|binding|INFO|Releasing lport d6e7da3f-8574-49e0-8ba1-2f642b3cec92 from this chassis (sb_readonly=0)
Dec 02 10:19:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:06.951 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:07 np0005541913.localdomain ceph-mon[298296]: pgmap v767: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:19:07 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3253900841' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.287 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:19:07 np0005541913.localdomain dnsmasq[338242]: exiting on receipt of SIGTERM
Dec 02 10:19:07 np0005541913.localdomain podman[338512]: 2025-12-02 10:19:07.342733894 +0000 UTC m=+0.063612756 container kill b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 02 10:19:07 np0005541913.localdomain systemd[1]: libpod-b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a.scope: Deactivated successfully.
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.351 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.352 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:19:07 np0005541913.localdomain podman[338530]: 2025-12-02 10:19:07.410309654 +0000 UTC m=+0.051314725 container died b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:19:07 np0005541913.localdomain systemd[1]: tmp-crun.xS7qRZ.mount: Deactivated successfully.
Dec 02 10:19:07 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a-userdata-shm.mount: Deactivated successfully.
Dec 02 10:19:07 np0005541913.localdomain podman[338530]: 2025-12-02 10:19:07.454753536 +0000 UTC m=+0.095758537 container remove b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd87b2bf-bab3-461e-aa7f-9d5a4bfb5ae3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 10:19:07 np0005541913.localdomain systemd[1]: libpod-conmon-b7bbc8332b03035553a1b35fea11ac5e5df59c880963be6b1ed6c2a308de237a.scope: Deactivated successfully.
Dec 02 10:19:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:19:07.496 263406 INFO neutron.agent.dhcp.agent [None req-e34c15f8-58b0-44d6-aa04-fdcf0504adc4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.530 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.533 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11023MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.533 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.534 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:19:07 np0005541913.localdomain neutron_dhcp_agent[263402]: 2025-12-02 10:19:07.629 263406 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.714 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.715 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.716 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.730 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing inventories for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.748 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating ProviderTree inventory for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.749 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Updating inventory in ProviderTree for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.760 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing aggregate associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.781 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Refreshing trait associations for resource provider c79215b2-6762-4f7f-a322-f44db2b0b9bd, traits: COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_ABM,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 02 10:19:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:07.819 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:19:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3253900841' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:19:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:19:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4109271996' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:19:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:08.247 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:19:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:08.254 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:19:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:08.269 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:19:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:08.272 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:19:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:08.273 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:19:08 np0005541913.localdomain systemd[1]: var-lib-containers-storage-overlay-99b42545da9579ee6141692819c2092ba10b87d18c3395d989ce008cca75dda0-merged.mount: Deactivated successfully.
Dec 02 10:19:08 np0005541913.localdomain systemd[1]: run-netns-qdhcp\x2dfd87b2bf\x2dbab3\x2d461e\x2daa7f\x2d9d5a4bfb5ae3.mount: Deactivated successfully.
Dec 02 10:19:09 np0005541913.localdomain ceph-mon[298296]: pgmap v768: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/4109271996' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:19:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:19:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:19:09 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:19:09 np0005541913.localdomain podman[338582]: 2025-12-02 10:19:09.210033333 +0000 UTC m=+0.115794414 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:19:09 np0005541913.localdomain podman[338582]: 2025-12-02 10:19:09.218420038 +0000 UTC m=+0.124181119 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:19:09 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:19:09 np0005541913.localdomain podman[338577]: 2025-12-02 10:19:09.182998909 +0000 UTC m=+0.089527590 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Dec 02 10:19:09 np0005541913.localdomain podman[338576]: 2025-12-02 10:19:09.161519443 +0000 UTC m=+0.075663688 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 02 10:19:09 np0005541913.localdomain podman[338577]: 2025-12-02 10:19:09.267160205 +0000 UTC m=+0.173688836 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350)
Dec 02 10:19:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:09.274 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:09.275 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:19:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:09.275 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:19:09 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:19:09 np0005541913.localdomain podman[338576]: 2025-12-02 10:19:09.295191216 +0000 UTC m=+0.209335431 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:19:09 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:19:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:10.379 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:19:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:10.380 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:19:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:10.381 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:19:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:10.382 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:19:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:10.983 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:19:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:10.998 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:19:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:10.999 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:19:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:11.000 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:11 np0005541913.localdomain ceph-mon[298296]: pgmap v769: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:11.684 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:11.690 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:11.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:13 np0005541913.localdomain ceph-mon[298296]: pgmap v770: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:15 np0005541913.localdomain ceph-mon[298296]: pgmap v771: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:15 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:19:15 np0005541913.localdomain podman[338636]: 2025-12-02 10:19:15.443402895 +0000 UTC m=+0.083492839 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 02 10:19:15 np0005541913.localdomain podman[338636]: 2025-12-02 10:19:15.456035773 +0000 UTC m=+0.096125697 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Dec 02 10:19:15 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:19:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:15.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3458857762' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:19:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:16.687 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:16.692 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:17 np0005541913.localdomain ceph-mon[298296]: pgmap v772: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3626538401' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:19:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:17.823 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:18 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:18.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:19:19 np0005541913.localdomain ceph-mon[298296]: pgmap v773: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:21 np0005541913.localdomain ceph-mon[298296]: pgmap v774: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:19:21 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:19:21 np0005541913.localdomain systemd[1]: tmp-crun.r06tR2.mount: Deactivated successfully.
Dec 02 10:19:21 np0005541913.localdomain podman[338655]: 2025-12-02 10:19:21.447211293 +0000 UTC m=+0.089819658 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:19:21 np0005541913.localdomain podman[338655]: 2025-12-02 10:19:21.45567284 +0000 UTC m=+0.098281185 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:19:21 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:19:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:21 np0005541913.localdomain podman[338656]: 2025-12-02 10:19:21.546691378 +0000 UTC m=+0.185597824 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:19:21 np0005541913.localdomain podman[338656]: 2025-12-02 10:19:21.608037992 +0000 UTC m=+0.246944448 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 02 10:19:21 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:19:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:21.690 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:23 np0005541913.localdomain ceph-mon[298296]: pgmap v775: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:25 np0005541913.localdomain ceph-mon[298296]: pgmap v776: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:26 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2958011048' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:19:26 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2436311479' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:19:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:26.696 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:26.698 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:26.698 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:19:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:26.698 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:19:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:26.719 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:26.719 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:19:27 np0005541913.localdomain ceph-mon[298296]: pgmap v777: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:28 np0005541913.localdomain ceph-mon[298296]: pgmap v778: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:30 np0005541913.localdomain ceph-mon[298296]: pgmap v779: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:31.720 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:31.722 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:31.723 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:19:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:31.723 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:19:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:31.761 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:31.762 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:19:33 np0005541913.localdomain ceph-mon[298296]: pgmap v780: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:19:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:19:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:19:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:19:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:19:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:19:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:19:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:19:35 np0005541913.localdomain ceph-mon[298296]: pgmap v781: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:35 np0005541913.localdomain sshd[338703]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:19:35 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:19:35 np0005541913.localdomain sshd[338703]: Accepted publickey for zuul from 38.102.83.114 port 37936 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:19:35 np0005541913.localdomain systemd-logind[757]: New session 75 of user zuul.
Dec 02 10:19:35 np0005541913.localdomain systemd[1]: Started Session 75 of User zuul.
Dec 02 10:19:35 np0005541913.localdomain sshd[338703]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:19:35 np0005541913.localdomain podman[338705]: 2025-12-02 10:19:35.338784717 +0000 UTC m=+0.086276973 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 10:19:35 np0005541913.localdomain podman[338705]: 2025-12-02 10:19:35.379902058 +0000 UTC m=+0.127394294 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 10:19:35 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:19:35 np0005541913.localdomain sudo[338741]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czpvtinpifwbfezyrkemlnrljopfjnpd ; /usr/bin/python3
Dec 02 10:19:35 np0005541913.localdomain sudo[338741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:19:35 np0005541913.localdomain python3[338743]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister
                                                           _uses_shell=True zuul_log_id=fa163e3b-3c83-cf35-eb71-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 10:19:35 np0005541913.localdomain sudo[338741]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:19:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:19:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:19:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:19:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:19:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18789 "" "Go-http-client/1.1"
Dec 02 10:19:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:36.763 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:36.797 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:36.798 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:19:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:36.798 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:19:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:36.800 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:36.802 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: pgmap v782: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.070946) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670777070977, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1478, "num_deletes": 253, "total_data_size": 2573017, "memory_usage": 2646480, "flush_reason": "Manual Compaction"}
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670777079486, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 1691608, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38705, "largest_seqno": 40178, "table_properties": {"data_size": 1685803, "index_size": 3083, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13812, "raw_average_key_size": 21, "raw_value_size": 1673636, "raw_average_value_size": 2570, "num_data_blocks": 131, "num_entries": 651, "num_filter_entries": 651, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764670681, "oldest_key_time": 1764670681, "file_creation_time": 1764670777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 8564 microseconds, and 2632 cpu microseconds.
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.079510) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 1691608 bytes OK
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.079523) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.081204) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.081214) EVENT_LOG_v1 {"time_micros": 1764670777081211, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.081225) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 2565946, prev total WAL file size 2565946, number of live WAL files 2.
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.081712) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133333033' seq:72057594037927935, type:22 .. '7061786F73003133353535' seq:0, type:0; will stop at (end)
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(1651KB)], [66(17MB)]
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670777081761, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 20383086, "oldest_snapshot_seqno": -1}
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14676 keys, 19065069 bytes, temperature: kUnknown
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670777186691, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 19065069, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18980124, "index_size": 47148, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36741, "raw_key_size": 393571, "raw_average_key_size": 26, "raw_value_size": 18729844, "raw_average_value_size": 1276, "num_data_blocks": 1761, "num_entries": 14676, "num_filter_entries": 14676, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764669385, "oldest_key_time": 0, "file_creation_time": 1764670777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2b5a5119-a77e-4ac2-8a7c-136bbfa56c89", "db_session_id": "7NRXCK2K9UGWEPQBYWTV", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.187131) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 19065069 bytes
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.189276) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.0 rd, 181.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 17.8 +0.0 blob) out(18.2 +0.0 blob), read-write-amplify(23.3) write-amplify(11.3) OK, records in: 15212, records dropped: 536 output_compression: NoCompression
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.189314) EVENT_LOG_v1 {"time_micros": 1764670777189297, "job": 40, "event": "compaction_finished", "compaction_time_micros": 105071, "compaction_time_cpu_micros": 48891, "output_level": 6, "num_output_files": 1, "total_output_size": 19065069, "num_input_records": 15212, "num_output_records": 14676, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670777190012, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005541913/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764670777193769, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.081595) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.193968) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.193977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.193980) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.193983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:19:37 np0005541913.localdomain ceph-mon[298296]: rocksdb: (Original Log Time 2025/12/02-10:19:37.193986) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 02 10:19:37 np0005541913.localdomain ovn_controller[154505]: 2025-12-02T10:19:37Z|00627|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Dec 02 10:19:39 np0005541913.localdomain ceph-mon[298296]: pgmap v783: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:39 np0005541913.localdomain ceph-mon[298296]: mgrmap e54: np0005541914.lljzmk(active, since 19m), standbys: np0005541912.qwddia, np0005541913.mfesdm
Dec 02 10:19:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:19:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:19:39 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:19:39 np0005541913.localdomain systemd[1]: tmp-crun.DTBkGi.mount: Deactivated successfully.
Dec 02 10:19:39 np0005541913.localdomain podman[338746]: 2025-12-02 10:19:39.453871882 +0000 UTC m=+0.087470625 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:19:39 np0005541913.localdomain systemd[1]: tmp-crun.zPA4sR.mount: Deactivated successfully.
Dec 02 10:19:39 np0005541913.localdomain podman[338747]: 2025-12-02 10:19:39.508226459 +0000 UTC m=+0.138644387 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 10:19:39 np0005541913.localdomain podman[338746]: 2025-12-02 10:19:39.537659677 +0000 UTC m=+0.171258360 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 02 10:19:39 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:19:39 np0005541913.localdomain podman[338748]: 2025-12-02 10:19:39.551861548 +0000 UTC m=+0.178027611 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:19:39 np0005541913.localdomain podman[338748]: 2025-12-02 10:19:39.559883123 +0000 UTC m=+0.186049266 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 02 10:19:39 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:19:39 np0005541913.localdomain podman[338747]: 2025-12-02 10:19:39.593452882 +0000 UTC m=+0.223870820 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, config_id=edpm, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Dec 02 10:19:39 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:19:41 np0005541913.localdomain ceph-mon[298296]: pgmap v784: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 02 10:19:41 np0005541913.localdomain sshd[338703]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:19:41 np0005541913.localdomain systemd[1]: session-75.scope: Deactivated successfully.
Dec 02 10:19:41 np0005541913.localdomain systemd-logind[757]: Session 75 logged out. Waiting for processes to exit.
Dec 02 10:19:41 np0005541913.localdomain systemd-logind[757]: Removed session 75.
Dec 02 10:19:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:41.800 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:41.802 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:42 np0005541913.localdomain sudo[338805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:19:42 np0005541913.localdomain sudo[338805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:19:42 np0005541913.localdomain sudo[338805]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:42 np0005541913.localdomain sudo[338823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:19:42 np0005541913.localdomain sudo[338823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:19:42 np0005541913.localdomain sudo[338823]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:43 np0005541913.localdomain sudo[338873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:19:43 np0005541913.localdomain sudo[338873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:19:43 np0005541913.localdomain sudo[338873]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:43 np0005541913.localdomain ceph-mon[298296]: pgmap v785: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 02 10:19:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:19:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:19:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:19:43 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:19:45 np0005541913.localdomain ceph-mon[298296]: pgmap v786: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 02 10:19:46 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:19:46 np0005541913.localdomain podman[338891]: 2025-12-02 10:19:46.450523647 +0000 UTC m=+0.092617963 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 02 10:19:46 np0005541913.localdomain podman[338891]: 2025-12-02 10:19:46.466042323 +0000 UTC m=+0.108136709 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:19:46 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:19:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:46.802 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:46.803 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:46.803 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:19:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:46.804 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:19:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:46.804 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:46 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:46.805 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:19:47 np0005541913.localdomain ceph-mon[298296]: pgmap v787: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 02 10:19:48 np0005541913.localdomain ceph-mon[298296]: pgmap v788: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 02 10:19:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:19:50 np0005541913.localdomain ceph-mon[298296]: pgmap v789: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 02 10:19:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:51.806 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:51 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:51.808 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:19:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:19:52 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:19:52 np0005541913.localdomain podman[338911]: 2025-12-02 10:19:52.445829577 +0000 UTC m=+0.081709750 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:19:52 np0005541913.localdomain podman[338911]: 2025-12-02 10:19:52.485953062 +0000 UTC m=+0.121833185 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 10:19:52 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:19:52 np0005541913.localdomain podman[338910]: 2025-12-02 10:19:52.502429704 +0000 UTC m=+0.140067084 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:19:52 np0005541913.localdomain podman[338910]: 2025-12-02 10:19:52.538042269 +0000 UTC m=+0.175679619 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 10:19:52 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:19:53 np0005541913.localdomain ceph-mon[298296]: pgmap v790: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Dec 02 10:19:54 np0005541913.localdomain sshd[338960]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:19:54 np0005541913.localdomain sshd[338960]: Accepted publickey for zuul from 38.102.83.114 port 51868 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:19:54 np0005541913.localdomain systemd-logind[757]: New session 76 of user zuul.
Dec 02 10:19:54 np0005541913.localdomain systemd[1]: Started Session 76 of User zuul.
Dec 02 10:19:54 np0005541913.localdomain sshd[338960]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:19:54 np0005541913.localdomain sudo[338964]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Dec 02 10:19:54 np0005541913.localdomain sudo[338964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:19:55 np0005541913.localdomain ceph-mon[298296]: pgmap v791: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Dec 02 10:19:55 np0005541913.localdomain sudo[338964]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:55 np0005541913.localdomain sshd[338963]: Received disconnect from 38.102.83.114 port 51868:11: disconnected by user
Dec 02 10:19:55 np0005541913.localdomain sshd[338963]: Disconnected from user zuul 38.102.83.114 port 51868
Dec 02 10:19:55 np0005541913.localdomain sshd[338960]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:19:55 np0005541913.localdomain systemd[1]: session-76.scope: Deactivated successfully.
Dec 02 10:19:55 np0005541913.localdomain systemd-logind[757]: Session 76 logged out. Waiting for processes to exit.
Dec 02 10:19:55 np0005541913.localdomain systemd-logind[757]: Removed session 76.
Dec 02 10:19:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:19:56 np0005541913.localdomain sshd[338982]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:19:56 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:19:56.810 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:19:56 np0005541913.localdomain sshd[338982]: Accepted publickey for zuul from 38.102.83.114 port 51880 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:19:56 np0005541913.localdomain systemd-logind[757]: New session 77 of user zuul.
Dec 02 10:19:56 np0005541913.localdomain systemd[1]: Started Session 77 of User zuul.
Dec 02 10:19:56 np0005541913.localdomain sshd[338982]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:19:56 np0005541913.localdomain sudo[338986]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Dec 02 10:19:56 np0005541913.localdomain sudo[338986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:19:56 np0005541913.localdomain sudo[338986]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:56 np0005541913.localdomain sshd[338985]: Received disconnect from 38.102.83.114 port 51880:11: disconnected by user
Dec 02 10:19:56 np0005541913.localdomain sshd[338985]: Disconnected from user zuul 38.102.83.114 port 51880
Dec 02 10:19:56 np0005541913.localdomain sshd[338982]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:19:56 np0005541913.localdomain systemd[1]: session-77.scope: Deactivated successfully.
Dec 02 10:19:56 np0005541913.localdomain systemd-logind[757]: Session 77 logged out. Waiting for processes to exit.
Dec 02 10:19:56 np0005541913.localdomain systemd-logind[757]: Removed session 77.
Dec 02 10:19:57 np0005541913.localdomain ceph-mon[298296]: pgmap v792: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:57 np0005541913.localdomain sshd[339004]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:19:57 np0005541913.localdomain sshd[339004]: Accepted publickey for zuul from 38.102.83.114 port 51890 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:19:57 np0005541913.localdomain systemd-logind[757]: New session 78 of user zuul.
Dec 02 10:19:57 np0005541913.localdomain systemd[1]: Started Session 78 of User zuul.
Dec 02 10:19:57 np0005541913.localdomain sshd[339004]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:19:57 np0005541913.localdomain sudo[339008]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Dec 02 10:19:57 np0005541913.localdomain sudo[339008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:19:57 np0005541913.localdomain sudo[339008]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:57 np0005541913.localdomain sshd[339007]: Received disconnect from 38.102.83.114 port 51890:11: disconnected by user
Dec 02 10:19:57 np0005541913.localdomain sshd[339007]: Disconnected from user zuul 38.102.83.114 port 51890
Dec 02 10:19:57 np0005541913.localdomain sshd[339004]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:19:57 np0005541913.localdomain systemd[1]: session-78.scope: Deactivated successfully.
Dec 02 10:19:57 np0005541913.localdomain systemd-logind[757]: Session 78 logged out. Waiting for processes to exit.
Dec 02 10:19:57 np0005541913.localdomain systemd-logind[757]: Removed session 78.
Dec 02 10:19:58 np0005541913.localdomain sshd[339026]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:19:58 np0005541913.localdomain sshd[339026]: Accepted publickey for zuul from 38.102.83.114 port 51894 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:19:58 np0005541913.localdomain systemd-logind[757]: New session 79 of user zuul.
Dec 02 10:19:58 np0005541913.localdomain systemd[1]: Started Session 79 of User zuul.
Dec 02 10:19:58 np0005541913.localdomain sshd[339026]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:19:58 np0005541913.localdomain sudo[339030]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Dec 02 10:19:58 np0005541913.localdomain sudo[339030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:19:58 np0005541913.localdomain sudo[339030]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:58 np0005541913.localdomain sshd[339029]: Received disconnect from 38.102.83.114 port 51894:11: disconnected by user
Dec 02 10:19:58 np0005541913.localdomain sshd[339029]: Disconnected from user zuul 38.102.83.114 port 51894
Dec 02 10:19:58 np0005541913.localdomain sshd[339026]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:19:58 np0005541913.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Dec 02 10:19:58 np0005541913.localdomain systemd-logind[757]: Session 79 logged out. Waiting for processes to exit.
Dec 02 10:19:58 np0005541913.localdomain systemd-logind[757]: Removed session 79.
Dec 02 10:19:58 np0005541913.localdomain sshd[339048]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:19:59 np0005541913.localdomain sshd[339048]: Accepted publickey for zuul from 38.102.83.114 port 51908 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:19:59 np0005541913.localdomain systemd-logind[757]: New session 80 of user zuul.
Dec 02 10:19:59 np0005541913.localdomain systemd[1]: Started Session 80 of User zuul.
Dec 02 10:19:59 np0005541913.localdomain sshd[339048]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:19:59 np0005541913.localdomain ceph-mon[298296]: pgmap v793: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:19:59 np0005541913.localdomain sudo[339052]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Dec 02 10:19:59 np0005541913.localdomain sudo[339052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:19:59 np0005541913.localdomain sudo[339052]: pam_unix(sudo:session): session closed for user root
Dec 02 10:19:59 np0005541913.localdomain sshd[339051]: Received disconnect from 38.102.83.114 port 51908:11: disconnected by user
Dec 02 10:19:59 np0005541913.localdomain sshd[339051]: Disconnected from user zuul 38.102.83.114 port 51908
Dec 02 10:19:59 np0005541913.localdomain sshd[339048]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:19:59 np0005541913.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Dec 02 10:19:59 np0005541913.localdomain systemd-logind[757]: Session 80 logged out. Waiting for processes to exit.
Dec 02 10:19:59 np0005541913.localdomain systemd-logind[757]: Removed session 80.
Dec 02 10:19:59 np0005541913.localdomain sshd[339070]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:19:59 np0005541913.localdomain sshd[339070]: Accepted publickey for zuul from 38.102.83.114 port 54644 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:19:59 np0005541913.localdomain systemd-logind[757]: New session 81 of user zuul.
Dec 02 10:19:59 np0005541913.localdomain systemd[1]: Started Session 81 of User zuul.
Dec 02 10:19:59 np0005541913.localdomain sshd[339070]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:19:59 np0005541913.localdomain sudo[339074]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Dec 02 10:20:00 np0005541913.localdomain sudo[339074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:20:00 np0005541913.localdomain sudo[339074]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:00 np0005541913.localdomain sshd[339073]: Received disconnect from 38.102.83.114 port 54644:11: disconnected by user
Dec 02 10:20:00 np0005541913.localdomain sshd[339073]: Disconnected from user zuul 38.102.83.114 port 54644
Dec 02 10:20:00 np0005541913.localdomain sshd[339070]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:20:00 np0005541913.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Dec 02 10:20:00 np0005541913.localdomain systemd-logind[757]: Session 81 logged out. Waiting for processes to exit.
Dec 02 10:20:00 np0005541913.localdomain systemd-logind[757]: Removed session 81.
Dec 02 10:20:00 np0005541913.localdomain ceph-mon[298296]: overall HEALTH_OK
Dec 02 10:20:00 np0005541913.localdomain sshd[339092]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:20:00 np0005541913.localdomain sshd[339092]: Accepted publickey for zuul from 38.102.83.114 port 54652 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:20:00 np0005541913.localdomain systemd-logind[757]: New session 82 of user zuul.
Dec 02 10:20:00 np0005541913.localdomain systemd[1]: Started Session 82 of User zuul.
Dec 02 10:20:00 np0005541913.localdomain sshd[339092]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:20:00 np0005541913.localdomain sudo[339096]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Dec 02 10:20:00 np0005541913.localdomain sudo[339096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:20:00 np0005541913.localdomain sudo[339096]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:00 np0005541913.localdomain sshd[339095]: Received disconnect from 38.102.83.114 port 54652:11: disconnected by user
Dec 02 10:20:00 np0005541913.localdomain sshd[339095]: Disconnected from user zuul 38.102.83.114 port 54652
Dec 02 10:20:00 np0005541913.localdomain sshd[339092]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:20:00 np0005541913.localdomain systemd[1]: session-82.scope: Deactivated successfully.
Dec 02 10:20:00 np0005541913.localdomain systemd-logind[757]: Session 82 logged out. Waiting for processes to exit.
Dec 02 10:20:00 np0005541913.localdomain systemd-logind[757]: Removed session 82.
Dec 02 10:20:01 np0005541913.localdomain ceph-mon[298296]: pgmap v794: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:01 np0005541913.localdomain sshd[339114]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:20:01 np0005541913.localdomain sshd[339114]: Accepted publickey for zuul from 38.102.83.114 port 54656 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:20:01 np0005541913.localdomain systemd-logind[757]: New session 83 of user zuul.
Dec 02 10:20:01 np0005541913.localdomain systemd[1]: Started Session 83 of User zuul.
Dec 02 10:20:01 np0005541913.localdomain sshd[339114]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:20:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:01 np0005541913.localdomain sudo[339118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Dec 02 10:20:01 np0005541913.localdomain sudo[339118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:20:01 np0005541913.localdomain sudo[339118]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:01 np0005541913.localdomain sshd[339117]: Received disconnect from 38.102.83.114 port 54656:11: disconnected by user
Dec 02 10:20:01 np0005541913.localdomain sshd[339117]: Disconnected from user zuul 38.102.83.114 port 54656
Dec 02 10:20:01 np0005541913.localdomain sshd[339114]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:20:01 np0005541913.localdomain systemd[1]: session-83.scope: Deactivated successfully.
Dec 02 10:20:01 np0005541913.localdomain systemd-logind[757]: Session 83 logged out. Waiting for processes to exit.
Dec 02 10:20:01 np0005541913.localdomain systemd-logind[757]: Removed session 83.
Dec 02 10:20:01 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:01.813 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:01 np0005541913.localdomain sshd[339136]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:20:02 np0005541913.localdomain sshd[339136]: Accepted publickey for zuul from 38.102.83.114 port 54672 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:20:02 np0005541913.localdomain systemd-logind[757]: New session 84 of user zuul.
Dec 02 10:20:02 np0005541913.localdomain systemd[1]: Started Session 84 of User zuul.
Dec 02 10:20:02 np0005541913.localdomain sshd[339136]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:20:02 np0005541913.localdomain sudo[339140]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Dec 02 10:20:02 np0005541913.localdomain sudo[339140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:20:02 np0005541913.localdomain sudo[339140]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:02 np0005541913.localdomain sshd[339139]: Received disconnect from 38.102.83.114 port 54672:11: disconnected by user
Dec 02 10:20:02 np0005541913.localdomain sshd[339139]: Disconnected from user zuul 38.102.83.114 port 54672
Dec 02 10:20:02 np0005541913.localdomain sshd[339136]: pam_unix(sshd:session): session closed for user zuul
Dec 02 10:20:02 np0005541913.localdomain systemd[1]: session-84.scope: Deactivated successfully.
Dec 02 10:20:02 np0005541913.localdomain systemd-logind[757]: Session 84 logged out. Waiting for processes to exit.
Dec 02 10:20:02 np0005541913.localdomain systemd-logind[757]: Removed session 84.
Dec 02 10:20:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:20:03.063 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:20:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:20:03.063 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:20:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:20:03.064 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:20:03 np0005541913.localdomain ceph-mon[298296]: pgmap v795: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:20:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:20:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:20:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:20:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:20:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:20:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:20:04 np0005541913.localdomain ceph-mon[298296]: pgmap v796: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 02 10:20:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1234616018' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:20:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 02 10:20:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1234616018' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:20:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:04.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:04.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:04 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:04.829 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:20:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1234616018' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:20:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/1234616018' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:20:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:20:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:20:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:20:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:20:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:20:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1"
Dec 02 10:20:06 np0005541913.localdomain ceph-mon[298296]: pgmap v797: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:06 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:20:06 np0005541913.localdomain systemd[1]: tmp-crun.uRNhcn.mount: Deactivated successfully.
Dec 02 10:20:06 np0005541913.localdomain podman[339158]: 2025-12-02 10:20:06.45382789 +0000 UTC m=+0.094230786 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 02 10:20:06 np0005541913.localdomain podman[339158]: 2025-12-02 10:20:06.46538953 +0000 UTC m=+0.105792416 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 02 10:20:06 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:20:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:06.816 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:06.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:06.849 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:20:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:06.849 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:20:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:06.850 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:20:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:06.850 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:20:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:06.851 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:20:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:20:07 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1049435799' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:07.298 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:20:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:07.585 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:20:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:07.586 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:20:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:07.804 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:20:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:07.806 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=11031MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:20:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:07.807 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:20:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:07.807 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.008 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.008 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.009 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:20:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1049435799' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.053 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:20:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:20:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3296878270' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.519 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.527 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.588 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.591 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:20:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:08.591 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:20:09 np0005541913.localdomain ceph-mon[298296]: pgmap v798: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3296878270' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:20:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:20:10 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:20:10 np0005541913.localdomain podman[339225]: 2025-12-02 10:20:10.451978881 +0000 UTC m=+0.082654707 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 10:20:10 np0005541913.localdomain podman[339225]: 2025-12-02 10:20:10.460076138 +0000 UTC m=+0.090751934 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:20:10 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:20:10 np0005541913.localdomain podman[339223]: 2025-12-02 10:20:10.502495135 +0000 UTC m=+0.138519254 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:20:10 np0005541913.localdomain podman[339223]: 2025-12-02 10:20:10.510967002 +0000 UTC m=+0.146991111 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 02 10:20:10 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:20:10 np0005541913.localdomain podman[339224]: 2025-12-02 10:20:10.595386674 +0000 UTC m=+0.227971240 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 10:20:10 np0005541913.localdomain podman[339224]: 2025-12-02 10:20:10.635221291 +0000 UTC m=+0.267805887 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Dec 02 10:20:10 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:20:11 np0005541913.localdomain ceph-mon[298296]: pgmap v799: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.592 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.593 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.593 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.820 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.822 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.822 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.823 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.843 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:11 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:11.845 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:12.565 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:20:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:12.566 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:20:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:12.566 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:20:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:12.567 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:20:13 np0005541913.localdomain ceph-mon[298296]: pgmap v800: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:14.647 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:20:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:14.662 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:20:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:14.663 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:20:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:14.664 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:14 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:14.665 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:15 np0005541913.localdomain ceph-mon[298296]: pgmap v801: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.109 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'name': 'test', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005541913.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'hostId': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.142 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.143 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7740c8c6-6991-4110-a607-550b22cb18ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.110534', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7eebfe9e-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': 'd9e837da40ec73df6b27eaa964281370994b61d9b6c71a5301298a0b93f3f719'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.110534', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7eec12f8-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': '9168d64b3d21ce985b82314a2f9a88ce99ffd318fa11a0fefad1d951f6c9359c'}]}, 'timestamp': '2025-12-02 10:20:16.143742', '_unique_id': 'b4d2c62d876e4f8b91fc2ad7370caf5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.145 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.146 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 1962998170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.147 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.latency volume: 38950418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '634983ec-c83f-430b-b0ca-2efe2629337c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1962998170, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.146695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7eec9b42-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': 'cbccc1d8720fd9e3312bf7eaed7f35b52e8c067fdf73e6ce72b0e9a5291c222f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950418, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.146695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7eecab64-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': '416036926d8842d67e4fb43eab6188e071735198f4af3f2046ea2b8f5eeed4b7'}]}, 'timestamp': '2025-12-02 10:20:16.147547', '_unique_id': '2f4bffa3344e4377acb7b3ce955f139a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.148 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.153 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1dda90a-7154-46e2-96b5-f9a275736e2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.149766', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7eedb61c-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': '7f8f6890314f0e216f055540b555f4645fafcafd0757d2a85a8ca059fb1f9d16'}]}, 'timestamp': '2025-12-02 10:20:16.154411', '_unique_id': '0ae1738b82434d0ab0cfbf7f04800f0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.155 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.156 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea55069d-5fd2-41f1-90e8-901cd96a83c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.156782', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7eee2520-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': 'bd7147c58c0d484e523584a1b5da9face1a55cf5b87a9ce2094888a857d86625'}]}, 'timestamp': '2025-12-02 10:20:16.157241', '_unique_id': 'c1c073a4b8344ad9a711e641a14bf03f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.158 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.159 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.179 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/cpu volume: 21530000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6abe32a5-9b90-4f8f-b02d-f407d373268d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21530000000, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:20:16.159339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7ef1b0c8-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.398925734, 'message_signature': 'ae012679ea65722c7e33067e7db3485f842ffa7ee4d4f62a44fc5f6545665824'}]}, 'timestamp': '2025-12-02 10:20:16.180486', '_unique_id': 'e5990e5d6173428ab4d9150d6cc73fdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.181 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.196 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aeb69d8a-f2d1-4515-8abe-2433d9cf63b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.182840', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ef428d0-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.401915775, 'message_signature': '4082c1ec22cd55f0eeeccc2a0fc13567eaf3e93e89c5a5c9f7d2a053e7524606'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.182840', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ef43bf4-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.401915775, 'message_signature': 'ee8993ec17e66c0c4e09b497e5dbd492edf9e5e025bebda7ef56eb07d9e6d2f0'}]}, 'timestamp': '2025-12-02 10:20:16.197138', '_unique_id': '9a3cfa779273446d82b14c74604be50b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.198 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.199 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4462822-4e41-40c4-8792-5a7992fe7035', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.199797', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7ef4b700-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': 'a4a63a39f0f78f5fb9efbae96c0372528e5729a8ac531c6a4d753a06b4b318ec'}]}, 'timestamp': '2025-12-02 10:20:16.200308', '_unique_id': '53ce246d4fa847baaa3a4a8dd567cb3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.201 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.203 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/4137885901' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67080c6b-091f-4f5a-a10a-00d77e876d5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.202946', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7ef5354a-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': '38b357553824f227193218c096a910d4413900a65fbc4fb165456ca0557c2a3d'}]}, 'timestamp': '2025-12-02 10:20:16.203561', '_unique_id': 'e9f37bfea13644c1b62b34695d882dcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.204 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.205 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.206 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c50ba657-fd20-451f-a470-203f9db2823a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.205837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ef5a16a-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': '8f733b9679d419b99c5542e14c1f7ea30c99186742cfb3d15fbda5696f2c957f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.205837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ef5b2ae-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': '455c6c5b06f0eb0e43b97a679751ee70198c725348c3e2c9db3d7497c2f7d406'}]}, 'timestamp': '2025-12-02 10:20:16.206749', '_unique_id': 'd3cf81588d0b46d28c87ca0b5e4f11de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.207 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.209 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b188c783-bae0-4e8e-8f9c-7479d0e41634', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.209114', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7ef621b2-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': '6999ffb5924a558532efc8955d4c53b94a540ecbee41324952a91fa1c7d39305'}]}, 'timestamp': '2025-12-02 10:20:16.209692', '_unique_id': '634d2b8b5bc344b3ac9b0ffe09228595'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.210 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.212 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00d9ae98-aa42-40ed-a0fd-74615f1bf0d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.212258', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7ef6a36c-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': '1775b6026813c774e56284e7c99f17eade4cdb1a0a08c0b356bfc418fa07814f'}]}, 'timestamp': '2025-12-02 10:20:16.212927', '_unique_id': 'd8ab825e0075404eb4135378d1416da7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.213 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.215 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.215 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.215 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '509014eb-3f79-49b5-a188-2ae17a5dc76f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.215351', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ef71540-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': '1098615b7dac8333444b01452697e6654e34ac693971873e82fc2c358f7783eb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.215351', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ef729b8-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': '1dbc810811bd91aaab5ab7a654668db4e9b759b9d75acd7374f4467a11184376'}]}, 'timestamp': '2025-12-02 10:20:16.216315', '_unique_id': 'c0a68b2a49434f6e95c3ef986a392d27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.217 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.218 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.219 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e961830-4ea0-4392-9bab-7569ff26b82f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.218829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ef79ccc-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': '83d610c231796cf488e3d5604f4b4374cc7408ca5003787e4cede9a7256bda2a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.218829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ef7ad3e-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': 'd789cd50f7eee38da40b9802939ea33c649758898be1558f827840bc6d49bde1'}]}, 'timestamp': '2025-12-02 10:20:16.219752', '_unique_id': 'c694c201d98c423bb6ac7b5b4fa38bfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.220 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.222 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb95279c-0bb1-4edb-8a29-3da97e3cf29a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.221991', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7ef8186e-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': 'a8cc3f2f3288231bf0423ae072fdd9e6731cdf4c3d092368d015bab907454e2e'}]}, 'timestamp': '2025-12-02 10:20:16.222449', '_unique_id': '90c74d7cee9c45d8b1a7fa9de2fed80c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.223 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.224 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 1807645093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.225 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.read.latency volume: 89262124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c8d5be4-0278-4c80-b00f-9a924005f00e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1807645093, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.224661', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ef8804c-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': 'a517e1e403ab8ae1dd4b55ccef2c1fa536df675c660cac3515c9fe94a23d28fd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89262124, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.224661', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ef88fce-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.329607198, 'message_signature': '1dde1e03c8343a2b1328934820daf36ccb980902f1258416b83eda582e4d200d'}]}, 'timestamp': '2025-12-02 10:20:16.225472', '_unique_id': '0d700dea18ef453c925919a897a4d5e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.226 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.227 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.227 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/memory.usage volume: 51.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a272b981-d83f-4398-b2bd-cd2eea7abfbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6328125, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'timestamp': '2025-12-02T10:20:16.227738', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7ef8f946-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.398925734, 'message_signature': '6274e067687404550993ece487bdc4eb156adda4060103ea67dd6af0acadf53d'}]}, 'timestamp': '2025-12-02 10:20:16.228191', '_unique_id': 'a31b36565918461ca60cdb40cab7f419'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.229 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.230 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.230 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a55aa510-90e2-4e4b-b99d-2cf70c07fdd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.230749', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7ef96fb6-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': 'e1a355313bd230a7d6ce2cea71ca6ba11a9e2c3553e89eba1201a1896a07c039'}]}, 'timestamp': '2025-12-02 10:20:16.231241', '_unique_id': 'd5698950c96d43779b6dc8f67c5339e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.232 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.233 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.234 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cf2457b-bbaf-4477-aa6f-db73f7f54117', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.233440', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ef9d938-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.401915775, 'message_signature': '073f7a54b89cd025c1e12cddd01bf67a776991de7c8033d5c6d1e1dc418f9d32'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.233440', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ef9f1e8-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.401915775, 'message_signature': '38f2b946d2b696363e89683dc75a80f047b8bc926375f62a6b7d4d06e4c2f3b9'}]}, 'timestamp': '2025-12-02 10:20:16.234588', '_unique_id': 'ea6d67d519664ccebdf7faf9ededc99a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.235 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.236 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.237 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74be7dc0-a09e-4ff0-a2c7-9a16cca84795', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16-vda', 'timestamp': '2025-12-02T10:20:16.236855', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7efa59b2-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.401915775, 'message_signature': 'e372346c63c4ff2a81889cb7a40f3eaa223f6d38b9966e392dd552792c438db5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 
'b254bb7f-2891-4b37-9c44-9700e301ce16-vdb', 'timestamp': '2025-12-02T10:20:16.236855', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7efa64ca-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.401915775, 'message_signature': 'df6af92e0e54e6f72aec9408d8991eecab19a889b8d1a71248afe504e6ed7bd2'}]}, 'timestamp': '2025-12-02 10:20:16.237415', '_unique_id': '27e1ce05080c42aabe1e1d0963a50557'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.238 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0dc2185-f0f2-455f-b790-46cb85f46ee7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.238797', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7efaa5c0-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': '4a96e8744b0a7066e952f7c5ccf409fd9b93d71d0fdfae0660088cd7a8216af1'}]}, 'timestamp': '2025-12-02 10:20:16.239090', '_unique_id': '68b98d8fd34142d2becde71b13a23d46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.239 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.240 12 DEBUG ceilometer.compute.pollsters [-] b254bb7f-2891-4b37-9c44-9700e301ce16/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcb3bf48-ca34-4ed3-abab-9e941f06e65d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb8b7d2a63b642aa999db12e17eeb9e4', 'user_name': None, 'project_id': 'e2d97696ab6749899bb8ba5ce29a3de2', 'project_name': None, 'resource_id': 'instance-00000002-b254bb7f-2891-4b37-9c44-9700e301ce16-tap4a318f6a-b3', 'timestamp': '2025-12-02T10:20:16.240428', 'resource_metadata': {'display_name': 'test', 'name': 'tap4a318f6a-b3', 'instance_id': 'b254bb7f-2891-4b37-9c44-9700e301ce16', 'instance_type': 'm1.small', 'host': '0ad0c9dfa8a31298b6a0f119d9616e77e0b75b76881f2815e01eeeff', 'instance_host': 'np0005541913.localdomain', 'flavor': {'id': '45a99238-6f19-4f9e-be82-6ef3af1dcb31', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa'}, 'image_ref': '6dbbc069-a02f-4753-b6a8-9fb02d7e0aaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:26:b2:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a318f6a-b3'}, 'message_id': '7efae602-cf68-11f0-a0da-fa163e3f40cc', 'monotonic_time': 12978.36886618, 'message_signature': '03bfcdab1e3fd3d555d61bb09b05aaf430b8f1081b8cbd1e6d9ea99aa6961dcc'}]}, 'timestamp': '2025-12-02 10:20:16.240770', '_unique_id': '9a500507f2d24e348e56d5cafe00a08c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     yield
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 02 10:20:16 np0005541913.localdomain ceilometer_agent_compute[238101]: 2025-12-02 10:20:16.241 12 ERROR oslo_messaging.notify.messaging 
Dec 02 10:20:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:16.845 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:16.847 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:16.848 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:20:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:16.848 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:16.881 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:16 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:16.881 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:17 np0005541913.localdomain ceph-mon[298296]: pgmap v802: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2506910296' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:17 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:20:17 np0005541913.localdomain podman[339283]: 2025-12-02 10:20:17.440583799 +0000 UTC m=+0.078306010 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 02 10:20:17 np0005541913.localdomain podman[339283]: 2025-12-02 10:20:17.451943903 +0000 UTC m=+0.089666074 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 10:20:17 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:20:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:17.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:17.851 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:19 np0005541913.localdomain ceph-mon[298296]: pgmap v803: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:19.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:19.828 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:20:21 np0005541913.localdomain ceph-mon[298296]: pgmap v804: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:21.882 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:21.883 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:21.883 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:20:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:21.883 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:21.884 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:21 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:21.887 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:23 np0005541913.localdomain ceph-mon[298296]: pgmap v805: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:20:23 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:20:23 np0005541913.localdomain podman[339300]: 2025-12-02 10:20:23.426460187 +0000 UTC m=+0.069970096 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 10:20:23 np0005541913.localdomain podman[339300]: 2025-12-02 10:20:23.463030577 +0000 UTC m=+0.106540476 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:20:23 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:20:23 np0005541913.localdomain systemd[1]: tmp-crun.ETVbqZ.mount: Deactivated successfully.
Dec 02 10:20:23 np0005541913.localdomain podman[339301]: 2025-12-02 10:20:23.554522319 +0000 UTC m=+0.198837330 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 02 10:20:23 np0005541913.localdomain podman[339301]: 2025-12-02 10:20:23.594055158 +0000 UTC m=+0.238370229 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 10:20:23 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:20:24 np0005541913.localdomain ceph-mon[298296]: pgmap v806: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:26 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:26 np0005541913.localdomain ceph-mon[298296]: pgmap v807: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:26 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:26.886 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:28 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:20:28 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2708879317' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:28 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1038183216' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:29 np0005541913.localdomain ceph-mon[298296]: pgmap v808: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:29 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2708879317' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:20:31 np0005541913.localdomain ceph-mon[298296]: pgmap v809: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:31 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:31.890 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:31.892 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:31.893 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:20:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:31.893 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:31.924 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:31 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:31.925 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:33 np0005541913.localdomain ceph-mon[298296]: pgmap v810: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:20:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:20:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:20:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:20:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:20:34 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:20:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:20:34 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:20:35 np0005541913.localdomain ceph-mon[298296]: pgmap v811: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:36 np0005541913.localdomain podman[240799]: time="2025-12-02T10:20:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:20:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:20:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:20:36 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:20:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18793 "" "Go-http-client/1.1"
Dec 02 10:20:36 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:36.958 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:36.960 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:36.960 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:20:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:36.960 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:36.961 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:36.961 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:36 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:36.963 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:37 np0005541913.localdomain ceph-mon[298296]: pgmap v812: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:37 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:20:37 np0005541913.localdomain podman[339346]: 2025-12-02 10:20:37.455053111 +0000 UTC m=+0.089579421 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:20:37 np0005541913.localdomain podman[339346]: 2025-12-02 10:20:37.467993839 +0000 UTC m=+0.102520119 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 02 10:20:37 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:20:39 np0005541913.localdomain ceph-mon[298296]: pgmap v813: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:41 np0005541913.localdomain ceph-mon[298296]: pgmap v814: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:20:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:20:41 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:20:41 np0005541913.localdomain podman[339365]: 2025-12-02 10:20:41.441977783 +0000 UTC m=+0.082627765 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 10:20:41 np0005541913.localdomain podman[339367]: 2025-12-02 10:20:41.5060665 +0000 UTC m=+0.137648929 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:20:41 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:41 np0005541913.localdomain podman[339367]: 2025-12-02 10:20:41.514169508 +0000 UTC m=+0.145751927 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 02 10:20:41 np0005541913.localdomain podman[339365]: 2025-12-02 10:20:41.521939596 +0000 UTC m=+0.162589508 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 02 10:20:41 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:20:41 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:20:41 np0005541913.localdomain podman[339366]: 2025-12-02 10:20:41.604040766 +0000 UTC m=+0.239223161 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container)
Dec 02 10:20:41 np0005541913.localdomain podman[339366]: 2025-12-02 10:20:41.620931228 +0000 UTC m=+0.256113603 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Dec 02 10:20:41 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:20:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:41.964 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:41.966 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:41.966 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:20:41 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:41.966 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:42.006 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:42 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:42.007 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:43 np0005541913.localdomain ceph-mon[298296]: pgmap v815: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:43 np0005541913.localdomain sudo[339426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:20:43 np0005541913.localdomain sudo[339426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:20:43 np0005541913.localdomain sudo[339426]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:43 np0005541913.localdomain sudo[339444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 02 10:20:43 np0005541913.localdomain sudo[339444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:20:44 np0005541913.localdomain podman[339536]: 2025-12-02 10:20:44.247770113 +0000 UTC m=+0.095535601 container exec 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1763362218, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_CLEAN=True)
Dec 02 10:20:44 np0005541913.localdomain podman[339536]: 2025-12-02 10:20:44.354518653 +0000 UTC m=+0.202284141 container exec_died 990b8c741b7783c8fc872e091a073eaa8355db0c6a880b1d7d40e92d418ad467 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-c7c8e171-a193-56fb-95fa-8879fcfa7074-crash-np0005541913, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 02 10:20:44 np0005541913.localdomain sudo[339444]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:45 np0005541913.localdomain sudo[339655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 02 10:20:45 np0005541913.localdomain sudo[339655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:20:45 np0005541913.localdomain sudo[339655]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:45 np0005541913.localdomain sudo[339673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c7c8e171-a193-56fb-95fa-8879fcfa7074/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 02 10:20:45 np0005541913.localdomain sudo[339673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:20:45 np0005541913.localdomain ceph-mon[298296]: pgmap v816: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:20:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:20:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:20:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:20:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:20:45 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:20:45 np0005541913.localdomain sudo[339673]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:46 np0005541913.localdomain sudo[339723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 02 10:20:46 np0005541913.localdomain sudo[339723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 02 10:20:46 np0005541913.localdomain sudo[339723]: pam_unix(sudo:session): session closed for user root
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 172.18.0.108:0/2286681988' entity='mgr.np0005541914.lljzmk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 02 10:20:46 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:47 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:47.050 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:47 np0005541913.localdomain ceph-mon[298296]: pgmap v817: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:47 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541912.localdomain to 836.6M
Dec 02 10:20:47 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541914.localdomain to 836.6M
Dec 02 10:20:47 np0005541913.localdomain ceph-mon[298296]: Adjusting osd_memory_target on np0005541913.localdomain to 836.6M
Dec 02 10:20:47 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541912.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:20:47 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541914.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:20:47 np0005541913.localdomain ceph-mon[298296]: Unable to set osd_memory_target on np0005541913.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 02 10:20:48 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:20:48 np0005541913.localdomain podman[339741]: 2025-12-02 10:20:48.434019245 +0000 UTC m=+0.069135154 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 10:20:48 np0005541913.localdomain podman[339741]: 2025-12-02 10:20:48.442186204 +0000 UTC m=+0.077302123 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 02 10:20:48 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:20:48 np0005541913.localdomain ceph-mon[298296]: pgmap v818: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:48 np0005541913.localdomain ceph-mon[298296]: from='mgr.34354 ' entity='mgr.np0005541914.lljzmk' 
Dec 02 10:20:50 np0005541913.localdomain ceph-mon[298296]: pgmap v819: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:51 np0005541913.localdomain sshd[339762]: main: sshd: ssh-rsa algorithm is disabled
Dec 02 10:20:51 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:51 np0005541913.localdomain sshd[339762]: Accepted publickey for zuul from 192.168.122.10 port 53472 ssh2: RSA SHA256:uKO0Ohw486fj8lQHxind1+ryY96O3+Z9KYMOgF8+dKU
Dec 02 10:20:51 np0005541913.localdomain systemd-logind[757]: New session 85 of user zuul.
Dec 02 10:20:51 np0005541913.localdomain systemd[1]: Started Session 85 of User zuul.
Dec 02 10:20:51 np0005541913.localdomain sshd[339762]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 02 10:20:51 np0005541913.localdomain sudo[339766]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Dec 02 10:20:51 np0005541913.localdomain sudo[339766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 02 10:20:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:52.054 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:52.058 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:20:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:52.058 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:20:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:52.058 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:52.080 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:52 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:52.081 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:20:53 np0005541913.localdomain ceph-mon[298296]: pgmap v820: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.
Dec 02 10:20:54 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.
Dec 02 10:20:54 np0005541913.localdomain podman[339919]: 2025-12-02 10:20:54.461651861 +0000 UTC m=+0.084431903 container health_status cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 02 10:20:54 np0005541913.localdomain podman[339918]: 2025-12-02 10:20:54.521504995 +0000 UTC m=+0.145751917 container health_status 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:20:54 np0005541913.localdomain podman[339918]: 2025-12-02 10:20:54.539023364 +0000 UTC m=+0.163270286 container exec_died 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 10:20:54 np0005541913.localdomain systemd[1]: 53cdcad7bb2eb92a4005cb53efa297ce75c0fcc9e15541d6ad164377a8999709.service: Deactivated successfully.
Dec 02 10:20:54 np0005541913.localdomain podman[339919]: 2025-12-02 10:20:54.564110377 +0000 UTC m=+0.186890419 container exec_died cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 02 10:20:54 np0005541913.localdomain systemd[1]: cb7a1d62ad28be61994a5cce5e8d17f6fcd752ea72af3018f59ac13d978bf782.service: Deactivated successfully.
Dec 02 10:20:55 np0005541913.localdomain ceph-mon[298296]: pgmap v821: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:55 np0005541913.localdomain ceph-mon[298296]: from='client.69428 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:20:55 np0005541913.localdomain ceph-mon[298296]: from='client.59014 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:20:55 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "status"} v 0)
Dec 02 10:20:55 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/618962487' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 02 10:20:56 np0005541913.localdomain ceph-mon[298296]: from='client.49287 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:20:56 np0005541913.localdomain ceph-mon[298296]: from='client.69434 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:20:56 np0005541913.localdomain ceph-mon[298296]: from='client.59020 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:20:56 np0005541913.localdomain ceph-mon[298296]: from='client.49293 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:20:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1735864119' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 02 10:20:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/618962487' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 02 10:20:56 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1347547795' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 02 10:20:56 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:20:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:57.081 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:57 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:20:57.083 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:20:57 np0005541913.localdomain ceph-mon[298296]: pgmap v822: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:57 np0005541913.localdomain ovs-vsctl[340063]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 02 10:20:58 np0005541913.localdomain virtqemud[203664]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 02 10:20:58 np0005541913.localdomain virtqemud[203664]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 02 10:20:58 np0005541913.localdomain virtqemud[203664]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 02 10:20:58 np0005541913.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 340219 (lsinitrd)
Dec 02 10:20:58 np0005541913.localdomain systemd[1]: Mounting EFI System Partition Automount...
Dec 02 10:20:58 np0005541913.localdomain systemd[1]: Mounted EFI System Partition Automount.
Dec 02 10:20:59 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: cache status {prefix=cache status} (starting...)
Dec 02 10:20:59 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:20:59 np0005541913.localdomain ceph-mon[298296]: pgmap v823: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:20:59 np0005541913.localdomain lvm[340303]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 02 10:20:59 np0005541913.localdomain lvm[340303]: VG ceph_vg1 finished
Dec 02 10:20:59 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: client ls {prefix=client ls} (starting...)
Dec 02 10:20:59 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:20:59 np0005541913.localdomain lvm[340309]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 02 10:20:59 np0005541913.localdomain lvm[340309]: VG ceph_vg0 finished
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: damage ls {prefix=damage ls} (starting...)
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: dump loads {prefix=dump loads} (starting...)
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "report"} v 0)
Dec 02 10:21:00 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/60804839' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 02 10:21:00 np0005541913.localdomain ceph-mon[298296]: from='client.69449 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:00 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 02 10:21:00 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2106594270' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:00 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 02 10:21:00 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/527587334' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 02 10:21:00 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config log"} v 0)
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/951073284' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 02 10:21:01 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/701367491' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: ops {prefix=ops} (starting...)
Dec 02 10:21:01 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.59032 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: pgmap v824: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.69455 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.49305 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.59038 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.49311 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/60804839' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.69476 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3132060943' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3324992184' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/527587334' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.59077 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1963989503' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/912277046' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.49338 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/951073284' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1158635755' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/701367491' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1115069005' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/259684202' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 02 10:21:01 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3223086368' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 02 10:21:01 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: session ls {prefix=session ls} (starting...)
Dec 02 10:21:01 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe Can't run that command on an inactive MDS!
Dec 02 10:21:01 np0005541913.localdomain ceph-mds[286809]: mds.mds.np0005541913.maexpe asok_command: status {prefix=status} (starting...)
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/200524969' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:02.084 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:21:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:02.088 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:21:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:02.088 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:21:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:02.088 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:21:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:02.089 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:21:02 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:02.091 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2110934913' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3173019494' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1077184288' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/259684202' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: pgmap v825: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3223086368' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1820455341' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.69521 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.49368 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1882961530' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2070081632' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.69536 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/200524969' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1397926291' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.59122 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2258191271' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1839279825' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "features"} v 0)
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/162806466' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 02 10:21:02 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3729005944' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:21:03.063 160221 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:21:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:21:03.064 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:21:03 np0005541913.localdomain ovn_metadata_agent[160216]: 2025-12-02 10:21:03.065 160221 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/509267441' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2037890107' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.49386 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3350057924' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1839279825' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/513619678' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/162806466' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2821027336' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.49401 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3178003001' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3729005944' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3115311484' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/509267441' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3833727113' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1263238642' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3132199302' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 02 10:21:03 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3308714736' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:21:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 10:21:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:21:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:21:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:21:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 10:21:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:21:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 10:21:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:21:04 np0005541913.localdomain openstack_network_exporter[242845]: ERROR   10:21:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 10:21:04 np0005541913.localdomain openstack_network_exporter[242845]: 
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.69584 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2037890107' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2934133144' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: pgmap v826: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.59176 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1262229764' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/788720208' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1985353952' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.69611 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3132199302' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3308714736' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2320340967' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.49458 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.69626 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1120667361' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 02 10:21:04 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1410707360' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2379843221' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:59.431208+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1006047 data_alloc: 184549376 data_used: 9691136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8ca7000/0x0/0x1bfc00000, data 0x2d641c3/0x2de7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95248384 unmapped: 4317184 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:00.431357+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95248384 unmapped: 4317184 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:01.431530+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95248384 unmapped: 4317184 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:02.431687+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8ca7000/0x0/0x1bfc00000, data 0x2d641c3/0x2de7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95248384 unmapped: 4317184 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:03.431848+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95248384 unmapped: 4317184 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:04.431988+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1006047 data_alloc: 184549376 data_used: 9691136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95248384 unmapped: 4317184 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:05.432216+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8ca7000/0x0/0x1bfc00000, data 0x2d641c3/0x2de7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95256576 unmapped: 4308992 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:06.432371+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8ca7000/0x0/0x1bfc00000, data 0x2d641c3/0x2de7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95256576 unmapped: 4308992 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:07.432506+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95256576 unmapped: 4308992 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:08.432658+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95256576 unmapped: 4308992 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:09.432823+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 85.230804443s of 85.389671326s, submitted: 12
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 32
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now 
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/3550144330
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc reconnect No active mgr available yet
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 ms_handle_reset con 0x56524767c000 session 0x565248f48780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1010219 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652451e0400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95404032 unmapped: 4161536 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:10.432959+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 33
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: get_auth_request con 0x565247a55c00 auth_method 0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_configure stats_period=5
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95330304 unmapped: 4235264 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:11.433135+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 34
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95485952 unmapped: 4079616 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:12.433331+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95485952 unmapped: 4079616 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:13.433674+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95485952 unmapped: 4079616 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:14.433829+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 35
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95485952 unmapped: 4079616 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:15.433986+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 36
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:16.434138+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:17.434297+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:18.434454+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:19.434565+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:20.434747+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:21.434887+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:22.435079+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:23.435275+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:24.435447+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:25.435654+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:26.435799+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:27.435954+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:28.436134+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:29.436307+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:30.436559+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:31.436787+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:32.436944+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:33.437057+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:34.437210+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:35.437383+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 37
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:36.437534+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:37.437642+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:38.437789+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:39.437978+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:40.438644+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:41.438814+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:42.438985+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:43.439162+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:44.439324+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:45.439472+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:46.439664+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:47.439839+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:48.439970+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:49.440163+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:50.440330+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:51.440536+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:52.440690+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:53.440858+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:54.441022+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:55.441151+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:56.441315+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:57.441479+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:58.441720+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:59.441920+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:00.442071+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:01.442283+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:02.442482+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:03.442705+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:04.442893+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:05.443071+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:06.443235+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:07.443397+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:08.443537+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:09.443661+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:10.443858+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:11.444075+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:12.444288+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:13.444430+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:14.444658+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:15.444889+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:16.445042+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:17.445202+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:18.465481+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:19.465601+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:20.465777+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:21.465999+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:22.466158+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:23.466313+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:24.466418+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:25.466578+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:26.466746+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:27.466940+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:28.467122+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:29.467252+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:30.467418+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:31.467655+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:32.467800+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:33.467944+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:34.468097+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8ca4000/0x0/0x1bfc00000, data 0x2d66281/0x2dea000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1009339 data_alloc: 184549376 data_used: 9699328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:35.468259+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:36.468389+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94797824 unmapped: 4767744 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 38
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now 
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/2383186409
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc reconnect No active mgr available yet
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 ms_handle_reset con 0x5652451e0400 session 0x5652491caf00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc7800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 86.950881958s of 87.005424500s, submitted: 12
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:37.468525+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94920704 unmapped: 4644864 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 39
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: get_auth_request con 0x5652474ee400 auth_method 0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_configure stats_period=5
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:38.468700+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94920704 unmapped: 4644864 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca0000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:39.468831+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94920704 unmapped: 4644864 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:40.468985+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94920704 unmapped: 4644864 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:41.469163+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94920704 unmapped: 4644864 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 41
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:42.469312+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94920704 unmapped: 4644864 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 42
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:43.469478+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94920704 unmapped: 4644864 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:44.469598+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:45.469843+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:46.470152+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:47.471594+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:48.471802+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:49.472260+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:50.472666+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:51.472830+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:52.472971+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:53.473460+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:54.473724+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:55.473923+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:56.474221+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:57.474645+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:58.474850+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:59.475348+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:00.475733+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:01.476365+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:02.476488+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94928896 unmapped: 4636672 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:03.476646+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:04.477075+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:05.477236+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:06.477510+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:07.477762+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:08.477892+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:09.478084+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:10.478220+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:11.478406+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:12.478573+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:13.478762+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:14.478943+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:15.479072+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:16.479207+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:17.479434+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:18.479569+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:19.479674+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:20.479855+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:21.480052+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:22.480282+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:23.480402+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:24.480536+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:25.480691+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:26.481170+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:27.481414+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:28.481759+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:29.481976+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:30.482162+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:31.482580+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:32.482708+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:33.482912+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:34.483103+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:35.483304+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:36.483455+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:37.483593+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:38.483721+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:39.483963+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:40.484109+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:41.484290+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:42.484463+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:43.484652+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:44.484811+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:45.484956+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:46.485220+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:47.485454+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:48.485685+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:49.485892+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:50.486074+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:51.486380+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:52.486557+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:53.486674+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:54.486811+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:55.486947+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:56.487082+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:57.487257+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:58.487444+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:59.487662+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:00.487827+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:01.488057+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:02.488231+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:03.488468+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:04.488564+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:05.488716+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:06.488867+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:07.488977+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:08.489110+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94502912 unmapped: 5062656 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 43
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:09.489254+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:10.489404+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:11.489759+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:12.489930+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:13.490163+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:14.490325+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:15.490442+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:16.490591+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:17.490797+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:18.490947+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:19.491104+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:20.491221+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1012631 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:21.491394+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:22.491558+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x2d68497/0x2ded000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:23.491710+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 106.814064026s of 106.865432739s, submitted: 16
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 94650368 unmapped: 4915200 heap: 99565568 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:24.491859+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95731712 unmapped: 16424960 heap: 112156672 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:25.492012+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1100269 data_alloc: 184549376 data_used: 9707520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 91 handle_osd_map epochs [91,92], i have 91, src has [1,92]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 95731712 unmapped: 16424960 heap: 112156672 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:26.492142+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b802b000/0x0/0x1bfc00000, data 0x39da6ea/0x3a63000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 102514688 unmapped: 10690560 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:27.492299+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 92 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96886784 unmapped: 16318464 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:28.492435+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96886784 unmapped: 16318464 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:29.492559+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:30.492685+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:31.492885+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:32.493075+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:33.493247+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:34.493423+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:35.493591+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:36.493754+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:37.493903+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:38.494071+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:39.494221+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:40.494388+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:41.494510+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:42.494671+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:43.494821+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:44.494933+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:45.495061+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets getting new tickets!
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:46.495275+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _finish_auth 0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:46.496231+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:47.495388+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:48.495528+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:49.495677+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:50.495841+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:51.496040+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:52.496187+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:53.496315+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:54.496408+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:55.496566+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:56.497027+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:57.497670+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:58.498169+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:59.498312+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:00.498442+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:01.498682+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:02.498936+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:03.499089+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:04.499320+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:05.499527+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:06.499717+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:07.499892+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:08.500037+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:09.500210+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:10.500392+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:11.500577+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:12.500731+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:13.500892+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:14.501041+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:15.501196+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:16.501328+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:17.501530+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:18.501713+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:19.501874+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:20.501987+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1198968 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:21.502242+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b6000/0x0/0x1bfc00000, data 0x464c91d/0x46d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96894976 unmapped: 16310272 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:22.502433+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 58.573200226s of 58.812870026s, submitted: 25
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b73b5000/0x0/0x1bfc00000, data 0x464c92d/0x46d8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655e400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 ms_handle_reset con 0x56524655e400 session 0x565247e585a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 97402880 unmapped: 15802368 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:23.502694+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c63400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 ms_handle_reset con 0x565249c63400 session 0x565248f43e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 97402880 unmapped: 15802368 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b6db4000/0x0/0x1bfc00000, data 0x4c4e92d/0x4cda000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:24.502859+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd7c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 ms_handle_reset con 0x565247fd7c00 session 0x565248f45e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 97402880 unmapped: 15802368 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:25.503032+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1254246 data_alloc: 184549376 data_used: 9723904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247793400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 ms_handle_reset con 0x565247793400 session 0x565247784b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655c000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 97402880 unmapped: 15802368 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:26.503188+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 ms_handle_reset con 0x56524655c000 session 0x5652472743c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655c000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 94 heartbeat osd_stat(store_statfs(0x1b6d8e000/0x0/0x1bfc00000, data 0x4c72960/0x4d00000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 94 ms_handle_reset con 0x56524655c000 session 0x565248f42780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 96460800 unmapped: 16744448 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655e400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:27.503308+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 97075200 unmapped: 16130048 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:28.503506+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 94 heartbeat osd_stat(store_statfs(0x1b6d88000/0x0/0x1bfc00000, data 0x4c75416/0x4d05000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 94 handle_osd_map epochs [94,95], i have 94, src has [1,95]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 100098048 unmapped: 13107200 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:29.503671+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a03400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 95 ms_handle_reset con 0x565247a03400 session 0x565248f494a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 101179392 unmapped: 12025856 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:30.503885+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1303788 data_alloc: 184549376 data_used: 13303808
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 95 heartbeat osd_stat(store_statfs(0x1b6d86000/0x0/0x1bfc00000, data 0x4c76dd4/0x4d07000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 101179392 unmapped: 12025856 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:31.504136+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 101179392 unmapped: 12025856 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:32.504282+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 101179392 unmapped: 12025856 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:33.504444+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 95 heartbeat osd_stat(store_statfs(0x1b6d86000/0x0/0x1bfc00000, data 0x4c76dd4/0x4d07000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 101187584 unmapped: 12017664 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:34.504591+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 101187584 unmapped: 12017664 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:35.504761+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1303788 data_alloc: 184549376 data_used: 13303808
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 101187584 unmapped: 12017664 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:36.504918+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 13.776059151s of 14.096249580s, submitted: 80
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 101244928 unmapped: 11960320 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:37.505072+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 102178816 unmapped: 11026432 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:38.510900+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 102481920 unmapped: 10723328 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:39.510992+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 96 heartbeat osd_stat(store_statfs(0x1b6807000/0x0/0x1bfc00000, data 0x51f5eca/0x5287000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 8265728 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:40.511127+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1389844 data_alloc: 184549376 data_used: 13533184
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 106381312 unmapped: 6823936 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:41.511295+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 106463232 unmapped: 6742016 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:42.511432+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 106496000 unmapped: 6709248 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:43.511589+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 96 heartbeat osd_stat(store_statfs(0x1b6597000/0x0/0x1bfc00000, data 0x544eeca/0x54e0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 106561536 unmapped: 6643712 heap: 113205248 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:44.511733+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105250816 unmapped: 16883712 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:45.511862+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1491961 data_alloc: 184549376 data_used: 13537280
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105390080 unmapped: 16744448 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:46.512011+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:47.512237+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105390080 unmapped: 16744448 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:48.512407+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105431040 unmapped: 16703488 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 96 heartbeat osd_stat(store_statfs(0x1b575b000/0x0/0x1bfc00000, data 0x62a0ed9/0x6333000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.935752869s of 12.601107597s, submitted: 154
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:49.512589+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105439232 unmapped: 16695296 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:50.512782+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105570304 unmapped: 16564224 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 96 heartbeat osd_stat(store_statfs(0x1b5759000/0x0/0x1bfc00000, data 0x62a0f0c/0x6335000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1500496 data_alloc: 184549376 data_used: 13520896
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 96 heartbeat osd_stat(store_statfs(0x1b574c000/0x0/0x1bfc00000, data 0x62adf0c/0x6342000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:51.512967+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105611264 unmapped: 16523264 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 96 ms_handle_reset con 0x56524655e400 session 0x565248f44d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:52.513134+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105611264 unmapped: 16523264 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 96 heartbeat osd_stat(store_statfs(0x1b574c000/0x0/0x1bfc00000, data 0x62adf0c/0x6342000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:53.513303+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105627648 unmapped: 16506880 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 97 ms_handle_reset con 0x56524652f400 session 0x565248f3f2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c62800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 97 ms_handle_reset con 0x565249c62800 session 0x565248f3ef00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:54.513448+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105570304 unmapped: 16564224 heap: 122134528 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 97 ms_handle_reset con 0x56524652f400 session 0x565248af72c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655c800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:55.513579+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118784000 unmapped: 17276928 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1783446 data_alloc: 184549376 data_used: 18796544
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 98 ms_handle_reset con 0x56524655c800 session 0x565248f432c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:56.513718+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118800384 unmapped: 17260544 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:57.513849+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118808576 unmapped: 17252352 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 99 heartbeat osd_stat(store_statfs(0x1b3819000/0x0/0x1bfc00000, data 0x81d961e/0x8273000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:58.514034+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118808576 unmapped: 17252352 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:59.514213+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 17391616 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.720550537s of 10.482752800s, submitted: 130
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:00.514349+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112549888 unmapped: 23511040 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1775256 data_alloc: 184549376 data_used: 18817024
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:01.514710+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112549888 unmapped: 23511040 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:02.514892+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112549888 unmapped: 23511040 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 heartbeat osd_stat(store_statfs(0x1b3817000/0x0/0x1bfc00000, data 0x81db714/0x8276000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:03.515106+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 104300544 unmapped: 31760384 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 ms_handle_reset con 0x56524655ec00 session 0x5652491ca960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:04.515247+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 104300544 unmapped: 31760384 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247793000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 heartbeat osd_stat(store_statfs(0x1b4577000/0x0/0x1bfc00000, data 0x747f6d1/0x7517000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:05.515447+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105988096 unmapped: 30072832 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1679247 data_alloc: 184549376 data_used: 7581696
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:06.515661+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 106160128 unmapped: 29900800 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:07.515827+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 105947136 unmapped: 30113792 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:08.516563+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 106258432 unmapped: 29802496 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:09.516848+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 106258432 unmapped: 29802496 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.618333817s of 10.177935600s, submitted: 143
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd9000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:10.516987+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 109633536 unmapped: 26427392 heap: 136060928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 heartbeat osd_stat(store_statfs(0x1b3269000/0x0/0x1bfc00000, data 0x878b743/0x8825000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [0,0,0,1,0,0,2])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1934254 data_alloc: 184549376 data_used: 11087872
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 ms_handle_reset con 0x565247fd9000 session 0x5652475a92c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:11.517168+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112877568 unmapped: 27033600 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcdc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 ms_handle_reset con 0x565248bcdc00 session 0x565248f4b860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:12.517320+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 ms_handle_reset con 0x56524652f400 session 0x565247832f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 113033216 unmapped: 26877952 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247792c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 ms_handle_reset con 0x565247792c00 session 0x565247852780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778c000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 ms_handle_reset con 0x56524778c000 session 0x565247e58f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:13.517548+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 113098752 unmapped: 26812416 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 ms_handle_reset con 0x565247793000 session 0x5652481750e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:14.517758+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 113156096 unmapped: 26755072 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 101 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: get_auth_request con 0x5652461a0000 auth_method 0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 101 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c61000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 101 ms_handle_reset con 0x565249c61000 session 0x5652458e21e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:15.517929+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 31490048 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 101 heartbeat osd_stat(store_statfs(0x1b20f5000/0x0/0x1bfc00000, data 0x98fa9da/0x9998000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1692314 data_alloc: 184549376 data_used: 7938048
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:16.518279+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 31490048 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:17.518747+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 31490048 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:18.519024+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 31490048 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:19.519165+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 31490048 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.426567078s of 10.201519012s, submitted: 144
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 101 heartbeat osd_stat(store_statfs(0x1b3d99000/0x0/0x1bfc00000, data 0x79d5968/0x7a71000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:20.519313+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107872256 unmapped: 32038912 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 101 handle_osd_map epochs [101,102], i have 101, src has [1,102]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 102 ms_handle_reset con 0x565248bcc000 session 0x5652474c05a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1509036 data_alloc: 184549376 data_used: 7659520
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:21.519493+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107913216 unmapped: 31997952 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:22.519711+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107913216 unmapped: 31997952 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:23.519865+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107913216 unmapped: 31997952 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:24.520000+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107913216 unmapped: 31997952 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 102 heartbeat osd_stat(store_statfs(0x1b586e000/0x0/0x1bfc00000, data 0x6184a1c/0x621e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:25.520162+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107331584 unmapped: 32579584 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1518340 data_alloc: 184549376 data_used: 8224768
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:26.520343+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107446272 unmapped: 32464896 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 102 heartbeat osd_stat(store_statfs(0x1b5870000/0x0/0x1bfc00000, data 0x6184a1c/0x621e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:27.520575+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107446272 unmapped: 32464896 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 102 heartbeat osd_stat(store_statfs(0x1b586c000/0x0/0x1bfc00000, data 0x6188a1c/0x6222000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:28.520749+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107454464 unmapped: 32456704 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 102 ms_handle_reset con 0x5652476c4800 session 0x5652462170e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:29.520945+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107446272 unmapped: 32464896 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 102 handle_osd_map epochs [102,103], i have 102, src has [1,103]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.756147385s of 10.001887321s, submitted: 90
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 103 heartbeat osd_stat(store_statfs(0x1b586c000/0x0/0x1bfc00000, data 0x6188a1c/0x6222000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:30.521131+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd9000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 103 ms_handle_reset con 0x565247fd9000 session 0x5652478530e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 107487232 unmapped: 32423936 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1642070 data_alloc: 184549376 data_used: 14553088
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:31.521724+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 103 ms_handle_reset con 0x56524655ec00 session 0x565247833c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121053184 unmapped: 18857984 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc8800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:32.521878+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121053184 unmapped: 18857984 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 104 handle_osd_map epochs [104,105], i have 104, src has [1,105]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:33.522045+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 105 ms_handle_reset con 0x565248bc8800 session 0x56524769f4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121184256 unmapped: 18726912 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 105 ms_handle_reset con 0x56524655ec00 session 0x5652475a8000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 105 ms_handle_reset con 0x5652476c4800 session 0x5652476ad2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd9000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 105 ms_handle_reset con 0x565247fd9000 session 0x56524769fa40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:34.522251+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121036800 unmapped: 18874368 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 105 heartbeat osd_stat(store_statfs(0x1b42e9000/0x0/0x1bfc00000, data 0x770412e/0x77a3000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 105 ms_handle_reset con 0x565248bcc000 session 0x5652462172c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:35.522474+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108265472 unmapped: 31645696 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 105 handle_osd_map epochs [105,106], i have 105, src has [1,106]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1493383 data_alloc: 184549376 data_used: 6893568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:36.523067+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108281856 unmapped: 31629312 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:37.523525+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108281856 unmapped: 31629312 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:38.523735+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108281856 unmapped: 31629312 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778fc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x56524778fc00 session 0x5652479a41e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:39.523882+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x56524655ec00 session 0x565248af6b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x5652476c4800 session 0x565248f4be00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd9000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 108331008 unmapped: 31580160 heap: 139911168 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x565247fd9000 session 0x5652475ba5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x565248bcc000 session 0x565248f443c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767d800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x56524767d800 session 0x5652479b8d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767d800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.853602409s of 10.057726860s, submitted: 182
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:40.524159+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111484928 unmapped: 44105728 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x56524767d800 session 0x5652477baf00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 heartbeat osd_stat(store_statfs(0x1b5e14000/0x0/0x1bfc00000, data 0x5bdd18f/0x5c7a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1771041 data_alloc: 184549376 data_used: 9744384
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 heartbeat osd_stat(store_statfs(0x1b3b65000/0x0/0x1bfc00000, data 0x7e8c18f/0x7f29000, compress 0x0/0x0/0x0, omap 0x644, meta 0x416f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:41.524387+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x56524655ec00 session 0x5652481743c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111501312 unmapped: 44089344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x5652476c4800 session 0x565247799680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd9000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x565247fd9000 session 0x5652474c12c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:42.524570+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 ms_handle_reset con 0x565248bcc000 session 0x565247274f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111509504 unmapped: 44081152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247792400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:43.524789+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bce800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111558656 unmapped: 44032000 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 106 handle_osd_map epochs [106,107], i have 106, src has [1,107]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565247792400 session 0x5652477221e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:44.525017+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111665152 unmapped: 43925504 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:45.525195+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112369664 unmapped: 43220992 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1672728 data_alloc: 184549376 data_used: 15224832
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:46.525379+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 40321024 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 heartbeat osd_stat(store_statfs(0x1b4cd0000/0x0/0x1bfc00000, data 0x69193b4/0x69b8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:47.525533+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115359744 unmapped: 40230912 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565248bce800 session 0x565247274b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 heartbeat osd_stat(store_statfs(0x1b4cd0000/0x0/0x1bfc00000, data 0x69193b4/0x69b8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248650800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:48.525721+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111149056 unmapped: 44441600 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565248650800 session 0x5652479a4d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:49.525857+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565248bcc800 session 0x565248af7e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247ee8c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565247ee8c00 session 0x565248af7a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247792400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565247792400 session 0x565248af6d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111157248 unmapped: 44433408 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248650800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565248650800 session 0x5652465b41e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.213258743s of 10.009930611s, submitted: 164
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:50.526019+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565248bcc800 session 0x5652462145a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bce800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 ms_handle_reset con 0x565248bce800 session 0x5652462141e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111869952 unmapped: 43720704 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1492835 data_alloc: 184549376 data_used: 6877184
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:51.526323+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111869952 unmapped: 43720704 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b5fae000/0x0/0x1bfc00000, data 0x56404d9/0x56df000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:52.526510+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111869952 unmapped: 43720704 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:53.526695+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111886336 unmapped: 43704320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:54.526859+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111886336 unmapped: 43704320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:55.527020+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111886336 unmapped: 43704320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b5fae000/0x0/0x1bfc00000, data 0x56404d9/0x56df000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1492835 data_alloc: 184549376 data_used: 6877184
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:56.527233+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 111935488 unmapped: 43655168 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc8400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 ms_handle_reset con 0x565248bc8400 session 0x5652474af680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247792400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:57.527436+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112263168 unmapped: 43327488 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:58.527706+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112263168 unmapped: 43327488 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b5f8b000/0x0/0x1bfc00000, data 0x56644d9/0x5703000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:59.527882+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112263168 unmapped: 43327488 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:00.528028+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112263168 unmapped: 43327488 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1494760 data_alloc: 184549376 data_used: 6877184
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:01.528234+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112263168 unmapped: 43327488 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:02.528448+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112263168 unmapped: 43327488 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:03.528713+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112263168 unmapped: 43327488 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b5f8b000/0x0/0x1bfc00000, data 0x56644d9/0x5703000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:04.528871+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 112271360 unmapped: 43319296 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:05.528997+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115105792 unmapped: 40484864 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1554920 data_alloc: 184549376 data_used: 15314944
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:06.529079+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117227520 unmapped: 38363136 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:07.529242+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117227520 unmapped: 38363136 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:08.529390+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b5f8b000/0x0/0x1bfc00000, data 0x56644d9/0x5703000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117235712 unmapped: 38354944 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:09.529510+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117235712 unmapped: 38354944 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:10.529653+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b5f8b000/0x0/0x1bfc00000, data 0x56644d9/0x5703000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117235712 unmapped: 38354944 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1562280 data_alloc: 184549376 data_used: 16392192
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:11.529851+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117235712 unmapped: 38354944 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:12.530015+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117235712 unmapped: 38354944 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:13.530185+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117235712 unmapped: 38354944 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:14.530337+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 117235712 unmapped: 38354944 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 24.762271881s of 24.951953888s, submitted: 51
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:15.530544+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121602048 unmapped: 33988608 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9d400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1650220 data_alloc: 184549376 data_used: 16867328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b55bd000/0x0/0x1bfc00000, data 0x60244d9/0x60c3000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:16.530686+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bca800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 ms_handle_reset con 0x5652476c4800 session 0x5652477bb2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121724928 unmapped: 33865728 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 ms_handle_reset con 0x565247792400 session 0x5652474ae960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:17.530838+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120406016 unmapped: 35184640 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:18.531017+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120479744 unmapped: 35110912 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:19.531229+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120479744 unmapped: 35110912 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:20.531410+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120668160 unmapped: 34922496 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b55a9000/0x0/0x1bfc00000, data 0x603d4d9/0x60dc000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1660872 data_alloc: 184549376 data_used: 17207296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:21.531599+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120733696 unmapped: 34856960 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:22.531800+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120733696 unmapped: 34856960 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:23.531967+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120758272 unmapped: 34832384 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:24.532102+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b55a9000/0x0/0x1bfc00000, data 0x603d4d9/0x60dc000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120832000 unmapped: 34758656 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:25.532271+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.011450768s of 10.443418503s, submitted: 124
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 ms_handle_reset con 0x565248bca800 session 0x5652479a5860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 ms_handle_reset con 0x565247f9d400 session 0x565247a043c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120840192 unmapped: 34750464 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247ee9c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b55b2000/0x0/0x1bfc00000, data 0x603d4d9/0x60dc000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1379776 data_alloc: 184549376 data_used: 6877184
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:26.532489+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 ms_handle_reset con 0x565247ee9c00 session 0x5652477841e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115212288 unmapped: 40378368 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:27.532706+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115212288 unmapped: 40378368 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:28.532879+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115212288 unmapped: 40378368 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:29.533042+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115212288 unmapped: 40378368 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcb400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:30.533167+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 ms_handle_reset con 0x565248bcb400 session 0x5652465b4b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115654656 unmapped: 39936000 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bccc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:31.533352+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1495357 data_alloc: 184549376 data_used: 6877184
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115744768 unmapped: 39845888 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 ms_handle_reset con 0x565248bccc00 session 0x5652479a45a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b61ae000/0x0/0x1bfc00000, data 0x5442477/0x54e0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:32.533489+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115744768 unmapped: 39845888 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:33.533690+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9d400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 ms_handle_reset con 0x565247f9d400 session 0x565246019c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115744768 unmapped: 39845888 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b61aa000/0x0/0x1bfc00000, data 0x5444a87/0x54e4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:34.533864+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bca800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 ms_handle_reset con 0x565248bca800 session 0x56524543bc20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115744768 unmapped: 39845888 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 ms_handle_reset con 0x565247794c00 session 0x56524543a000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 ms_handle_reset con 0x565247794c00 session 0x5652475ba1e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:35.534053+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.603146553s of 10.066352844s, submitted: 114
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b617f000/0x0/0x1bfc00000, data 0x546ea97/0x550f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9d400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115941376 unmapped: 39649280 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bca800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcb400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:36.534180+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1504426 data_alloc: 184549376 data_used: 6889472
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115982336 unmapped: 39608320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 109 handle_osd_map epochs [109,110], i have 109, src has [1,110]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:37.534303+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 110 ms_handle_reset con 0x565248bcb400 session 0x5652477baf00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115441664 unmapped: 40148992 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:38.534413+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115621888 unmapped: 39968768 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 110 heartbeat osd_stat(store_statfs(0x1b617c000/0x0/0x1bfc00000, data 0x54708eb/0x5511000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:39.534563+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115621888 unmapped: 39968768 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 110 heartbeat osd_stat(store_statfs(0x1b617c000/0x0/0x1bfc00000, data 0x54708eb/0x5511000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:40.534676+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 115630080 unmapped: 39960576 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:41.534786+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1545189 data_alloc: 184549376 data_used: 11882496
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b617c000/0x0/0x1bfc00000, data 0x54708eb/0x5511000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 116752384 unmapped: 38838272 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:42.534889+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 116752384 unmapped: 38838272 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:43.535037+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 116752384 unmapped: 38838272 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6179000/0x0/0x1bfc00000, data 0x54729e1/0x5514000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:44.535181+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 116752384 unmapped: 38838272 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:45.535335+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 116752384 unmapped: 38838272 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:46.535502+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1545189 data_alloc: 184549376 data_used: 11882496
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 116752384 unmapped: 38838272 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:47.535660+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 116752384 unmapped: 38838272 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 12.222372055s of 12.420304298s, submitted: 76
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:48.535803+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124583936 unmapped: 31006720 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:49.535930+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b5556000/0x0/0x1bfc00000, data 0x60909e1/0x6132000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 125304832 unmapped: 30285824 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:50.536094+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:51.536283+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1653933 data_alloc: 184549376 data_used: 13082624
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b5536000/0x0/0x1bfc00000, data 0x60b09e1/0x6152000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:52.536477+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:53.536770+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:54.536921+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b5536000/0x0/0x1bfc00000, data 0x60b09e1/0x6152000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:55.537034+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:56.537200+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1653933 data_alloc: 184549376 data_used: 13082624
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:57.537344+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b5536000/0x0/0x1bfc00000, data 0x60b09e1/0x6152000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:58.537481+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b5536000/0x0/0x1bfc00000, data 0x60b09e1/0x6152000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b5536000/0x0/0x1bfc00000, data 0x60b09e1/0x6152000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:59.537706+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123289600 unmapped: 32301056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.779379845s of 12.132931709s, submitted: 105
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 ms_handle_reset con 0x565247f9d400 session 0x5652474c0780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 ms_handle_reset con 0x565248bca800 session 0x5652478321e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:00.537941+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767dc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 ms_handle_reset con 0x56524767dc00 session 0x565247e585a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:01.538136+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1408734 data_alloc: 184549376 data_used: 6893568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:02.538300+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6f7b000/0x0/0x1bfc00000, data 0x46729d1/0x4713000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:03.538467+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6f7b000/0x0/0x1bfc00000, data 0x46729d1/0x4713000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:04.538604+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6f7b000/0x0/0x1bfc00000, data 0x46729d1/0x4713000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:05.538770+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:06.538976+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1408734 data_alloc: 184549376 data_used: 6893568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:07.539165+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:08.539295+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6f7b000/0x0/0x1bfc00000, data 0x46729d1/0x4713000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:09.539509+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:10.540294+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:11.540999+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1408734 data_alloc: 184549376 data_used: 6893568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:12.541225+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:13.541389+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6f7b000/0x0/0x1bfc00000, data 0x46729d1/0x4713000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:14.541570+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:15.543274+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:16.544275+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1408734 data_alloc: 184549376 data_used: 6893568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118587392 unmapped: 37003264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:17.544554+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 17.189210892s of 17.328433990s, submitted: 42
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c61000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 handle_osd_map epochs [111,112], i have 111, src has [1,112]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 111 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118611968 unmapped: 36978688 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 112 ms_handle_reset con 0x565249c61000 session 0x565247e58f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:18.545575+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118644736 unmapped: 36945920 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:19.546727+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b6f75000/0x0/0x1bfc00000, data 0x46753f1/0x4718000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247926000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:20.546837+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 113 ms_handle_reset con 0x565247926000 session 0x565248174780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118628352 unmapped: 36962304 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:21.547291+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1418660 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118628352 unmapped: 36962304 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:22.547782+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118628352 unmapped: 36962304 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:23.548025+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118628352 unmapped: 36962304 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:24.548192+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118628352 unmapped: 36962304 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:25.548343+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 113 heartbeat osd_stat(store_statfs(0x1b6f73000/0x0/0x1bfc00000, data 0x4676e45/0x471a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 113 handle_osd_map epochs [113,114], i have 113, src has [1,114]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118636544 unmapped: 36954112 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:26.548641+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1420754 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118636544 unmapped: 36954112 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:27.548984+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118636544 unmapped: 36954112 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:28.549151+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118644736 unmapped: 36945920 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:29.549315+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118644736 unmapped: 36945920 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:30.549538+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118644736 unmapped: 36945920 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:31.549677+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1420754 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118644736 unmapped: 36945920 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:32.549872+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118644736 unmapped: 36945920 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:33.550005+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118652928 unmapped: 36937728 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:34.550188+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118652928 unmapped: 36937728 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:35.550471+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118652928 unmapped: 36937728 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:36.550637+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1420754 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:37.550823+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:38.551689+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:39.551832+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:40.552001+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:41.552240+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1420754 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:42.552395+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:43.552584+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118661120 unmapped: 36929536 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:44.552733+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 36921344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:45.552888+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 36921344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:46.553031+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1420754 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 36921344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:47.553185+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 36921344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:48.553364+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 36921344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:49.553503+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 36921344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:50.553675+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 36921344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:51.553899+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1420754 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118669312 unmapped: 36921344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:52.554043+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118677504 unmapped: 36913152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:53.554183+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118677504 unmapped: 36913152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:54.554305+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118677504 unmapped: 36913152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:55.554458+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118677504 unmapped: 36913152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:56.554722+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1420754 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118677504 unmapped: 36913152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:57.554809+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118677504 unmapped: 36913152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:58.555003+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118677504 unmapped: 36913152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:59.555154+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118677504 unmapped: 36913152 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:00.555308+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118693888 unmapped: 36896768 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:01.555491+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1420754 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118693888 unmapped: 36896768 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:02.555693+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118693888 unmapped: 36896768 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:03.555850+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118693888 unmapped: 36896768 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:04.556003+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6f71000/0x0/0x1bfc00000, data 0x4678f3b/0x471d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118693888 unmapped: 36896768 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:05.556139+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118693888 unmapped: 36896768 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:06.556308+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 48.807258606s of 49.083847046s, submitted: 89
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1426218 data_alloc: 184549376 data_used: 6901760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 114 handle_osd_map epochs [114,115], i have 114, src has [1,115]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118726656 unmapped: 36864000 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:07.556470+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118726656 unmapped: 36864000 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:08.556671+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 115 handle_osd_map epochs [115,116], i have 115, src has [1,116]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 116 heartbeat osd_stat(store_statfs(0x1b6f68000/0x0/0x1bfc00000, data 0x467b5d3/0x4726000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118751232 unmapped: 36839424 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 116 ms_handle_reset con 0x56524778f400 session 0x5652478534a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:09.556822+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 116 ms_handle_reset con 0x56524767d000 session 0x565247a05a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcb400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118751232 unmapped: 36839424 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 116 ms_handle_reset con 0x565248bcb400 session 0x565248f450e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 116 heartbeat osd_stat(store_statfs(0x1b6f62000/0x0/0x1bfc00000, data 0x467d816/0x472b000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:10.556980+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118538240 unmapped: 37052416 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:11.557210+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1615034 data_alloc: 184549376 data_used: 6918144
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 117 ms_handle_reset con 0x56524767d000 session 0x5652462161e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118636544 unmapped: 36954112 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:12.557402+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 127041536 unmapped: 28549120 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:13.557602+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 117 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 119848960 unmapped: 35741696 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 118 ms_handle_reset con 0x56524778e800 session 0x5652477ba000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:14.557756+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 118 heartbeat osd_stat(store_statfs(0x1b375c000/0x0/0x1bfc00000, data 0x7e7fe59/0x7f31000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655c400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 118 heartbeat osd_stat(store_statfs(0x1b375a000/0x0/0x1bfc00000, data 0x7e81de4/0x7f33000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 118 handle_osd_map epochs [118,119], i have 118, src has [1,119]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 ms_handle_reset con 0x565247794400 session 0x565248f48f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 ms_handle_reset con 0x565248bc6000 session 0x5652476acd20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118915072 unmapped: 36675584 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:15.557929+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 heartbeat osd_stat(store_statfs(0x1b375a000/0x0/0x1bfc00000, data 0x7e81de4/0x7f33000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 ms_handle_reset con 0x56524655c400 session 0x565248f42b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 118931456 unmapped: 36659200 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:16.558833+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1845004 data_alloc: 184549376 data_used: 6938624
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655c400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.190928459s of 10.355269432s, submitted: 233
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 119 handle_osd_map epochs [119,120], i have 119, src has [1,120]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 120 ms_handle_reset con 0x56524655c400 session 0x5652478330e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 119029760 unmapped: 36560896 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:17.559005+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 120 ms_handle_reset con 0x56524767d000 session 0x565247a043c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 120 handle_osd_map epochs [120,121], i have 120, src has [1,121]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 119119872 unmapped: 36470784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:18.559149+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a02400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 122 ms_handle_reset con 0x565247a02400 session 0x565247a05a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 119218176 unmapped: 36372480 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:19.559258+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247927400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 122 ms_handle_reset con 0x565247927400 session 0x5652475bb2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652474ef800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 122 handle_osd_map epochs [122,123], i have 122, src has [1,123]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 123 ms_handle_reset con 0x5652474ef800 session 0x565248174b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120307712 unmapped: 35282944 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:20.559450+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 123 heartbeat osd_stat(store_statfs(0x1b6f4d000/0x0/0x1bfc00000, data 0x468a72e/0x473e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652474ef800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 124 heartbeat osd_stat(store_statfs(0x1b6f48000/0x0/0x1bfc00000, data 0x468c96c/0x4741000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [0,0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120365056 unmapped: 35225600 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:21.559667+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1481151 data_alloc: 184549376 data_used: 6950912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 124 handle_osd_map epochs [124,125], i have 124, src has [1,125]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 124 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 125 ms_handle_reset con 0x5652474ef800 session 0x565246217c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120373248 unmapped: 35217408 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:22.560216+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 125 heartbeat osd_stat(store_statfs(0x1b6f43000/0x0/0x1bfc00000, data 0x4690d3a/0x4749000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120373248 unmapped: 35217408 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:23.560363+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655f000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 125 ms_handle_reset con 0x56524655f000 session 0x5652474c0960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247792000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 125 handle_osd_map epochs [125,126], i have 125, src has [1,126]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 125 handle_osd_map epochs [126,126], i have 126, src has [1,126]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120446976 unmapped: 35143680 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 126 ms_handle_reset con 0x565247792000 session 0x5652474c12c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:24.560508+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120463360 unmapped: 35127296 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:25.560770+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9d400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 126 handle_osd_map epochs [126,127], i have 126, src has [1,127]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 127 ms_handle_reset con 0x565247f9d400 session 0x5652458e2000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120479744 unmapped: 35110912 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:26.560961+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1489302 data_alloc: 184549376 data_used: 6963200
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.337639809s of 10.147456169s, submitted: 292
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 127 heartbeat osd_stat(store_statfs(0x1b6f3d000/0x0/0x1bfc00000, data 0x4695266/0x4751000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 127 ms_handle_reset con 0x565247794400 session 0x5652479b9e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 128 ms_handle_reset con 0x565247794400 session 0x565247e58780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120569856 unmapped: 35020800 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:27.561362+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120569856 unmapped: 35020800 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:28.561569+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 128 handle_osd_map epochs [128,129], i have 128, src has [1,129]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120586240 unmapped: 35004416 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:29.561720+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 129 ms_handle_reset con 0x5652476c4000 session 0x5652478525a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 129 heartbeat osd_stat(store_statfs(0x1b6f36000/0x0/0x1bfc00000, data 0x4699716/0x4758000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120586240 unmapped: 35004416 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:30.561863+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120594432 unmapped: 34996224 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:31.561999+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1499772 data_alloc: 184549376 data_used: 6983680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcb800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 130 ms_handle_reset con 0x565248bcb800 session 0x5652458e25a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 130 heartbeat osd_stat(store_statfs(0x1b6f32000/0x0/0x1bfc00000, data 0x469b82c/0x475b000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:32.562188+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120619008 unmapped: 34971648 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 131 ms_handle_reset con 0x565248bcc000 session 0x565248af7860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:33.562325+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120635392 unmapped: 34955264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:34.562498+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120635392 unmapped: 34955264 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 131 ms_handle_reset con 0x56524655d000 session 0x5652475a8000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 131 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 132 ms_handle_reset con 0x56524655d000 session 0x565247833860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:35.562689+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120717312 unmapped: 34873344 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 132 ms_handle_reset con 0x5652476c4000 session 0x5652476adc20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:36.562881+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120733696 unmapped: 34856960 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 132 ms_handle_reset con 0x565247794400 session 0x5652476ad860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1502878 data_alloc: 184549376 data_used: 6983680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:37.563049+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120766464 unmapped: 34824192 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 132 heartbeat osd_stat(store_statfs(0x1b6f2d000/0x0/0x1bfc00000, data 0x469fc40/0x4761000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:38.563243+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120766464 unmapped: 34824192 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 132 heartbeat osd_stat(store_statfs(0x1b6f2d000/0x0/0x1bfc00000, data 0x469fc40/0x4761000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:39.563392+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120766464 unmapped: 34824192 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:40.563566+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120782848 unmapped: 34807808 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 13.453918457s of 13.884525299s, submitted: 157
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:41.563667+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120807424 unmapped: 34783232 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1506172 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:42.563971+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120807424 unmapped: 34783232 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6f2a000/0x0/0x1bfc00000, data 0x46a1d56/0x4764000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:43.564190+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120807424 unmapped: 34783232 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6f2a000/0x0/0x1bfc00000, data 0x46a1d56/0x4764000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:44.564410+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120807424 unmapped: 34783232 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.2 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2705 syncs, 3.78 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4296 writes, 14K keys, 4296 commit groups, 1.0 writes per commit group, ingest: 13.86 MB, 0.02 MB/s
                                                          Interval WAL: 4296 writes, 1841 syncs, 2.33 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:45.564657+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120815616 unmapped: 34775040 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:46.564862+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120815616 unmapped: 34775040 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1506172 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:47.565022+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120815616 unmapped: 34775040 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:48.565167+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120815616 unmapped: 34775040 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6f29000/0x0/0x1bfc00000, data 0x46a1d66/0x4765000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:49.565340+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120815616 unmapped: 34775040 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:50.565507+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120815616 unmapped: 34775040 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a02400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 ms_handle_reset con 0x565247a02400 session 0x565248f45860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247791400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.240589142s of 10.294438362s, submitted: 25
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 ms_handle_reset con 0x565247791400 session 0x565248f452c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:51.565777+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120840192 unmapped: 34750464 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1511260 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:52.565985+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120840192 unmapped: 34750464 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6f28000/0x0/0x1bfc00000, data 0x46a1dc9/0x4766000, compress 0x0/0x0/0x0, omap 0x644, meta 0x456f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:53.566237+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767c400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120840192 unmapped: 34750464 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 ms_handle_reset con 0x56524767c400 session 0x565248f454a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 ms_handle_reset con 0x565247f94800 session 0x565247a05680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:54.566418+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120840192 unmapped: 34750464 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:55.566791+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120840192 unmapped: 34750464 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:56.566996+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120840192 unmapped: 34750464 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1512176 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:57.567185+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120840192 unmapped: 34750464 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1db8/0x4765000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247793000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 ms_handle_reset con 0x565247793000 session 0x565247a04f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:58.567361+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120889344 unmapped: 34701312 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:59.567513+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120889344 unmapped: 34701312 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:00.567696+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120889344 unmapped: 34701312 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:01.567909+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120889344 unmapped: 34701312 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1510862 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1d56/0x4764000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:02.568057+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120889344 unmapped: 34701312 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:03.568212+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120889344 unmapped: 34701312 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a6f000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 12.236142159s of 12.487181664s, submitted: 56
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 ms_handle_reset con 0x565247a6f000 session 0x5652474c0f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:04.568346+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120913920 unmapped: 34676736 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:05.568478+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120913920 unmapped: 34676736 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:06.568708+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120913920 unmapped: 34676736 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1510862 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1d56/0x4764000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:07.568860+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120913920 unmapped: 34676736 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:08.569027+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120913920 unmapped: 34676736 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:09.569181+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120913920 unmapped: 34676736 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:10.569321+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120913920 unmapped: 34676736 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:11.569473+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120913920 unmapped: 34676736 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1510862 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1d56/0x4764000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:12.569636+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120922112 unmapped: 34668544 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:13.569817+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120922112 unmapped: 34668544 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:14.569988+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 120922112 unmapped: 34668544 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 44
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:15.570168+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1d56/0x4764000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:16.570348+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1510862 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:17.600312+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:18.600478+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:19.600795+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1d56/0x4764000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:20.600953+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 17.437656403s of 17.475158691s, submitted: 10
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 45
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:21.601331+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1511577 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:22.601517+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1e20/0x4765000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:23.601710+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:24.601855+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1e20/0x4765000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:25.602016+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:26.602187+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1511577 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1e20/0x4765000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:27.602362+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:28.602566+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121118720 unmapped: 34471936 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:29.602713+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 heartbeat osd_stat(store_statfs(0x1b6b29000/0x0/0x1bfc00000, data 0x46a1e20/0x4765000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121118720 unmapped: 34471936 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:30.602858+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121118720 unmapped: 34471936 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767dc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.034205437s of 10.064682961s, submitted: 8
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 ms_handle_reset con 0x56524767dc00 session 0x56524769e5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:31.603068+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121118720 unmapped: 34471936 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1516592 data_alloc: 184549376 data_used: 6995968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767dc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:32.603298+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121135104 unmapped: 34455552 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248651800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:33.603504+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 135 ms_handle_reset con 0x56524767dc00 session 0x565248f45a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 135 ms_handle_reset con 0x565248651800 session 0x5652476abc20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121159680 unmapped: 34430976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd8800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:34.603678+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121200640 unmapped: 34390016 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 136 ms_handle_reset con 0x565247fd8800 session 0x5652458e2b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a54000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 136 ms_handle_reset con 0x565248bc6000 session 0x565247a04d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 136 ms_handle_reset con 0x565247a54000 session 0x5652476aaf00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767dc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 136 ms_handle_reset con 0x56524767dc00 session 0x565247e59e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 136 heartbeat osd_stat(store_statfs(0x1b6b15000/0x0/0x1bfc00000, data 0x46a9c6e/0x4777000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:35.603847+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121036800 unmapped: 34553856 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd8800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:36.604064+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121044992 unmapped: 34545664 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248651800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 137 ms_handle_reset con 0x565247fd8800 session 0x56524769f4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 137 ms_handle_reset con 0x565248651800 session 0x56524543a3c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 137 heartbeat osd_stat(store_statfs(0x1b6b0e000/0x0/0x1bfc00000, data 0x46ac85d/0x477f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1555037 data_alloc: 184549376 data_used: 7028736
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 137 ms_handle_reset con 0x565248bc6000 session 0x5652476ab860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:37.604189+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121102336 unmapped: 34488320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655cc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 137 ms_handle_reset con 0x56524655cc00 session 0x5652476aa1e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655cc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 138 ms_handle_reset con 0x56524655cc00 session 0x565247833860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:38.604345+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121151488 unmapped: 34439168 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767dc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 139 ms_handle_reset con 0x56524767dc00 session 0x5652476ab4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd8800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 139 heartbeat osd_stat(store_statfs(0x1b6b06000/0x0/0x1bfc00000, data 0x46afae8/0x4786000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:39.604462+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 139 ms_handle_reset con 0x565247fd8800 session 0x5652479b8b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121208832 unmapped: 34381824 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:40.604666+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 121208832 unmapped: 34381824 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.278358459s of 10.005006790s, submitted: 184
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:41.604835+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122273792 unmapped: 33316864 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1568652 data_alloc: 184549376 data_used: 7020544
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:42.604991+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122281984 unmapped: 33308672 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 140 heartbeat osd_stat(store_statfs(0x1b6b02000/0x0/0x1bfc00000, data 0x46b385f/0x478b000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247927400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 140 ms_handle_reset con 0x565247927400 session 0x565247723860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:43.605139+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122331136 unmapped: 33259520 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248650800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:44.605287+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 142 handle_osd_map epochs [141,142], i have 142, src has [1,142]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 142 ms_handle_reset con 0x565248650800 session 0x5652465b4780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 142 ms_handle_reset con 0x56524652f400 session 0x5652477221e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122363904 unmapped: 33226752 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655cc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:45.605447+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122404864 unmapped: 33185792 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 143 ms_handle_reset con 0x56524655cc00 session 0x5652458e2000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767dc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 144 ms_handle_reset con 0x56524767dc00 session 0x565246217c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:46.605642+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122470400 unmapped: 33120256 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1578504 data_alloc: 184549376 data_used: 7053312
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:47.605804+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 145 heartbeat osd_stat(store_statfs(0x1b6afa000/0x0/0x1bfc00000, data 0x46b9b98/0x4790000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122470400 unmapped: 33120256 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:48.605984+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122470400 unmapped: 33120256 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:49.606142+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122470400 unmapped: 33120256 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:50.606279+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122396672 unmapped: 33193984 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655c800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.795162201s of 10.007409096s, submitted: 379
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 146 ms_handle_reset con 0x56524767d000 session 0x56524778ad20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bce800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:51.606415+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122421248 unmapped: 33169408 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1587203 data_alloc: 184549376 data_used: 7077888
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:52.606534+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 147 ms_handle_reset con 0x565248bce800 session 0x565248aed4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122429440 unmapped: 33161216 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:53.606718+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 147 heartbeat osd_stat(store_statfs(0x1b6af1000/0x0/0x1bfc00000, data 0x46c032e/0x479b000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcf000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122437632 unmapped: 33153024 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 148 ms_handle_reset con 0x565248bcf000 session 0x565248aec5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcd800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:54.606865+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 149 ms_handle_reset con 0x565248bcd800 session 0x565248aede00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcac00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 149 ms_handle_reset con 0x565248bcac00 session 0x565248aec000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122519552 unmapped: 33071104 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 149 heartbeat osd_stat(store_statfs(0x1b6aed000/0x0/0x1bfc00000, data 0x46c4759/0x47a0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:55.606992+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 ms_handle_reset con 0x565248bcc800 session 0x5652458e1c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122486784 unmapped: 33103872 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 ms_handle_reset con 0x565248bc6c00 session 0x565248bc2780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:56.608107+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 ms_handle_reset con 0x565248bc6c00 session 0x565248bc23c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcac00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 ms_handle_reset con 0x565248bcac00 session 0x565248bc2960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122535936 unmapped: 33054720 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 ms_handle_reset con 0x565248bcc800 session 0x56524543a3c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1593972 data_alloc: 184549376 data_used: 7086080
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:57.608271+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122560512 unmapped: 33030144 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:58.608420+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 heartbeat osd_stat(store_statfs(0x1b6aeb000/0x0/0x1bfc00000, data 0x46c6acc/0x47a3000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 122560512 unmapped: 33030144 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:59.608532+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 ms_handle_reset con 0x565247794400 session 0x5652458dcf00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 31973376 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 heartbeat osd_stat(store_statfs(0x1b6aea000/0x0/0x1bfc00000, data 0x46c6adc/0x47a4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:00.608705+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 31973376 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 150 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.344982147s of 10.001677513s, submitted: 219
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 151 ms_handle_reset con 0x565248bcc000 session 0x5652476aba40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:01.608893+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 31973376 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcc000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 151 heartbeat osd_stat(store_statfs(0x1b6ae5000/0x0/0x1bfc00000, data 0x46c8e26/0x47a8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 151 ms_handle_reset con 0x565248bcc000 session 0x565247852b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1601722 data_alloc: 184549376 data_used: 7102464
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:02.608993+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 31940608 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 152 ms_handle_reset con 0x565247794400 session 0x565248aede00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:03.609202+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 31907840 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:04.609354+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247792000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 152 ms_handle_reset con 0x565247792000 session 0x5652465b4780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c61400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123715584 unmapped: 31875072 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 152 ms_handle_reset con 0x565249c61400 session 0x565247274b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 46
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:05.609488+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123953152 unmapped: 31637504 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:06.609678+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 123953152 unmapped: 31637504 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c61800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1610677 data_alloc: 184549376 data_used: 7102464
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:07.609862+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 31588352 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b6adf000/0x0/0x1bfc00000, data 0x46cb40f/0x47af000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:08.610009+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 31588352 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b6adf000/0x0/0x1bfc00000, data 0x46cb40f/0x47af000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:09.610163+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 31588352 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:10.610494+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 31588352 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:11.610697+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.062295914s of 10.254380226s, submitted: 72
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 31588352 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1612593 data_alloc: 184549376 data_used: 7110656
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:12.610872+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 31588352 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6add000/0x0/0x1bfc00000, data 0x46cd608/0x47b1000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:13.611030+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:14.611188+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:15.611327+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:16.611467+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ade000/0x0/0x1bfc00000, data 0x46cd5d2/0x47b0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ade000/0x0/0x1bfc00000, data 0x46cd5d2/0x47b0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1611919 data_alloc: 184549376 data_used: 7110656
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:17.611674+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ade000/0x0/0x1bfc00000, data 0x46cd5d2/0x47b0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:18.611808+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:19.611970+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:20.612116+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:21.612322+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1611919 data_alloc: 184549376 data_used: 7110656
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:22.612490+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ade000/0x0/0x1bfc00000, data 0x46cd5d2/0x47b0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:23.612675+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:24.613212+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ade000/0x0/0x1bfc00000, data 0x46cd5d2/0x47b0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:25.613827+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:26.613996+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1611919 data_alloc: 184549376 data_used: 7110656
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:27.614216+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ade000/0x0/0x1bfc00000, data 0x46cd5d2/0x47b0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:28.614406+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124010496 unmapped: 31580160 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 17.715339661s of 17.748870850s, submitted: 25
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:29.614605+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124051456 unmapped: 31539200 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:30.614823+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124059648 unmapped: 31531008 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:31.614976+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ade000/0x0/0x1bfc00000, data 0x46cd63b/0x47b0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124059648 unmapped: 31531008 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ade000/0x0/0x1bfc00000, data 0x46cd63b/0x47b0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1613125 data_alloc: 184549376 data_used: 7114752
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:32.615139+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124067840 unmapped: 31522816 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:33.615333+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ae0000/0x0/0x1bfc00000, data 0x46cd5cf/0x47ae000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124076032 unmapped: 31514624 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 heartbeat osd_stat(store_statfs(0x1b6ae0000/0x0/0x1bfc00000, data 0x46cd5cf/0x47ae000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:34.615519+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124076032 unmapped: 31514624 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:35.615740+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124076032 unmapped: 31514624 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:36.615951+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124108800 unmapped: 31481856 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1611425 data_alloc: 184549376 data_used: 7110656
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:37.616134+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 154 heartbeat osd_stat(store_statfs(0x1b6adb000/0x0/0x1bfc00000, data 0x46cf912/0x47b2000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124133376 unmapped: 31457280 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:38.616301+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124133376 unmapped: 31457280 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:39.616386+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 154 heartbeat osd_stat(store_statfs(0x1b6adc000/0x0/0x1bfc00000, data 0x46cf877/0x47b1000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124133376 unmapped: 31457280 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:40.616532+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.022494316s of 11.344523430s, submitted: 50
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124133376 unmapped: 31457280 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:41.616707+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcd800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 154 ms_handle_reset con 0x565248bcd800 session 0x565248af65a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124157952 unmapped: 31432704 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1621785 data_alloc: 184549376 data_used: 7118848
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:42.616859+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124157952 unmapped: 31432704 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 154 heartbeat osd_stat(store_statfs(0x1b6ada000/0x0/0x1bfc00000, data 0x46cf8f9/0x47b4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:43.617056+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124157952 unmapped: 31432704 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 154 heartbeat osd_stat(store_statfs(0x1b6ad8000/0x0/0x1bfc00000, data 0x46cf96c/0x47b6000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:44.617258+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124157952 unmapped: 31432704 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 154 ms_handle_reset con 0x565248bc6000 session 0x565248f425a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:45.617399+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124166144 unmapped: 31424512 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc9800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x565248bc9800 session 0x565248f434a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:46.617562+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc7000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x565248bc7000 session 0x565248f42000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124174336 unmapped: 31416320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1629370 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:47.617718+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124174336 unmapped: 31416320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b6ad4000/0x0/0x1bfc00000, data 0x46d19ef/0x47b7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:48.617873+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124174336 unmapped: 31416320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:49.618043+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124174336 unmapped: 31416320 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x565247f94800 session 0x565248f43a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:50.618208+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x5652476c4000 session 0x565248f42d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124231680 unmapped: 31358976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.415409088s of 10.683913231s, submitted: 76
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:51.618402+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b6ad9000/0x0/0x1bfc00000, data 0x46d197d/0x47b5000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124231680 unmapped: 31358976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1623118 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:52.618555+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124231680 unmapped: 31358976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:53.618773+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124231680 unmapped: 31358976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:54.618920+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124231680 unmapped: 31358976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:55.619045+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124231680 unmapped: 31358976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:56.619163+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b6ada000/0x0/0x1bfc00000, data 0x46d196d/0x47b4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:57.619313+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1624886 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:58.619476+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:59.619704+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:00.619858+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124272640 unmapped: 31318016 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:01.620055+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b6ad6000/0x0/0x1bfc00000, data 0x46d1b30/0x47b7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124280832 unmapped: 31309824 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:02.620186+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1628374 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124280832 unmapped: 31309824 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:03.620333+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b6ad6000/0x0/0x1bfc00000, data 0x46d1b30/0x47b7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 12.447432518s of 12.542370796s, submitted: 23
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124289024 unmapped: 31301632 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:04.620475+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124289024 unmapped: 31301632 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:05.620651+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124289024 unmapped: 31301632 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:06.620779+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b6ad6000/0x0/0x1bfc00000, data 0x46d1aff/0x47b7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124289024 unmapped: 31301632 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:07.620922+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1630094 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x56524778e800 session 0x5652475a9680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124338176 unmapped: 31252480 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:08.621097+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b6ad3000/0x0/0x1bfc00000, data 0x46d1c57/0x47b9000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124338176 unmapped: 31252480 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 47
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:09.621256+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247795800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x565247795800 session 0x565248f452c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247926000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124125184 unmapped: 31465472 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x565247926000 session 0x565248f45a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc8000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b6ad6000/0x0/0x1bfc00000, data 0x46d1bf5/0x47b8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x496f9bc), peers [0,1,2,4,5] op hist [0,0,0,1,2])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:10.621449+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x565248bc8000 session 0x565248f483c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124321792 unmapped: 31268864 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:11.621623+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124321792 unmapped: 31268864 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:12.621776+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655f000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1640241 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x56524655f000 session 0x565248f49a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124354560 unmapped: 31236096 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:13.621958+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b5936000/0x0/0x1bfc00000, data 0x46d1b63/0x47b7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5b0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124354560 unmapped: 31236096 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.799915314s of 10.263329506s, submitted: 102
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:14.622119+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124379136 unmapped: 31211520 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:15.667161+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124231680 unmapped: 31358976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:16.667356+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124231680 unmapped: 31358976 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:17.667549+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1641166 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:18.667741+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b5936000/0x0/0x1bfc00000, data 0x46d1bc9/0x47b7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5b0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:19.667913+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:20.668076+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:21.668274+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 31350784 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:22.668428+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1640382 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 31342592 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b5936000/0x0/0x1bfc00000, data 0x46d1c96/0x47b8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5b0f9bc), peers [0,1,2,4,5] op hist [0,0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:23.668583+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 31342592 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.116040230s of 10.218334198s, submitted: 22
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:24.668805+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 31334400 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:25.668983+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 31334400 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:26.669130+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 31334400 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:27.669369+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1643052 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124280832 unmapped: 31309824 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:28.669664+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124280832 unmapped: 31309824 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b5935000/0x0/0x1bfc00000, data 0x46d1d92/0x47b7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5b0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:29.669836+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124280832 unmapped: 31309824 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:30.670573+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b5934000/0x0/0x1bfc00000, data 0x46d1e2d/0x47b8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5b0f9bc), peers [0,1,2,4,5] op hist [0,0,1,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124313600 unmapped: 31277056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:31.671232+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124313600 unmapped: 31277056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:32.672811+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1644082 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124313600 unmapped: 31277056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:33.672982+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247792000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x565247792000 session 0x565248f49e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124313600 unmapped: 31277056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.865417480s of 10.001346588s, submitted: 31
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:34.673148+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b5934000/0x0/0x1bfc00000, data 0x46d1e3a/0x47b9000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5b0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124313600 unmapped: 31277056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:35.673375+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 124313600 unmapped: 31277056 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x56524778e800 session 0x565248f49860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:36.673552+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 ms_handle_reset con 0x565249c61800 session 0x56524778a3c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c63800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138641408 unmapped: 16949248 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:37.673804+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1707340 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135569408 unmapped: 20021248 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:38.674566+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 127205376 unmapped: 28385280 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 48
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:39.674769+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135806976 unmapped: 19783680 heap: 155590656 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x6ed1e6e/0x6fba000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:40.675040+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 127614976 unmapped: 36372480 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:41.675241+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 127369216 unmapped: 36618240 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:42.675450+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2367836 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135815168 unmapped: 28172288 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:43.675705+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135962624 unmapped: 28024832 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.096475601s of 10.038651466s, submitted: 384
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:44.675856+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1ac532000/0x0/0x1bfc00000, data 0xd6d1f9e/0xd7bb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 127688704 unmapped: 36298752 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:45.676525+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 136151040 unmapped: 27836416 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:46.676727+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 127950848 unmapped: 36036608 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:47.676979+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2990914 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 128122880 unmapped: 35864576 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:48.677194+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 129253376 unmapped: 34734080 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:49.677396+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 137781248 unmapped: 26206208 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1a7d33000/0x0/0x1bfc00000, data 0x11ed203a/0x11fbb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:50.677544+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1a7d33000/0x0/0x1bfc00000, data 0x11ed203a/0x11fbb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 129458176 unmapped: 34529280 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:51.677718+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1a6532000/0x0/0x1bfc00000, data 0x136d200a/0x137bb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 137961472 unmapped: 26025984 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:52.677878+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3465146 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1a4d32000/0x0/0x1bfc00000, data 0x14ed200a/0x14fbb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 129761280 unmapped: 34226176 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:53.678081+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 130932736 unmapped: 33054720 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.044265747s of 10.025178909s, submitted: 78
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:54.678279+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 131055616 unmapped: 32931840 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:55.678465+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1a1531000/0x0/0x1bfc00000, data 0x186d20a5/0x187bb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 131178496 unmapped: 32808960 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:56.678661+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x1a0531000/0x0/0x1bfc00000, data 0x196d20b5/0x197bb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 131309568 unmapped: 32677888 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:57.678826+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3960354 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 139714560 unmapped: 24272896 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:58.678977+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 131448832 unmapped: 32538624 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:59.679137+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x19ed32000/0x0/0x1bfc00000, data 0x1aed2080/0x1afbb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140075008 unmapped: 23912448 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:00.679255+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 131809280 unmapped: 32178176 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:01.679426+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 heartbeat osd_stat(store_statfs(0x19d532000/0x0/0x1bfc00000, data 0x1c6d211b/0x1c7bc000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 132931584 unmapped: 31055872 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:02.679593+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4400894 data_alloc: 184549376 data_used: 7131136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 155 handle_osd_map epochs [155,156], i have 155, src has [1,156]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140443648 unmapped: 23543808 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:03.679790+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141713408 unmapped: 22274048 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.640242577s of 10.018783569s, submitted: 113
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:04.679945+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 133496832 unmapped: 30490624 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:05.680104+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 156 heartbeat osd_stat(store_statfs(0x19852e000/0x0/0x1bfc00000, data 0x216d442f/0x217bf000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141959168 unmapped: 22028288 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:06.680352+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 142106624 unmapped: 21880832 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 156 heartbeat osd_stat(store_statfs(0x19752f000/0x0/0x1bfc00000, data 0x226d442f/0x227bf000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:07.680494+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 5010462 data_alloc: 184549376 data_used: 7139328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 156 heartbeat osd_stat(store_statfs(0x196d2d000/0x0/0x1bfc00000, data 0x22ed4565/0x22fc1000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 133865472 unmapped: 30121984 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:08.680683+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 142622720 unmapped: 21364736 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:09.680864+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 156 heartbeat osd_stat(store_statfs(0x195d2c000/0x0/0x1bfc00000, data 0x23ed45fc/0x23fc1000, compress 0x0/0x0/0x0, omap 0x644, meta 0x5f0f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134242304 unmapped: 29745152 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:10.681035+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 156 handle_osd_map epochs [156,157], i have 156, src has [1,157]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134406144 unmapped: 29581312 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:11.681215+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 143024128 unmapped: 20963328 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:12.681378+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 5585832 data_alloc: 184549376 data_used: 7151616
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134815744 unmapped: 29171712 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:13.681591+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 158 heartbeat osd_stat(store_statfs(0x192d05000/0x0/0x1bfc00000, data 0x27ed8969/0x27fc7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134987776 unmapped: 28999680 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.663516998s of 10.072677612s, submitted: 134
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:14.681785+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/557616291' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 158 heartbeat osd_stat(store_statfs(0x191505000/0x0/0x1bfc00000, data 0x296d8969/0x297c7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 143523840 unmapped: 20463616 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:15.681975+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 158 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135241728 unmapped: 28745728 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 159 heartbeat osd_stat(store_statfs(0x18fd04000/0x0/0x1bfc00000, data 0x2aed8a04/0x2afc8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:16.682098+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 159 heartbeat osd_stat(store_statfs(0x18fd00000/0x0/0x1bfc00000, data 0x2aedabc7/0x2afcc000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135282688 unmapped: 28704768 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 160 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:17.682248+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 5905372 data_alloc: 184549376 data_used: 7163904
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 160 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 136364032 unmapped: 27623424 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:18.682393+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 161 ms_handle_reset con 0x565249c63800 session 0x565247597a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 137453568 unmapped: 26533888 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:19.682581+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f95800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 137461760 unmapped: 26525696 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:20.682759+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 162 ms_handle_reset con 0x565247f95800 session 0x5652458dcd20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 162 handle_osd_map epochs [162,163], i have 162, src has [1,163]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134971392 unmapped: 29016064 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:21.682973+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f95800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 163 ms_handle_reset con 0x565247f95800 session 0x5652491ca960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcd000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 163 ms_handle_reset con 0x565248bcd000 session 0x565247e59e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 163 heartbeat osd_stat(store_statfs(0x1b64f7000/0x0/0x1bfc00000, data 0x46e32ab/0x47d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134832128 unmapped: 29155328 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:22.683130+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 163 ms_handle_reset con 0x56524655d000 session 0x5652472750e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1782508 data_alloc: 184549376 data_used: 7168000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134848512 unmapped: 29138944 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:23.683304+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcb800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 163 ms_handle_reset con 0x565248bcb800 session 0x56524543a000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134905856 unmapped: 29081600 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:24.683523+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.635607719s of 10.787071228s, submitted: 322
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 163 ms_handle_reset con 0x56524767d000 session 0x565247e58000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134922240 unmapped: 29065216 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:25.683690+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134922240 unmapped: 29065216 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:26.683855+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 164 ms_handle_reset con 0x56524767d000 session 0x565247832d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134946816 unmapped: 29040640 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:27.684012+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 164 heartbeat osd_stat(store_statfs(0x1b64f3000/0x0/0x1bfc00000, data 0x46e552c/0x47da000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 164 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1790043 data_alloc: 184549376 data_used: 7180288
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134660096 unmapped: 29327360 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:28.684197+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134660096 unmapped: 29327360 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:29.684378+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134684672 unmapped: 29302784 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:30.684579+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bca800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 165 ms_handle_reset con 0x565248bca800 session 0x5652491cb2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134684672 unmapped: 29302784 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:31.684831+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134684672 unmapped: 29302784 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:32.685112+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1793069 data_alloc: 184549376 data_used: 7180288
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 165 heartbeat osd_stat(store_statfs(0x1b64f0000/0x0/0x1bfc00000, data 0x46e7805/0x47dd000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134709248 unmapped: 29278208 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:33.685310+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 165 ms_handle_reset con 0x565247f94800 session 0x5652458e10e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247794c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 165 ms_handle_reset con 0x565247794c00 session 0x5652476abe00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134725632 unmapped: 29261824 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:34.685479+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9d400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134733824 unmapped: 29253632 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:35.685750+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 165 ms_handle_reset con 0x565247f9d400 session 0x5652476abc20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9d400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.434343338s of 10.874414444s, submitted: 136
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 165 handle_osd_map epochs [165,166], i have 165, src has [1,166]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 166 heartbeat osd_stat(store_statfs(0x1b64ef000/0x0/0x1bfc00000, data 0x46e7879/0x47de000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 166 ms_handle_reset con 0x565247f9d400 session 0x565248f45c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134758400 unmapped: 29229056 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:36.685889+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134758400 unmapped: 29229056 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:37.686046+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1799617 data_alloc: 184549376 data_used: 7192576
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134766592 unmapped: 29220864 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:38.686205+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247792800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 167 handle_osd_map epochs [167,168], i have 167, src has [1,168]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 49
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134799360 unmapped: 29188096 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:39.686390+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 168 handle_osd_map epochs [168,169], i have 168, src has [1,169]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134807552 unmapped: 29179904 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:40.686516+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 169 ms_handle_reset con 0x565248bc6c00 session 0x565247796d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 169 handle_osd_map epochs [169,170], i have 169, src has [1,170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 169 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134832128 unmapped: 29155328 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:41.686724+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247793400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc9c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 170 ms_handle_reset con 0x565248bc9c00 session 0x5652458e1c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 170 ms_handle_reset con 0x56524652e800 session 0x565248aede00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 170 heartbeat osd_stat(store_statfs(0x1b64d9000/0x0/0x1bfc00000, data 0x46f242d/0x47f2000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134873088 unmapped: 29114368 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:42.686905+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 170 handle_osd_map epochs [170,171], i have 170, src has [1,171]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 171 ms_handle_reset con 0x565247793400 session 0x5652478681e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1824443 data_alloc: 184549376 data_used: 7200768
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 171 ms_handle_reset con 0x565247792800 session 0x565246018b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:43.687084+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134897664 unmapped: 29089792 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247926000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:44.687197+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 171 ms_handle_reset con 0x565247926000 session 0x565248af74a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c63000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134897664 unmapped: 29089792 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 171 handle_osd_map epochs [171,172], i have 171, src has [1,172]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 172 ms_handle_reset con 0x565249c63000 session 0x5652491ca1e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 172 ms_handle_reset con 0x56524778f400 session 0x5652475bb0e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:45.687428+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 134995968 unmapped: 28991488 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 50
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:46.687600+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135004160 unmapped: 28983296 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.990988731s of 10.710959435s, submitted: 248
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:47.687815+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135020544 unmapped: 28966912 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 173 heartbeat osd_stat(store_statfs(0x1b64d3000/0x0/0x1bfc00000, data 0x46f6bf0/0x47f7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 173 handle_osd_map epochs [173,174], i have 173, src has [1,174]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 174 ms_handle_reset con 0x56524652f400 session 0x565248aec780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f95800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1831681 data_alloc: 184549376 data_used: 7200768
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.59206 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.49467 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 174 ms_handle_reset con 0x565247f95800 session 0x56524543b4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1410707360' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 174 heartbeat osd_stat(store_statfs(0x1b64cd000/0x0/0x1bfc00000, data 0x46fafda/0x47ff000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3639373356' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.32:0/3639373356' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:48.687959+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.69635 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135749632 unmapped: 28237824 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1015257701' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:49.688153+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.59224 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135749632 unmapped: 28237824 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2446013239' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.49491 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:50.688345+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.69656 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135757824 unmapped: 28229632 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 174 handle_osd_map epochs [174,175], i have 174, src has [1,175]
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2379843221' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/625052886' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.59245 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:51.688565+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135766016 unmapped: 28221440 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3504013880' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 175 heartbeat osd_stat(store_statfs(0x1b64ce000/0x0/0x1bfc00000, data 0x46fafa6/0x47ff000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:52.688765+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135766016 unmapped: 28221440 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1834990 data_alloc: 184549376 data_used: 7213056
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 175 heartbeat osd_stat(store_statfs(0x1b64c9000/0x0/0x1bfc00000, data 0x46fe56e/0x4805000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:53.688984+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135766016 unmapped: 28221440 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:54.689143+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135774208 unmapped: 28213248 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:55.689372+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135921664 unmapped: 28065792 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:56.689581+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135929856 unmapped: 28057600 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.750160217s of 10.069472313s, submitted: 141
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:57.689755+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135938048 unmapped: 28049408 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1846922 data_alloc: 184549376 data_used: 7225344
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:58.689962+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135888896 unmapped: 28098560 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 176 heartbeat osd_stat(store_statfs(0x1b64a5000/0x0/0x1bfc00000, data 0x471ecc8/0x4826000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:59.690136+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135888896 unmapped: 28098560 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 176 heartbeat osd_stat(store_statfs(0x1b64a5000/0x0/0x1bfc00000, data 0x471ecc8/0x4826000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:00.690290+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135888896 unmapped: 28098560 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:01.690510+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 135987200 unmapped: 28000256 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:02.690678+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 176 heartbeat osd_stat(store_statfs(0x1b6482000/0x0/0x1bfc00000, data 0x4743f33/0x484b000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 137035776 unmapped: 26951680 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1849792 data_alloc: 184549376 data_used: 7225344
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:03.690840+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 137035776 unmapped: 26951680 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:04.690990+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 136306688 unmapped: 27680768 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:05.691158+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 136306688 unmapped: 27680768 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:06.691316+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 136306688 unmapped: 27680768 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.883369446s of 10.053344727s, submitted: 46
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 176 heartbeat osd_stat(store_statfs(0x1b645c000/0x0/0x1bfc00000, data 0x47687d4/0x4871000, compress 0x0/0x0/0x0, omap 0x644, meta 0x4f2f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:07.691436+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 136355840 unmapped: 27631616 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1881093 data_alloc: 184549376 data_used: 7225344
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 176 ms_handle_reset con 0x565248bcec00 session 0x565248cb7c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:08.691536+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 136921088 unmapped: 27066368 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:09.691689+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 136921088 unmapped: 27066368 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:10.691816+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 176 ms_handle_reset con 0x565247f94c00 session 0x565248af6f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 139083776 unmapped: 24903680 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:11.692012+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 139108352 unmapped: 24879104 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 177 heartbeat osd_stat(store_statfs(0x1b4058000/0x0/0x1bfc00000, data 0x55ca2c5/0x56d4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:12.692146+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 139108352 unmapped: 24879104 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1969843 data_alloc: 184549376 data_used: 7233536
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:13.692295+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138895360 unmapped: 25092096 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 177 heartbeat osd_stat(store_statfs(0x1b4057000/0x0/0x1bfc00000, data 0x55cc530/0x56d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:14.692430+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138928128 unmapped: 25059328 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 177 heartbeat osd_stat(store_statfs(0x1b403f000/0x0/0x1bfc00000, data 0x55e3789/0x56ee000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:15.692666+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 177 handle_osd_map epochs [177,178], i have 177, src has [1,178]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138936320 unmapped: 25051136 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249adc800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 178 ms_handle_reset con 0x565249adc800 session 0x5652477845a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:16.692824+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138936320 unmapped: 25051136 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 178 handle_osd_map epochs [178,179], i have 178, src has [1,179]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 179 heartbeat osd_stat(store_statfs(0x1b403a000/0x0/0x1bfc00000, data 0x55e59fd/0x56f2000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.621487617s of 10.171217918s, submitted: 138
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:17.693217+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 179 heartbeat osd_stat(store_statfs(0x1b403a000/0x0/0x1bfc00000, data 0x55e59fd/0x56f2000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138969088 unmapped: 25018368 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1986461 data_alloc: 184549376 data_used: 7241728
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:18.693369+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138969088 unmapped: 25018368 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 179 handle_osd_map epochs [179,180], i have 179, src has [1,180]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 179 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc8000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:19.693519+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c63400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 ms_handle_reset con 0x565249c63400 session 0x5652476aab40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 ms_handle_reset con 0x565248bc8000 session 0x565248aed0e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138690560 unmapped: 25296896 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:20.693708+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138690560 unmapped: 25296896 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 ms_handle_reset con 0x565247f94c00 session 0x565246215860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f95800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bcec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 ms_handle_reset con 0x565248bcec00 session 0x5652472743c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 ms_handle_reset con 0x565247f95800 session 0x565248af7860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:21.693902+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138780672 unmapped: 25206784 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 heartbeat osd_stat(store_statfs(0x1b400e000/0x0/0x1bfc00000, data 0x560cac3/0x571e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:22.694075+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138805248 unmapped: 25182208 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 heartbeat osd_stat(store_statfs(0x1b3ff6000/0x0/0x1bfc00000, data 0x5625f23/0x5737000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1999639 data_alloc: 184549376 data_used: 7241728
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524922c000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 ms_handle_reset con 0x56524655ec00 session 0x565247796b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:23.694229+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 ms_handle_reset con 0x56524655ec00 session 0x5652475ba1e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138805248 unmapped: 25182208 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 180 handle_osd_map epochs [180,181], i have 180, src has [1,181]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:24.694342+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 181 ms_handle_reset con 0x5652476c4800 session 0x5652458e30e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138805248 unmapped: 25182208 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 181 ms_handle_reset con 0x56524922c000 session 0x565248175c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:25.694502+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138813440 unmapped: 25174016 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:26.694701+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138813440 unmapped: 25174016 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.516562462s of 10.003259659s, submitted: 122
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:27.694878+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138878976 unmapped: 25108480 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2010068 data_alloc: 184549376 data_used: 7258112
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 181 heartbeat osd_stat(store_statfs(0x1b3fcc000/0x0/0x1bfc00000, data 0x564a8e1/0x575f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524bd87000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:28.695022+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 181 ms_handle_reset con 0x56524bd87000 session 0x565247596000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 182 ms_handle_reset con 0x565247f94c00 session 0x565247868d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain systemd-journald[47611]: Data hash table of /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Dec 02 10:21:05 np0005541913.localdomain systemd-journald[47611]: /run/log/journal/510530184876bdc0ebb29e7199f63471/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138846208 unmapped: 25141248 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 182 ms_handle_reset con 0x56524655ec00 session 0x56524769f4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 182 heartbeat osd_stat(store_statfs(0x1b3fcc000/0x0/0x1bfc00000, data 0x564cad4/0x5761000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:29.695206+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 182 handle_osd_map epochs [182,183], i have 182, src has [1,183]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 183 ms_handle_reset con 0x5652476c4800 session 0x565248f43a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524922c000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138862592 unmapped: 25124864 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 183 ms_handle_reset con 0x56524922c000 session 0x5652458e1e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:30.695413+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 138862592 unmapped: 25124864 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:31.695598+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 139943936 unmapped: 24043520 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 183 handle_osd_map epochs [183,184], i have 183, src has [1,184]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:32.695758+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140009472 unmapped: 23977984 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2016333 data_alloc: 184549376 data_used: 7274496
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:33.696011+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140009472 unmapped: 23977984 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:34.696208+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 139993088 unmapped: 23994368 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 184 heartbeat osd_stat(store_statfs(0x1b3f8a000/0x0/0x1bfc00000, data 0x568e914/0x57a3000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:35.696356+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140001280 unmapped: 23986176 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 184 handle_osd_map epochs [184,185], i have 184, src has [1,185]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 186 ms_handle_reset con 0x56524652e800 session 0x5652477965a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:36.696514+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140034048 unmapped: 23953408 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.282528877s of 10.085911751s, submitted: 224
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249c60000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:37.696673+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 187 ms_handle_reset con 0x565249c60000 session 0x5652477bb860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 187 ms_handle_reset con 0x565244b99c00 session 0x5652475ba000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140271616 unmapped: 23715840 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2109501 data_alloc: 184549376 data_used: 7303168
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 187 ms_handle_reset con 0x565247f94800 session 0x565248f4bc20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:38.696837+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140279808 unmapped: 23707648 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b379c000/0x0/0x1bfc00000, data 0x5e73707/0x5f90000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc9400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 188 ms_handle_reset con 0x565248bc9400 session 0x56524543a960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247790400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:39.696969+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 188 ms_handle_reset con 0x565248bc6c00 session 0x565246215680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 188 ms_handle_reset con 0x565247790400 session 0x565248f483c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140320768 unmapped: 23666688 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:40.697097+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247790400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 188 ms_handle_reset con 0x565247790400 session 0x56524769ed20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140345344 unmapped: 23642112 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 188 heartbeat osd_stat(store_statfs(0x1b3e3d000/0x0/0x1bfc00000, data 0x57d3b37/0x58ef000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:41.697222+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 189 ms_handle_reset con 0x565244b99c00 session 0x5652479b8780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 189 ms_handle_reset con 0x565247f94800 session 0x5652476ad0e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140394496 unmapped: 23592960 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:42.697348+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140435456 unmapped: 23552000 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2057539 data_alloc: 184549376 data_used: 7315456
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:43.697503+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 190 ms_handle_reset con 0x565248bc6c00 session 0x5652465b50e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 140476416 unmapped: 23511040 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:44.697695+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141541376 unmapped: 22446080 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778c000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 190 ms_handle_reset con 0x56524778c000 session 0x565248aecd20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:45.697856+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141631488 unmapped: 22355968 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778c000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 190 handle_osd_map epochs [190,191], i have 190, src has [1,191]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 191 ms_handle_reset con 0x56524778c000 session 0x5652491ca5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:46.698024+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 191 heartbeat osd_stat(store_statfs(0x1b3f0c000/0x0/0x1bfc00000, data 0x5700517/0x5821000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141631488 unmapped: 22355968 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:47.698221+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.242081642s of 10.349036217s, submitted: 336
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 191 ms_handle_reset con 0x565244b99c00 session 0x5652491cb860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141697024 unmapped: 22290432 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2065959 data_alloc: 184549376 data_used: 7327744
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 191 heartbeat osd_stat(store_statfs(0x1b3eec000/0x0/0x1bfc00000, data 0x572170d/0x5840000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:48.698388+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141697024 unmapped: 22290432 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:49.698583+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141697024 unmapped: 22290432 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:50.698764+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141705216 unmapped: 22282240 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 191 handle_osd_map epochs [191,192], i have 191, src has [1,192]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:51.698954+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141713408 unmapped: 22274048 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 192 heartbeat osd_stat(store_statfs(0x1b3ecb000/0x0/0x1bfc00000, data 0x57421ad/0x5861000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:52.699114+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141729792 unmapped: 22257664 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bca800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 192 ms_handle_reset con 0x565248bca800 session 0x56524543be00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 192 heartbeat osd_stat(store_statfs(0x1b3eb9000/0x0/0x1bfc00000, data 0x57564c7/0x5875000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2075717 data_alloc: 184549376 data_used: 7340032
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:53.699214+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 141819904 unmapped: 22167552 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:54.699434+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 192 handle_osd_map epochs [192,193], i have 192, src has [1,193]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 142893056 unmapped: 21094400 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:55.699657+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 193 handle_osd_map epochs [193,194], i have 193, src has [1,194]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 142852096 unmapped: 21135360 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:56.699836+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 194 handle_osd_map epochs [194,195], i have 194, src has [1,195]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 143048704 unmapped: 20938752 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:57.699998+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 195 handle_osd_map epochs [195,196], i have 195, src has [1,196]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.510826111s of 10.111140251s, submitted: 182
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 196 handle_osd_map epochs [195,196], i have 196, src has [1,196]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 143097856 unmapped: 20889600 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2092358 data_alloc: 184549376 data_used: 7360512
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524a14a400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:58.700207+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 143204352 unmapped: 20783104 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 197 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 197 ms_handle_reset con 0x56524a14a400 session 0x56524769fe00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 197 heartbeat osd_stat(store_statfs(0x1b3e51000/0x0/0x1bfc00000, data 0x57b59f8/0x58dd000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:59.700360+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd6800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 143220736 unmapped: 20766720 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 197 handle_osd_map epochs [197,198], i have 197, src has [1,198]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 198 ms_handle_reset con 0x565247fd6800 session 0x5652475a9860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:00.700507+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 143237120 unmapped: 20750336 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 198 ms_handle_reset con 0x565244b99c00 session 0x565248f49c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778c000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 198 handle_osd_map epochs [198,199], i have 198, src has [1,199]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:01.700671+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 144375808 unmapped: 19611648 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 199 heartbeat osd_stat(store_statfs(0x1b3e20000/0x0/0x1bfc00000, data 0x57e024d/0x590d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [0,0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:02.700844+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 199 handle_osd_map epochs [199,200], i have 199, src has [1,200]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 200 ms_handle_reset con 0x56524778c000 session 0x5652458dd2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 144384000 unmapped: 19603456 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2116129 data_alloc: 184549376 data_used: 7393280
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:03.701003+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524a14b400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 200 ms_handle_reset con 0x56524a14b400 session 0x565248cb7e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652461a0800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 144384000 unmapped: 19603456 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 200 ms_handle_reset con 0x5652461a0800 session 0x5652479b90e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:04.701146+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 144449536 unmapped: 19537920 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:05.701303+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 144449536 unmapped: 19537920 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f95800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 200 ms_handle_reset con 0x565247f95800 session 0x565247a04b40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:06.701479+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 144465920 unmapped: 19521536 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:07.701688+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 201 handle_osd_map epochs [201,202], i have 201, src has [1,202]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.337841988s of 10.003434181s, submitted: 236
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 ms_handle_reset con 0x565244b99c00 session 0x5652479b94a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 heartbeat osd_stat(store_statfs(0x1b3de7000/0x0/0x1bfc00000, data 0x581987b/0x5947000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [0,0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 145760256 unmapped: 18227200 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2127724 data_alloc: 184549376 data_used: 7405568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:08.701858+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524bd86c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 ms_handle_reset con 0x56524778e800 session 0x5652475ba3c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 ms_handle_reset con 0x56524bd86c00 session 0x565248aec5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 145727488 unmapped: 18259968 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 heartbeat osd_stat(store_statfs(0x1b3db1000/0x0/0x1bfc00000, data 0x5847246/0x597a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:09.702059+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bccc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 145727488 unmapped: 18259968 heap: 163987456 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524bd87c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:10.702190+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 heartbeat osd_stat(store_statfs(0x1b3db0000/0x0/0x1bfc00000, data 0x5847257/0x597b000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149725184 unmapped: 31072256 heap: 180797440 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:11.702359+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149848064 unmapped: 30949376 heap: 180797440 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:12.702534+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 145735680 unmapped: 35061760 heap: 180797440 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3141668 data_alloc: 184549376 data_used: 7405568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:13.702729+0000)
Dec 02 10:21:05 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:05.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 150011904 unmapped: 30785536 heap: 180797440 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:14.702908+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 154361856 unmapped: 30638080 heap: 184999936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:15.703057+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 150110208 unmapped: 34889728 heap: 184999936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 heartbeat osd_stat(store_statfs(0x1a315f000/0x0/0x1bfc00000, data 0x1649a3ee/0x165cf000, compress 0x0/0x0/0x0, omap 0x644, meta 0x64cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:16.703202+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 154648576 unmapped: 30351360 heap: 184999936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a02000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:17.703354+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 ms_handle_reset con 0x565247a02000 session 0x565248175680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.112768173s of 10.211714745s, submitted: 148
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161882112 unmapped: 23117824 heap: 184999936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4554812 data_alloc: 184549376 data_used: 7405568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:18.703507+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158892032 unmapped: 26107904 heap: 184999936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652e000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 ms_handle_reset con 0x56524652e000 session 0x565247557e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 heartbeat osd_stat(store_statfs(0x19b132000/0x0/0x1bfc00000, data 0x1e0c7ed0/0x1e1fc000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,1,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:19.703678+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149995520 unmapped: 35004416 heap: 184999936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:20.703833+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 145801216 unmapped: 56000512 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 ms_handle_reset con 0x56524778e800 session 0x56524778b4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:21.704000+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 44376064 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:22.704133+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 203 ms_handle_reset con 0x565244b99c00 session 0x565247e590e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 203 ms_handle_reset con 0x56524bd87c00 session 0x565247796000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 203 ms_handle_reset con 0x565248bccc00 session 0x565248f443c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 145956864 unmapped: 55844864 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248603c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 203 ms_handle_reset con 0x565248603c00 session 0x565246217e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248603c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 6022982 data_alloc: 184549376 data_used: 7421952
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 203 ms_handle_reset con 0x565248603c00 session 0x56524778a5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:23.704286+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 204 ms_handle_reset con 0x565244b99c00 session 0x565248f44000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 146243584 unmapped: 55558144 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:24.704413+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 204 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 205 ms_handle_reset con 0x56524778e800 session 0x5652474aed20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149413888 unmapped: 52387840 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bccc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 205 ms_handle_reset con 0x565248bccc00 session 0x565247796f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524bd87c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 205 heartbeat osd_stat(store_statfs(0x1b10f4000/0x0/0x1bfc00000, data 0x59017e9/0x5a36000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [0,0,2])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:25.704625+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 206 ms_handle_reset con 0x56524bd87c00 session 0x5652475ba5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 148504576 unmapped: 53297152 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524bd87c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:26.704782+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 207 ms_handle_reset con 0x56524bd87c00 session 0x565248bc3c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 148652032 unmapped: 53149696 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:27.705022+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 207 ms_handle_reset con 0x565244b99c00 session 0x56524778b680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147734528 unmapped: 54067200 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 7.703263760s of 10.209542274s, submitted: 449
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2315949 data_alloc: 184549376 data_used: 7426048
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:28.705242+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 209 ms_handle_reset con 0x56524778e800 session 0x5652458e32c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 209 heartbeat osd_stat(store_statfs(0x1b38a7000/0x0/0x1bfc00000, data 0x5949e08/0x5a84000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147939328 unmapped: 53862400 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:29.705406+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524922fc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 209 ms_handle_reset con 0x56524922fc00 session 0x5652479a5a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 209 heartbeat osd_stat(store_statfs(0x1b3879000/0x0/0x1bfc00000, data 0x5976e0e/0x5ab4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248651c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 209 ms_handle_reset con 0x565248651c00 session 0x565247556d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147922944 unmapped: 53878784 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248651c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:30.705563+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 209 handle_osd_map epochs [209,210], i have 209, src has [1,210]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 210 ms_handle_reset con 0x565248651c00 session 0x565248f42000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147939328 unmapped: 53862400 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:31.705794+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147963904 unmapped: 53837824 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:32.705978+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 211 handle_osd_map epochs [211,212], i have 211, src has [1,212]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 212 ms_handle_reset con 0x565244b99c00 session 0x565247833c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147963904 unmapped: 53837824 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2345156 data_alloc: 184549376 data_used: 7454720
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:33.706124+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652474ee800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147980288 unmapped: 53821440 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 212 ms_handle_reset con 0x5652474ee800 session 0x5652458db0e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:34.706265+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147988480 unmapped: 53813248 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:35.706423+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 213 heartbeat osd_stat(store_statfs(0x1b384f000/0x0/0x1bfc00000, data 0x599abf9/0x5ade000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 147996672 unmapped: 53805056 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 213 handle_osd_map epochs [213,214], i have 213, src has [1,214]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 213 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:36.706565+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 148135936 unmapped: 53665792 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249add000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 214 ms_handle_reset con 0x565249add000 session 0x565248bc3e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:37.706692+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149291008 unmapped: 52510720 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 214 heartbeat osd_stat(store_statfs(0x1b3819000/0x0/0x1bfc00000, data 0x59cd28d/0x5b15000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655cc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.321312904s of 10.168735504s, submitted: 251
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2354701 data_alloc: 184549376 data_used: 7462912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:38.706856+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149307392 unmapped: 52494336 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 ms_handle_reset con 0x56524655cc00 session 0x565248f4be00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:39.707042+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524655cc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bce400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 ms_handle_reset con 0x565248bce400 session 0x5652477972c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 ms_handle_reset con 0x56524655cc00 session 0x56524778ab40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149438464 unmapped: 52363264 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:40.707218+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 ms_handle_reset con 0x565244b99c00 session 0x565248aec000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652474ee800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 heartbeat osd_stat(store_statfs(0x1b37fa000/0x0/0x1bfc00000, data 0x59e9895/0x5b34000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149463040 unmapped: 52338688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 ms_handle_reset con 0x5652474ee800 session 0x565248bc2960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 heartbeat osd_stat(store_statfs(0x1b37fa000/0x0/0x1bfc00000, data 0x59e9895/0x5b34000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 215 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:41.707395+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248651c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 216 ms_handle_reset con 0x565248651c00 session 0x565248cb70e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248651c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149479424 unmapped: 52322304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:42.707602+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 216 ms_handle_reset con 0x565244b99c00 session 0x565247797680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 216 handle_osd_map epochs [216,217], i have 216, src has [1,217]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149479424 unmapped: 52322304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 ms_handle_reset con 0x565248651c00 session 0x5652475bba40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2369239 data_alloc: 184549376 data_used: 7487488
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:43.707824+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 ms_handle_reset con 0x56524778ec00 session 0x5652472741e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249ade000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bc9400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149512192 unmapped: 52289536 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 ms_handle_reset con 0x565248bc9400 session 0x565248af61e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 ms_handle_reset con 0x565249ade000 session 0x565247784f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249ade000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 ms_handle_reset con 0x565249ade000 session 0x565248bc25a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:44.707985+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149610496 unmapped: 52191232 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244b99c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 ms_handle_reset con 0x565244b99c00 session 0x565247797e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524778ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:45.708115+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 heartbeat osd_stat(store_statfs(0x1b37d4000/0x0/0x1bfc00000, data 0x5a0fb34/0x5b5a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 ms_handle_reset con 0x56524778ec00 session 0x565248aecf00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149307392 unmapped: 52494336 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:46.708271+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 heartbeat osd_stat(store_statfs(0x1b37d4000/0x0/0x1bfc00000, data 0x5a0fb85/0x5b5a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 149536768 unmapped: 52264960 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248651c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 ms_handle_reset con 0x565248651c00 session 0x565248f481e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 217 handle_osd_map epochs [217,218], i have 217, src has [1,218]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:47.708488+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 150757376 unmapped: 51044352 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2380032 data_alloc: 184549376 data_used: 7495680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.406662941s of 10.239171028s, submitted: 244
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:48.708647+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 218 handle_osd_map epochs [218,219], i have 218, src has [1,219]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 150806528 unmapped: 50995200 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:49.708844+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 219 heartbeat osd_stat(store_statfs(0x1b3787000/0x0/0x1bfc00000, data 0x5a58c8e/0x5ba5000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151027712 unmapped: 50774016 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:50.709013+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151027712 unmapped: 50774016 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 219 handle_osd_map epochs [219,220], i have 219, src has [1,220]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:51.709185+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151027712 unmapped: 50774016 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565244c1a400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 220 ms_handle_reset con 0x565244c1a400 session 0x565248f49e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:52.709349+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151027712 unmapped: 50774016 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2390302 data_alloc: 184549376 data_used: 7507968
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:53.709542+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 220 heartbeat osd_stat(store_statfs(0x1b3778000/0x0/0x1bfc00000, data 0x5a65346/0x5bb6000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 150888448 unmapped: 50913280 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:54.709711+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 150888448 unmapped: 50913280 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:55.709862+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 150937600 unmapped: 50864128 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c6800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 220 ms_handle_reset con 0x5652476c6800 session 0x565248f49680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd7000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 220 handle_osd_map epochs [220,221], i have 220, src has [1,221]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:56.710011+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 ms_handle_reset con 0x565247fd7000 session 0x565248f483c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151994368 unmapped: 49807360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:57.710194+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151994368 unmapped: 49807360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2396770 data_alloc: 184549376 data_used: 7520256
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:58.710351+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 heartbeat osd_stat(store_statfs(0x1b3756000/0x0/0x1bfc00000, data 0x5a88b07/0x5bd7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151994368 unmapped: 49807360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:59.710482+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.754513741s of 11.091491699s, submitted: 117
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151994368 unmapped: 49807360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:00.710674+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151994368 unmapped: 49807360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:01.710843+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151994368 unmapped: 49807360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:02.710994+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152092672 unmapped: 49709056 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2394610 data_alloc: 184549376 data_used: 7520256
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:03.711135+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 heartbeat osd_stat(store_statfs(0x1b3741000/0x0/0x1bfc00000, data 0x5a9efb3/0x5bed000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152092672 unmapped: 49709056 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 heartbeat osd_stat(store_statfs(0x1b3741000/0x0/0x1bfc00000, data 0x5a9efb3/0x5bed000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:04.711301+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152092672 unmapped: 49709056 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:05.711453+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152092672 unmapped: 49709056 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:06.711607+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152109056 unmapped: 49692672 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:07.711779+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152109056 unmapped: 49692672 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2396527 data_alloc: 184549376 data_used: 7520256
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:08.711912+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152109056 unmapped: 49692672 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 heartbeat osd_stat(store_statfs(0x1b3731000/0x0/0x1bfc00000, data 0x5aaebd5/0x5bfd000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:09.712049+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152109056 unmapped: 49692672 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.451561928s of 10.584728241s, submitted: 30
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:10.712189+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152174592 unmapped: 49627136 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:11.712369+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 heartbeat osd_stat(store_statfs(0x1b370f000/0x0/0x1bfc00000, data 0x5ace741/0x5c1f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151658496 unmapped: 50143232 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:12.712507+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151658496 unmapped: 50143232 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2407787 data_alloc: 184549376 data_used: 7520256
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:13.712665+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151937024 unmapped: 49864704 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:14.712806+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151945216 unmapped: 49856512 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a6f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 ms_handle_reset con 0x565247a6f400 session 0x565248aec5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:15.713052+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151969792 unmapped: 49831936 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:16.713277+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151404544 unmapped: 50397184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a02c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 ms_handle_reset con 0x565247a02c00 session 0x565248aec000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:17.713450+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 222 heartbeat osd_stat(store_statfs(0x1b36a7000/0x0/0x1bfc00000, data 0x5b34539/0x5c87000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248b7ac00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 222 ms_handle_reset con 0x565248b7ac00 session 0x565248f4b0e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151437312 unmapped: 50364416 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2420232 data_alloc: 184549376 data_used: 7528448
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:18.713680+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 222 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151486464 unmapped: 50315264 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:19.713849+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c6800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 223 ms_handle_reset con 0x5652476c6800 session 0x565248f4be00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a02c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 223 ms_handle_reset con 0x565247a02c00 session 0x565247e590e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 151535616 unmapped: 50266112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:20.714021+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.127740860s of 10.675529480s, submitted: 193
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152584192 unmapped: 49217536 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 223 handle_osd_map epochs [223,224], i have 223, src has [1,224]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:21.714202+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a6f400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 ms_handle_reset con 0x565247a6f400 session 0x565247799c20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152616960 unmapped: 49184768 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:22.714378+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd7000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 ms_handle_reset con 0x565247fd7000 session 0x5652458da960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bca400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 ms_handle_reset con 0x565248bca400 session 0x5652458dad20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152641536 unmapped: 49160192 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2432288 data_alloc: 184549376 data_used: 7544832
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:23.714523+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 heartbeat osd_stat(store_statfs(0x1b3678000/0x0/0x1bfc00000, data 0x5b5e4c3/0x5cb6000, compress 0x0/0x0/0x0, omap 0x644, meta 0x68cf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248bca400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 ms_handle_reset con 0x565248bca400 session 0x5652458dbc20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c6800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 ms_handle_reset con 0x5652476c6800 session 0x5652458dbe00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152764416 unmapped: 49037312 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:24.714729+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152772608 unmapped: 49029120 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:25.714945+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a02c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 ms_handle_reset con 0x565247a02c00 session 0x5652458db4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152805376 unmapped: 48996352 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 224 handle_osd_map epochs [224,225], i have 224, src has [1,225]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:26.715114+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152920064 unmapped: 48881664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:27.715262+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152788992 unmapped: 49012736 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2450620 data_alloc: 184549376 data_used: 7561216
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:28.715408+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152903680 unmapped: 48898048 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 227 heartbeat osd_stat(store_statfs(0x1b321b000/0x0/0x1bfc00000, data 0x5bb4486/0x5d12000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:29.715587+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524922ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 227 ms_handle_reset con 0x56524922ec00 session 0x5652458daf00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152903680 unmapped: 48898048 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:30.715739+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152920064 unmapped: 48881664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248603c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.856730461s of 10.683195114s, submitted: 247
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:31.715992+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 227 ms_handle_reset con 0x565248603c00 session 0x5652458da3c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c6800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 227 ms_handle_reset con 0x5652476c6800 session 0x565248cb7e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 152944640 unmapped: 48857088 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 227 heartbeat osd_stat(store_statfs(0x1b321f000/0x0/0x1bfc00000, data 0x5bb442a/0x5d0f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:32.716144+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249ade400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153018368 unmapped: 48783360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:33.716296+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2455733 data_alloc: 184549376 data_used: 7577600
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 227 handle_osd_map epochs [227,228], i have 227, src has [1,228]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 228 ms_handle_reset con 0x565249ade400 session 0x565248cb7860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153018368 unmapped: 48783360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:34.716426+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153018368 unmapped: 48783360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:35.716581+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153026560 unmapped: 48775168 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 228 handle_osd_map epochs [228,229], i have 228, src has [1,229]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:36.716692+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 229 heartbeat osd_stat(store_statfs(0x1b3213000/0x0/0x1bfc00000, data 0x5bb8dee/0x5d1a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153067520 unmapped: 48734208 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:37.716838+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153116672 unmapped: 48685056 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:38.717004+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2474583 data_alloc: 184549376 data_used: 7598080
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 230 heartbeat osd_stat(store_statfs(0x1b31ee000/0x0/0x1bfc00000, data 0x5bdbe3d/0x5d3f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524922fc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153141248 unmapped: 48660480 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 230 ms_handle_reset con 0x56524922fc00 session 0x565248cb63c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:39.717183+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153157632 unmapped: 48644096 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:40.717343+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153190400 unmapped: 48611328 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:41.717533+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153190400 unmapped: 48611328 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:42.717680+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.268523216s of 11.060644150s, submitted: 159
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247793000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 231 ms_handle_reset con 0x565247793000 session 0x5652478692c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 231 heartbeat osd_stat(store_statfs(0x1b31ee000/0x0/0x1bfc00000, data 0x5bde01f/0x5d40000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153288704 unmapped: 48513024 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:43.717848+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2482585 data_alloc: 184549376 data_used: 7610368
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153346048 unmapped: 48455680 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:44.718005+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524a005400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 231 ms_handle_reset con 0x56524a005400 session 0x565247868d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524a005400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 231 ms_handle_reset con 0x56524a005400 session 0x5652478681e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 153378816 unmapped: 48422912 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c6800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:45.718168+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 231 ms_handle_reset con 0x5652476c6800 session 0x5652479b90e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 156311552 unmapped: 45490176 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 231 handle_osd_map epochs [231,232], i have 231, src has [1,232]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:46.718699+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 156049408 unmapped: 45752320 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247793000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 232 ms_handle_reset con 0x565247793000 session 0x565248f485a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:47.718868+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 156098560 unmapped: 45703168 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524bd86c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 233 ms_handle_reset con 0x56524bd86c00 session 0x5652477974a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:48.719039+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2608862 data_alloc: 184549376 data_used: 7622656
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767c800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 233 ms_handle_reset con 0x56524767c800 session 0x565248f4ab40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 233 heartbeat osd_stat(store_statfs(0x1b2558000/0x0/0x1bfc00000, data 0x686c835/0x69d4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 156172288 unmapped: 45629440 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:49.719227+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524767c800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 234 ms_handle_reset con 0x56524767c800 session 0x56524543a5a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 234 handle_osd_map epochs [234,235], i have 234, src has [1,235]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 156532736 unmapped: 45268992 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:50.719401+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157581312 unmapped: 44220416 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:51.719693+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157663232 unmapped: 44138496 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:52.719880+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157802496 unmapped: 43999232 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.726247787s of 10.780774117s, submitted: 269
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:53.720077+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 235 handle_osd_map epochs [235,236], i have 235, src has [1,236]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2523072 data_alloc: 184549376 data_used: 7647232
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157155328 unmapped: 44646400 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:54.720252+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 236 heartbeat osd_stat(store_statfs(0x1b3093000/0x0/0x1bfc00000, data 0x5d31488/0x5e9a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157163520 unmapped: 44638208 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:55.720469+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 44621824 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:56.720811+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157343744 unmapped: 44457984 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:57.720988+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524922fc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 239 ms_handle_reset con 0x56524922fc00 session 0x56524778b860
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157343744 unmapped: 44457984 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:58.721177+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2545672 data_alloc: 184549376 data_used: 7647232
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157450240 unmapped: 44351488 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:59.721327+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 240 heartbeat osd_stat(store_statfs(0x1b302c000/0x0/0x1bfc00000, data 0x5d935b1/0x5f01000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157679616 unmapped: 44122112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:00.721503+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 157679616 unmapped: 44122112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:01.721714+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 240 handle_osd_map epochs [240,241], i have 240, src has [1,241]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 241 heartbeat osd_stat(store_statfs(0x1b300f000/0x0/0x1bfc00000, data 0x5db17a4/0x5f1e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158744576 unmapped: 43057152 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:02.721883+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247927800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158744576 unmapped: 43057152 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:03.722086+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.866328239s of 10.245676041s, submitted: 106
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2544400 data_alloc: 184549376 data_used: 7655424
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 241 handle_osd_map epochs [241,242], i have 241, src has [1,242]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 242 ms_handle_reset con 0x565247927800 session 0x565247597680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158752768 unmapped: 43048960 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:04.722324+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158752768 unmapped: 43048960 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:05.722533+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158769152 unmapped: 43032576 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:06.722674+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 243 handle_osd_map epochs [243,244], i have 243, src has [1,244]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 244 handle_osd_map epochs [243,244], i have 244, src has [1,244]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c6400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 244 ms_handle_reset con 0x5652476c6400 session 0x5652475961e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158793728 unmapped: 43008000 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:07.722862+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 244 heartbeat osd_stat(store_statfs(0x1b3004000/0x0/0x1bfc00000, data 0x5db7d7b/0x5f28000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158801920 unmapped: 42999808 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:08.723087+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2553886 data_alloc: 184549376 data_used: 7655424
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 244 heartbeat osd_stat(store_statfs(0x1b3004000/0x0/0x1bfc00000, data 0x5db7d7b/0x5f28000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158801920 unmapped: 42999808 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:09.723210+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158810112 unmapped: 42991616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:10.723342+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 244 heartbeat osd_stat(store_statfs(0x1b3003000/0x0/0x1bfc00000, data 0x5db7ea3/0x5f2a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158810112 unmapped: 42991616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:11.723560+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 244 handle_osd_map epochs [244,245], i have 244, src has [1,245]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 245 heartbeat osd_stat(store_statfs(0x1b3000000/0x0/0x1bfc00000, data 0x5db9f99/0x5f2d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158818304 unmapped: 42983424 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:12.723675+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158818304 unmapped: 42983424 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:13.723838+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2562052 data_alloc: 184549376 data_used: 7671808
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158818304 unmapped: 42983424 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 245 heartbeat osd_stat(store_statfs(0x1b2ffd000/0x0/0x1bfc00000, data 0x5dba093/0x5f2e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:14.724009+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158826496 unmapped: 42975232 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:15.724696+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 12.263932228s of 12.493651390s, submitted: 91
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 245 heartbeat osd_stat(store_statfs(0x1b2ffd000/0x0/0x1bfc00000, data 0x5dba093/0x5f2e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158834688 unmapped: 42967040 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:16.724839+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 245 heartbeat osd_stat(store_statfs(0x1b2ffd000/0x0/0x1bfc00000, data 0x5dba093/0x5f2e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158842880 unmapped: 42958848 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:17.725043+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158842880 unmapped: 42958848 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:18.725224+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2567098 data_alloc: 184549376 data_used: 7671808
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158842880 unmapped: 42958848 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:19.725345+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 245 heartbeat osd_stat(store_statfs(0x1b2ffc000/0x0/0x1bfc00000, data 0x5dba196/0x5f30000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [0,0,0,1])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158851072 unmapped: 42950656 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:20.725491+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158867456 unmapped: 42934272 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248602800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 245 ms_handle_reset con 0x565248602800 session 0x565247868960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:21.725676+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158892032 unmapped: 42909696 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:22.725812+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 158892032 unmapped: 42909696 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:23.725968+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2665216 data_alloc: 184549376 data_used: 7671808
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 245 handle_osd_map epochs [245,246], i have 245, src has [1,246]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a03000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 246 heartbeat osd_stat(store_statfs(0x1b243e000/0x0/0x1bfc00000, data 0x6974508/0x6aed000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 246 ms_handle_reset con 0x565247a03000 session 0x5652479a4d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161062912 unmapped: 40738816 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:24.726126+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 246 heartbeat osd_stat(store_statfs(0x1b243e000/0x0/0x1bfc00000, data 0x6974508/0x6aed000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 247 ms_handle_reset con 0x565247fd6c00 session 0x565248f43e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161071104 unmapped: 40730624 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:25.726260+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9c800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.406013489s of 10.024027824s, submitted: 143
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 247 ms_handle_reset con 0x565247f9c800 session 0x565247e594a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 160940032 unmapped: 40861696 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:26.726380+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 160940032 unmapped: 40861696 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:27.726520+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f95c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd9400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a6e800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 168935424 unmapped: 32866304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:28.726670+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 247 ms_handle_reset con 0x565247fd9400 session 0x565248aec780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2843547 data_alloc: 184549376 data_used: 7684096
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 248 ms_handle_reset con 0x565247a6e800 session 0x565248f42d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 248 ms_handle_reset con 0x565247f95c00 session 0x5652477bb2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 248 heartbeat osd_stat(store_statfs(0x1b0ef8000/0x0/0x1bfc00000, data 0x7ebb7dd/0x8036000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161669120 unmapped: 40132608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:29.726857+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a03000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161669120 unmapped: 40132608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:30.727018+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 248 handle_osd_map epochs [248,249], i have 248, src has [1,249]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 249 ms_handle_reset con 0x565247a03000 session 0x565248af6d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9c800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161685504 unmapped: 40116224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 249 ms_handle_reset con 0x565247f9c800 session 0x5652476ad4a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:31.727181+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161685504 unmapped: 40116224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:32.727298+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 249 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161685504 unmapped: 40116224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:33.727472+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2694808 data_alloc: 184549376 data_used: 7700480
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 250 ms_handle_reset con 0x565247fd6c00 session 0x5652465b41e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 250 heartbeat osd_stat(store_statfs(0x1b2434000/0x0/0x1bfc00000, data 0x697cf77/0x6af7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161742848 unmapped: 40058880 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:34.727678+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 250 handle_osd_map epochs [250,251], i have 250, src has [1,251]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565245946c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161751040 unmapped: 40050688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 251 ms_handle_reset con 0x565245946c00 session 0x5652474aef00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:35.727776+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 251 heartbeat osd_stat(store_statfs(0x1b2fea000/0x0/0x1bfc00000, data 0x5dc72c1/0x5f43000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.411861420s of 10.101967812s, submitted: 186
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 251 heartbeat osd_stat(store_statfs(0x1b2fea000/0x0/0x1bfc00000, data 0x5dc72c1/0x5f43000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161841152 unmapped: 39960576 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:36.727947+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 251 handle_osd_map epochs [251,252], i have 251, src has [1,252]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a03000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 252 ms_handle_reset con 0x565247a03000 session 0x565247869680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161849344 unmapped: 39952384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:37.728133+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 252 heartbeat osd_stat(store_statfs(0x1b2fe3000/0x0/0x1bfc00000, data 0x5dc94c6/0x5f49000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161849344 unmapped: 39952384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:38.728293+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2618422 data_alloc: 184549376 data_used: 7712768
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161849344 unmapped: 39952384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:39.728443+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 252 heartbeat osd_stat(store_statfs(0x1b2fe3000/0x0/0x1bfc00000, data 0x5dc94c6/0x5f49000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161857536 unmapped: 39944192 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:40.728606+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 252 heartbeat osd_stat(store_statfs(0x1b2fe4000/0x0/0x1bfc00000, data 0x5dc95c6/0x5f4a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x6ccf9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161857536 unmapped: 39944192 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:41.728847+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 252 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 161865728 unmapped: 39936000 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:42.729003+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a6e400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 162004992 unmapped: 39796736 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:43.729527+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2724998 data_alloc: 184549376 data_used: 7725056
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x565247a6e400 session 0x56524778b0e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x565247f94400 session 0x565248f4a3c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a6ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x565247a6ec00 session 0x565247868f00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c7c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x5652476c7c00 session 0x5652475baf00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c7c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x5652476c7c00 session 0x5652481750e0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 163545088 unmapped: 38256640 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:44.729703+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 163545088 unmapped: 38256640 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:45.729863+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a03000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x565247a03000 session 0x5652462174a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a6e400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x565247a6e400 session 0x565247e58000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.534039497s of 10.095775604s, submitted: 151
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a6ec00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x565247a6ec00 session 0x56524769f2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 163553280 unmapped: 38248448 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:46.730028+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 ms_handle_reset con 0x565247f94400 session 0x565248af6960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f94400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 heartbeat osd_stat(store_statfs(0x1b133f000/0x0/0x1bfc00000, data 0x68c5a5f/0x6a4f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x7e6f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 163536896 unmapped: 38264832 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:47.730161+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 164757504 unmapped: 37044224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:48.730332+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2762376 data_alloc: 184549376 data_used: 11538432
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 164986880 unmapped: 36814848 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:49.730510+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 heartbeat osd_stat(store_statfs(0x1b133e000/0x0/0x1bfc00000, data 0x68c5cfc/0x6a4f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x7e6f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165019648 unmapped: 36782080 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:50.730675+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165027840 unmapped: 36773888 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:51.730891+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165027840 unmapped: 36773888 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:52.731010+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165027840 unmapped: 36773888 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:53.731164+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 heartbeat osd_stat(store_statfs(0x1b133c000/0x0/0x1bfc00000, data 0x68c5e3a/0x6a51000, compress 0x0/0x0/0x0, omap 0x644, meta 0x7e6f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2766652 data_alloc: 184549376 data_used: 11763712
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165027840 unmapped: 36773888 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:54.731331+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165036032 unmapped: 36765696 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:55.731513+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 heartbeat osd_stat(store_statfs(0x1b133d000/0x0/0x1bfc00000, data 0x68c5e87/0x6a50000, compress 0x0/0x0/0x0, omap 0x644, meta 0x7e6f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165044224 unmapped: 36757504 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 heartbeat osd_stat(store_statfs(0x1b133d000/0x0/0x1bfc00000, data 0x68c5e87/0x6a50000, compress 0x0/0x0/0x0, omap 0x644, meta 0x7e6f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain rsyslogd[754]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:56.731690+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.391911507s of 10.575737000s, submitted: 38
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 heartbeat osd_stat(store_statfs(0x1b1340000/0x0/0x1bfc00000, data 0x68c5ebf/0x6a4e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x7e6f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165044224 unmapped: 36757504 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:57.731858+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 165044224 unmapped: 36757504 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:58.732072+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 heartbeat osd_stat(store_statfs(0x1b1340000/0x0/0x1bfc00000, data 0x68c5f24/0x6a4e000, compress 0x0/0x0/0x0, omap 0x644, meta 0x7e6f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2765708 data_alloc: 184549376 data_used: 11763712
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 170844160 unmapped: 30957568 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:59.732189+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 171024384 unmapped: 30777344 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:00.732291+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:01.732430+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 171827200 unmapped: 29974528 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:02.732584+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 171827200 unmapped: 29974528 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 253 handle_osd_map epochs [253,254], i have 253, src has [1,254]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 254 heartbeat osd_stat(store_statfs(0x1b01c2000/0x0/0x1bfc00000, data 0x7641403/0x77cb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x826f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:03.732788+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 171941888 unmapped: 29859840 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2886936 data_alloc: 184549376 data_used: 13266944
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:04.732965+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 171941888 unmapped: 29859840 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:05.733089+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 171941888 unmapped: 29859840 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 254 heartbeat osd_stat(store_statfs(0x1b01c1000/0x0/0x1bfc00000, data 0x76413bc/0x77cb000, compress 0x0/0x0/0x0, omap 0x644, meta 0x826f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:06.733258+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 172154880 unmapped: 29646848 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.089838028s of 10.029912949s, submitted: 241
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:07.733383+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 172425216 unmapped: 29376512 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 254 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565249adf000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:08.733539+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 173629440 unmapped: 28172288 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2905982 data_alloc: 184549376 data_used: 13283328
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0143000/0x0/0x1bfc00000, data 0x76bc0f1/0x7847000, compress 0x0/0x0/0x0, omap 0x644, meta 0x826f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:09.733721+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd7400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 173948928 unmapped: 27852800 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565248651c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 256 ms_handle_reset con 0x565248651c00 session 0x5652477ba000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:10.733936+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 173957120 unmapped: 27844608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 257 ms_handle_reset con 0x565247fd7400 session 0x565248f4a960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:11.734134+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 174153728 unmapped: 27648000 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 258 heartbeat osd_stat(store_statfs(0x1b00b6000/0x0/0x1bfc00000, data 0x7747c15/0x78d7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x826f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:12.734281+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 174243840 unmapped: 27557888 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652461a0400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:13.734444+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 175374336 unmapped: 26427392 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2932439 data_alloc: 184549376 data_used: 13299712
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd8c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 259 ms_handle_reset con 0x565247fd8c00 session 0x5652466a32c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:14.734559+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 175996928 unmapped: 25804800 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 259 heartbeat osd_stat(store_statfs(0x1b007b000/0x0/0x1bfc00000, data 0x777cf09/0x7910000, compress 0x0/0x0/0x0, omap 0x644, meta 0x826f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 259 handle_osd_map epochs [259,260], i have 259, src has [1,260]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 260 ms_handle_reset con 0x5652461a0400 session 0x565248cb65a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:15.734647+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 175882240 unmapped: 25919488 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 260 heartbeat osd_stat(store_statfs(0x1b1020000/0x0/0x1bfc00000, data 0x77d7de1/0x796d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x726f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:16.734825+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 175915008 unmapped: 25886720 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 260 handle_osd_map epochs [260,261], i have 260, src has [1,261]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.544684410s of 10.114748001s, submitted: 182
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:17.734979+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 175923200 unmapped: 25878528 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247a02400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:18.735174+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178020352 unmapped: 23781376 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2945081 data_alloc: 184549376 data_used: 13299712
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 261 heartbeat osd_stat(store_statfs(0x1afe31000/0x0/0x1bfc00000, data 0x7825379/0x79bc000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 261 handle_osd_map epochs [261,262], i have 261, src has [1,262]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 262 ms_handle_reset con 0x565247fd6c00 session 0x565247e59a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:19.735287+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178331648 unmapped: 23470080 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 262 handle_osd_map epochs [262,263], i have 262, src has [1,263]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 263 ms_handle_reset con 0x565247a02400 session 0x5652476aa960
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:20.735437+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178331648 unmapped: 23470080 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 263 handle_osd_map epochs [263,264], i have 263, src has [1,264]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 263 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 263 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:21.735599+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178438144 unmapped: 23363584 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:22.735782+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178454528 unmapped: 23347200 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fd6c00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 266 ms_handle_reset con 0x565247fd6c00 session 0x5652479a43c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 266 heartbeat osd_stat(store_statfs(0x1afd8c000/0x0/0x1bfc00000, data 0x78c3928/0x7a5f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:23.735969+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178462720 unmapped: 23339008 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2967831 data_alloc: 184549376 data_used: 13320192
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 266 ms_handle_reset con 0x565249adf000 session 0x5652477965a0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:24.736182+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179847168 unmapped: 21954560 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 266 ms_handle_reset con 0x565247f94400 session 0x565248cb6d20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:25.736354+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179855360 unmapped: 21946368 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524652f800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 266 ms_handle_reset con 0x56524652f800 session 0x565248f443c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:26.736461+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 266 handle_osd_map epochs [266,267], i have 266, src has [1,267]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177012736 unmapped: 24788992 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:27.736691+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 267 handle_osd_map epochs [267,268], i have 267, src has [1,268]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.654572487s of 10.606867790s, submitted: 287
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177258496 unmapped: 24543232 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 268 heartbeat osd_stat(store_statfs(0x1b1527000/0x0/0x1bfc00000, data 0x61271dd/0x62c4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:28.736848+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177364992 unmapped: 24436736 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2760479 data_alloc: 184549376 data_used: 7819264
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:29.737027+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177512448 unmapped: 24289280 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 268 heartbeat osd_stat(store_statfs(0x1b1528000/0x0/0x1bfc00000, data 0x6127245/0x62c3000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:30.737224+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 176340992 unmapped: 25460736 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x5652476c4400
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:31.737430+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 268 handle_osd_map epochs [268,269], i have 268, src has [1,269]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177160192 unmapped: 24641536 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 269 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 269 heartbeat osd_stat(store_statfs(0x1b14db000/0x0/0x1bfc00000, data 0x6176f96/0x6313000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:32.737541+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 269 handle_osd_map epochs [269,270], i have 269, src has [1,270]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177364992 unmapped: 24436736 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:33.737714+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177463296 unmapped: 24338432 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 270 handle_osd_map epochs [270,271], i have 270, src has [1,271]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 271 ms_handle_reset con 0x5652476c4400 session 0x5652458e03c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247fdac00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2776125 data_alloc: 184549376 data_used: 7843840
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 271 ms_handle_reset con 0x565247fdac00 session 0x5652456ee780
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:34.737862+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177512448 unmapped: 24289280 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:35.738036+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177512448 unmapped: 24289280 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 271 heartbeat osd_stat(store_statfs(0x1b142d000/0x0/0x1bfc00000, data 0x6222189/0x63c0000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:36.738183+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177512448 unmapped: 24289280 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 272 handle_osd_map epochs [272,272], i have 272, src has [1,272]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:37.738345+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177537024 unmapped: 24264704 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.686280251s of 10.627582550s, submitted: 260
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:38.738490+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177537024 unmapped: 24264704 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2785123 data_alloc: 184549376 data_used: 7852032
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:39.738690+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177684480 unmapped: 24117248 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:40.738844+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 177684480 unmapped: 24117248 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:41.739044+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 272 handle_osd_map epochs [272,273], i have 272, src has [1,273]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178741248 unmapped: 23060480 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 273 heartbeat osd_stat(store_statfs(0x1b13b1000/0x0/0x1bfc00000, data 0x629fa6c/0x643d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247ee9800
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 273 ms_handle_reset con 0x565247ee9800 session 0x5652458e1a40
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:42.739212+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178749440 unmapped: 23052288 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 273 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:43.739358+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178749440 unmapped: 23052288 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x56524922dc00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2801090 data_alloc: 184549376 data_used: 7872512
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:44.739522+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178749440 unmapped: 23052288 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 274 handle_osd_map epochs [274,275], i have 274, src has [1,275]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 51
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 275 handle_osd_map epochs [275,275], i have 275, src has [1,275]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 275 handle_osd_map epochs [274,275], i have 275, src has [1,275]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 275 handle_osd_map epochs [275,275], i have 275, src has [1,275]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:45.739689+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178896896 unmapped: 22904832 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b1360000/0x0/0x1bfc00000, data 0x62eca8a/0x648c000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:46.739877+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: handle_auth_request added challenge on 0x565247f9d000
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 275 ms_handle_reset con 0x565247f9d000 session 0x565248175680
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178913280 unmapped: 22888448 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:47.740057+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178921472 unmapped: 22880256 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:48.740237+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178921472 unmapped: 22880256 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b1327000/0x0/0x1bfc00000, data 0x63233e6/0x64c6000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.292626381s of 10.659793854s, submitted: 104
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2808060 data_alloc: 184549376 data_used: 7876608
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:49.740386+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178913280 unmapped: 22888448 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b130f000/0x0/0x1bfc00000, data 0x633cb25/0x64df000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:50.740544+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178913280 unmapped: 22888448 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:51.740705+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 275 handle_osd_map epochs [275,276], i have 275, src has [1,276]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179011584 unmapped: 22790144 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 276 handle_osd_map epochs [276,276], i have 276, src has [1,276]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:52.740855+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179019776 unmapped: 22781952 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 276 heartbeat osd_stat(store_statfs(0x1b12c7000/0x0/0x1bfc00000, data 0x6382367/0x6526000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:53.741044+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179019776 unmapped: 22781952 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2820666 data_alloc: 184549376 data_used: 7888896
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:54.741245+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178528256 unmapped: 23273472 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:55.741392+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 178626560 unmapped: 23175168 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:56.741587+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179691520 unmapped: 22110208 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:57.741731+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179691520 unmapped: 22110208 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 276 heartbeat osd_stat(store_statfs(0x1b1252000/0x0/0x1bfc00000, data 0x63f70d5/0x659c000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:58.741888+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179765248 unmapped: 22036480 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2824904 data_alloc: 184549376 data_used: 7888896
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.049480438s of 10.329094887s, submitted: 75
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:59.742045+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179871744 unmapped: 21929984 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:00.742209+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179888128 unmapped: 21913600 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:01.742445+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179896320 unmapped: 21905408 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:02.742685+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179896320 unmapped: 21905408 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _renew_subs
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 276 handle_osd_map epochs [277,277], i have 276, src has [1,277]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 277 heartbeat osd_stat(store_statfs(0x1b11d3000/0x0/0x1bfc00000, data 0x6478af4/0x661b000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:03.742893+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 21766144 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2829976 data_alloc: 184549376 data_used: 7897088
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:04.743062+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 21766144 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 277 heartbeat osd_stat(store_statfs(0x1b11ce000/0x0/0x1bfc00000, data 0x647ad6d/0x661f000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:05.743210+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 21766144 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:06.743336+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 21766144 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:07.743501+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 21766144 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:08.743705+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180109312 unmapped: 21692416 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2839612 data_alloc: 184549376 data_used: 7897088
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.781299591s of 10.034886360s, submitted: 72
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:09.743880+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180256768 unmapped: 21544960 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 277 heartbeat osd_stat(store_statfs(0x1b1146000/0x0/0x1bfc00000, data 0x65029f3/0x66a8000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:10.744019+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180256768 unmapped: 21544960 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:11.744191+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180256768 unmapped: 21544960 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b1143000/0x0/0x1bfc00000, data 0x6505dca/0x66ab000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:12.744346+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181428224 unmapped: 20373504 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:13.744499+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b110e000/0x0/0x1bfc00000, data 0x6538470/0x66df000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180379648 unmapped: 21422080 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2845786 data_alloc: 184549376 data_used: 7909376
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:14.744670+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180379648 unmapped: 21422080 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b110b000/0x0/0x1bfc00000, data 0x653ba89/0x66e2000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:15.744824+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179855360 unmapped: 21946368 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:16.744985+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179855360 unmapped: 21946368 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:17.745157+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179855360 unmapped: 21946368 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:18.745342+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179863552 unmapped: 21938176 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2845094 data_alloc: 184549376 data_used: 7909376
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:19.745503+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b10ac000/0x0/0x1bfc00000, data 0x659c9dc/0x6742000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179863552 unmapped: 21938176 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:20.745700+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179863552 unmapped: 21938176 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:21.745900+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 12.407800674s of 12.577337265s, submitted: 51
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 179871744 unmapped: 21929984 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:22.746140+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180994048 unmapped: 20807680 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:23.746315+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 180994048 unmapped: 20807680 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2853174 data_alloc: 184549376 data_used: 7917568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b1047000/0x0/0x1bfc00000, data 0x66024e1/0x67a7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:24.746483+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b1047000/0x0/0x1bfc00000, data 0x66024e1/0x67a7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181149696 unmapped: 20652032 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:25.746685+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181149696 unmapped: 20652032 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:26.746861+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181223424 unmapped: 20578304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:27.747077+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181305344 unmapped: 20496384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:28.747257+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181305344 unmapped: 20496384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2865916 data_alloc: 184549376 data_used: 7917568
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b0feb000/0x0/0x1bfc00000, data 0x665d003/0x6801000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:29.747422+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181321728 unmapped: 20480000 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:30.747577+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181452800 unmapped: 20348928 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b0fc2000/0x0/0x1bfc00000, data 0x66849ec/0x682a000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:31.747796+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.631485939s of 10.000844955s, submitted: 59
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181559296 unmapped: 20242432 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:32.747958+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 handle_osd_map epochs [278,279], i have 278, src has [1,279]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 278 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181575680 unmapped: 20226048 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 279 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:33.748162+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181575680 unmapped: 20226048 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2874776 data_alloc: 184549376 data_used: 7925760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:34.748335+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181575680 unmapped: 20226048 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:35.748734+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181575680 unmapped: 20226048 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 279 heartbeat osd_stat(store_statfs(0x1b0f10000/0x0/0x1bfc00000, data 0x6736c9b/0x68de000, compress 0x0/0x0/0x0, omap 0x644, meta 0x840f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:36.748860+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181764096 unmapped: 20037632 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:37.749030+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181764096 unmapped: 20037632 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:38.749181+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181772288 unmapped: 20029440 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2878984 data_alloc: 184549376 data_used: 7925760
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:39.749321+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181772288 unmapped: 20029440 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:40.749469+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 279 heartbeat osd_stat(store_statfs(0x1b0adc000/0x0/0x1bfc00000, data 0x676a813/0x6912000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181772288 unmapped: 20029440 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:41.749691+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 279 handle_osd_map epochs [280,280], i have 279, src has [1,280]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.665983200s of 10.012834549s, submitted: 110
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 181886976 unmapped: 19914752 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:42.749872+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 182042624 unmapped: 19759104 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 280 heartbeat osd_stat(store_statfs(0x1b0a39000/0x0/0x1bfc00000, data 0x68098c3/0x69b5000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:43.749996+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 182042624 unmapped: 19759104 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2899282 data_alloc: 184549376 data_used: 7942144
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:44.750141+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.2 total, 600.0 interval
                                                          Cumulative writes: 24K writes, 93K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 24K writes, 8890 syncs, 2.79 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 14K writes, 53K keys, 14K commit groups, 1.0 writes per commit group, ingest: 39.13 MB, 0.07 MB/s
                                                          Interval WAL: 14K writes, 6185 syncs, 2.36 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 182181888 unmapped: 19619840 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:45.750278+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 182214656 unmapped: 19587072 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:46.750427+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 182214656 unmapped: 19587072 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:47.750657+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 280 heartbeat osd_stat(store_statfs(0x1b09fb000/0x0/0x1bfc00000, data 0x6848ae5/0x69f2000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 182239232 unmapped: 19562496 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:48.750823+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183287808 unmapped: 18513920 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2902186 data_alloc: 184549376 data_used: 7938048
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:49.750952+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183328768 unmapped: 18472960 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:50.751098+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183476224 unmapped: 18325504 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:51.751297+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.677766800s of 10.003397942s, submitted: 92
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183721984 unmapped: 18079744 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 280 heartbeat osd_stat(store_statfs(0x1b097e000/0x0/0x1bfc00000, data 0x68c4ecd/0x6a70000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:52.751476+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 280 handle_osd_map epochs [280,281], i have 280, src has [1,281]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183730176 unmapped: 18071552 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:53.751642+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183730176 unmapped: 18071552 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 281 heartbeat osd_stat(store_statfs(0x1b0910000/0x0/0x1bfc00000, data 0x692f97c/0x6add000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2921396 data_alloc: 184549376 data_used: 7946240
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:54.751815+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183050240 unmapped: 18751488 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:55.752015+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183279616 unmapped: 18522112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:56.752147+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183279616 unmapped: 18522112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:57.752312+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 281 handle_osd_map epochs [281,282], i have 281, src has [1,282]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183287808 unmapped: 18513920 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 282 heartbeat osd_stat(store_statfs(0x1b08bd000/0x0/0x1bfc00000, data 0x698762b/0x6b31000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:58.752511+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183017472 unmapped: 18784256 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2922920 data_alloc: 184549376 data_used: 7958528
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:59.752784+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183017472 unmapped: 18784256 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:00.752933+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183017472 unmapped: 18784256 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:01.753115+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 282 handle_osd_map epochs [282,283], i have 282, src has [1,283]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.519504547s of 10.002376556s, submitted: 143
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183156736 unmapped: 18644992 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:02.758680+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183468032 unmapped: 18333696 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 283 heartbeat osd_stat(store_statfs(0x1b0851000/0x0/0x1bfc00000, data 0x69ed2c7/0x6b9c000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:03.758805+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183730176 unmapped: 18071552 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2935936 data_alloc: 184549376 data_used: 7958528
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:04.758991+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183730176 unmapped: 18071552 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:05.759120+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 283 heartbeat osd_stat(store_statfs(0x1b0821000/0x0/0x1bfc00000, data 0x6a1f465/0x6bcd000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183738368 unmapped: 18063360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:06.759292+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 283 handle_osd_map epochs [283,284], i have 283, src has [1,284]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 handle_osd_map epochs [284,284], i have 284, src has [1,284]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183738368 unmapped: 18063360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:07.759430+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183738368 unmapped: 18063360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:08.759660+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183738368 unmapped: 18063360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2936328 data_alloc: 184549376 data_used: 7970816
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:09.759876+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b0800000/0x0/0x1bfc00000, data 0x6a3e0cd/0x6bed000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183738368 unmapped: 18063360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:10.760095+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183738368 unmapped: 18063360 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:11.760302+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.869452477s of 10.003491402s, submitted: 41
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183828480 unmapped: 17973248 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:12.760485+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183967744 unmapped: 17833984 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:13.760665+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183967744 unmapped: 17833984 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2941858 data_alloc: 184549376 data_used: 7970816
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:14.760869+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183975936 unmapped: 17825792 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b07a2000/0x0/0x1bfc00000, data 0x6a9d541/0x6c4c000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:15.761037+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 183975936 unmapped: 17825792 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:16.761212+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184221696 unmapped: 17580032 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:17.761406+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b076d000/0x0/0x1bfc00000, data 0x6ad1ee2/0x6c81000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184221696 unmapped: 17580032 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:18.761550+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184221696 unmapped: 17580032 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b076d000/0x0/0x1bfc00000, data 0x6ad1f47/0x6c81000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2941810 data_alloc: 184549376 data_used: 7970816
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:19.761719+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184221696 unmapped: 17580032 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:20.761865+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184221696 unmapped: 17580032 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:21.762068+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.894302368s of 10.003356934s, submitted: 20
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184303616 unmapped: 17498112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:22.762250+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184303616 unmapped: 17498112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:23.762419+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184303616 unmapped: 17498112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2945904 data_alloc: 184549376 data_used: 7974912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:24.762587+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b072a000/0x0/0x1bfc00000, data 0x6b16162/0x6cc4000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184492032 unmapped: 17309696 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:25.762791+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184295424 unmapped: 17506304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:26.762917+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184295424 unmapped: 17506304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:27.763048+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184377344 unmapped: 17424384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b0707000/0x0/0x1bfc00000, data 0x6b39c6c/0x6ce7000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:28.763206+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184377344 unmapped: 17424384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2949432 data_alloc: 184549376 data_used: 7974912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:29.763376+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184377344 unmapped: 17424384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:30.763559+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184377344 unmapped: 17424384 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:31.763765+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.887157440s of 10.003548622s, submitted: 21
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184524800 unmapped: 17276928 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:32.763945+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184524800 unmapped: 17276928 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b066c000/0x0/0x1bfc00000, data 0x6bd3d78/0x6d82000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:33.764130+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b0667000/0x0/0x1bfc00000, data 0x6bd73f2/0x6d87000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184467456 unmapped: 17334272 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b0667000/0x0/0x1bfc00000, data 0x6bd73f2/0x6d87000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2953912 data_alloc: 184549376 data_used: 7974912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:34.764316+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184467456 unmapped: 17334272 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:35.764462+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b0668000/0x0/0x1bfc00000, data 0x6bd7421/0x6d86000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184467456 unmapped: 17334272 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:36.764661+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184557568 unmapped: 17244160 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:37.764783+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 52
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184328192 unmapped: 17473536 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:38.764940+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184328192 unmapped: 17473536 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2959582 data_alloc: 184549376 data_used: 7974912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:39.765076+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b063c000/0x0/0x1bfc00000, data 0x6c0290a/0x6db2000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184328192 unmapped: 17473536 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:40.765310+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b0621000/0x0/0x1bfc00000, data 0x6c1d2d1/0x6dcd000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184442880 unmapped: 17358848 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:41.765566+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184557568 unmapped: 17244160 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:42.765760+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184557568 unmapped: 17244160 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b05f5000/0x0/0x1bfc00000, data 0x6c49a11/0x6df9000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.508700371s of 11.623811722s, submitted: 27
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:43.765955+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184467456 unmapped: 17334272 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2962454 data_alloc: 184549376 data_used: 7974912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b05d1000/0x0/0x1bfc00000, data 0x6c6dcc0/0x6e1d000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:44.766116+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184467456 unmapped: 17334272 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:45.766256+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b05cd000/0x0/0x1bfc00000, data 0x6c712d5/0x6e21000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 184467456 unmapped: 17334272 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:46.766387+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 185622528 unmapped: 16179200 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:47.766558+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 185622528 unmapped: 16179200 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b0599000/0x0/0x1bfc00000, data 0x6ca5212/0x6e55000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:48.766729+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 185622528 unmapped: 16179200 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2969064 data_alloc: 184549376 data_used: 7974912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:49.766913+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 185622528 unmapped: 16179200 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:50.767034+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186925056 unmapped: 14876672 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:51.767242+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186925056 unmapped: 14876672 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:52.767411+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186925056 unmapped: 14876672 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:53.767598+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186925056 unmapped: 14876672 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b051f000/0x0/0x1bfc00000, data 0x6d20ef7/0x6ecf000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2973542 data_alloc: 184549376 data_used: 7974912
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:54.767821+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187088896 unmapped: 14712832 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:55.767971+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187088896 unmapped: 14712832 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:56.768178+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 13.145141602s of 13.344819069s, submitted: 38
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 185974784 unmapped: 15826944 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:57.768355+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 185974784 unmapped: 15826944 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:58.768554+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b04fd000/0x0/0x1bfc00000, data 0x6d4246d/0x6ef1000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 185974784 unmapped: 15826944 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2971568 data_alloc: 184549376 data_used: 7970816
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:59.768727+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186064896 unmapped: 15736832 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:00.768937+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186105856 unmapped: 15695872 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:01.769201+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186105856 unmapped: 15695872 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:02.769399+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186105856 unmapped: 15695872 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:03.769567+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 heartbeat osd_stat(store_statfs(0x1b04ab000/0x0/0x1bfc00000, data 0x6d968e5/0x6f43000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186105856 unmapped: 15695872 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2975050 data_alloc: 184549376 data_used: 7970816
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:04.769751+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186105856 unmapped: 15695872 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:05.769927+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186114048 unmapped: 15687680 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:06.770116+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 ms_handle_reset con 0x56524922dc00 session 0x565246217e00
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.865916252s of 10.006187439s, submitted: 229
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186580992 unmapped: 15220736 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:07.770314+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 284 handle_osd_map epochs [284,285], i have 284, src has [1,285]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 53
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 285 heartbeat osd_stat(store_statfs(0x1b0476000/0x0/0x1bfc00000, data 0x6dcbfa7/0x6f78000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186621952 unmapped: 15179776 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:08.770493+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186621952 unmapped: 15179776 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2981360 data_alloc: 184549376 data_used: 7979008
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:09.770699+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:10.770877+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:11.771063+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:12.771234+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 285 heartbeat osd_stat(store_statfs(0x1b046c000/0x0/0x1bfc00000, data 0x6dd48f8/0x6f82000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:13.771380+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2979560 data_alloc: 184549376 data_used: 7979008
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:14.771565+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:15.771705+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:16.771849+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 285 handle_osd_map epochs [285,286], i have 285, src has [1,286]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.883286476s of 10.041259766s, submitted: 164
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:17.772031+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:18.772226+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:19.772394+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:20.772563+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186712064 unmapped: 15089664 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:21.772783+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:22.772953+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:23.773153+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:24.773321+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:25.773476+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:26.773684+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:27.773849+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:28.774053+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:29.774197+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:30.774368+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:31.774573+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:32.774802+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:33.774958+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:34.775158+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:35.775309+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:36.775465+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:37.775652+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:38.775779+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:39.775960+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:40.776111+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:41.776308+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:42.776479+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:43.776677+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:44.776833+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:45.777068+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:46.777267+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:47.777420+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:48.777579+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:49.777752+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:50.777904+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:51.778077+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:52.778237+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:53.778376+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:54.778548+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:55.778690+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:56.778845+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:57.778987+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:58.779150+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:59.779280+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:00.779425+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:01.779603+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:02.779833+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:03.779966+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:04.780100+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2983014 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:05.780268+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 186458112 unmapped: 15343616 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:06.797720+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 49.849899292s of 49.877525330s, submitted: 8
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 ms_handle_reset con 0x56524655c800 session 0x56524778b2c0
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0468000/0x0/0x1bfc00000, data 0x6dd6a0e/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187064320 unmapped: 14737408 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:07.797915+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Got map version 54
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187211776 unmapped: 14589952 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:08.798132+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187211776 unmapped: 14589952 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:09.798297+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187219968 unmapped: 14581760 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:10.798465+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187228160 unmapped: 14573568 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:11.798654+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:12.798792+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:13.798997+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:14.799192+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:15.799360+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:16.799595+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:17.799798+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:18.799938+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:19.800116+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187269120 unmapped: 14532608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:20.800298+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187269120 unmapped: 14532608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:21.800526+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187269120 unmapped: 14532608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:22.800718+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187269120 unmapped: 14532608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:23.800882+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187269120 unmapped: 14532608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:24.801065+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187269120 unmapped: 14532608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:25.801178+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187269120 unmapped: 14532608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:26.801330+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187269120 unmapped: 14532608 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:27.801474+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187285504 unmapped: 14516224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:28.801752+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187285504 unmapped: 14516224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:29.801901+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187285504 unmapped: 14516224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:30.802082+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187285504 unmapped: 14516224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:31.802266+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187285504 unmapped: 14516224 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:32.802413+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187293696 unmapped: 14508032 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:33.802541+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187301888 unmapped: 14499840 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:34.802680+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187301888 unmapped: 14499840 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:35.821456+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:36.821684+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187310080 unmapped: 14491648 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:37.821829+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187310080 unmapped: 14491648 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:38.821982+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187310080 unmapped: 14491648 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:39.822220+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187310080 unmapped: 14491648 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:40.822408+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187310080 unmapped: 14491648 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:41.822589+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187310080 unmapped: 14491648 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:42.822737+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187310080 unmapped: 14491648 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:43.822918+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187310080 unmapped: 14491648 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:44.823093+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187318272 unmapped: 14483456 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:45.823244+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187318272 unmapped: 14483456 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:46.823428+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187318272 unmapped: 14483456 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:47.823588+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187318272 unmapped: 14483456 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:48.823756+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187318272 unmapped: 14483456 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:49.823917+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187318272 unmapped: 14483456 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:50.824052+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187318272 unmapped: 14483456 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:51.824243+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187318272 unmapped: 14483456 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:52.824391+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187326464 unmapped: 14475264 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:53.824556+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187326464 unmapped: 14475264 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:54.824696+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187326464 unmapped: 14475264 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:55.824847+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187326464 unmapped: 14475264 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:56.824991+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187334656 unmapped: 14467072 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:57.825161+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187342848 unmapped: 14458880 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:58.825340+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187342848 unmapped: 14458880 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:59.825551+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187342848 unmapped: 14458880 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:00.825722+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187351040 unmapped: 14450688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:01.825921+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187351040 unmapped: 14450688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:02.826068+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187351040 unmapped: 14450688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:03.826219+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187351040 unmapped: 14450688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:04.826388+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187351040 unmapped: 14450688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:05.826533+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187351040 unmapped: 14450688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:06.826685+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187351040 unmapped: 14450688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:07.826843+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187351040 unmapped: 14450688 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:08.826994+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187359232 unmapped: 14442496 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:09.827171+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187359232 unmapped: 14442496 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:10.827388+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187359232 unmapped: 14442496 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:11.827580+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187359232 unmapped: 14442496 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:12.827740+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187359232 unmapped: 14442496 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:13.827895+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187359232 unmapped: 14442496 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:14.828062+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187367424 unmapped: 14434304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:15.828218+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187367424 unmapped: 14434304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:16.828360+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187367424 unmapped: 14434304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:17.828520+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187367424 unmapped: 14434304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:18.828667+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187367424 unmapped: 14434304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:19.828782+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187367424 unmapped: 14434304 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:20.828947+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187375616 unmapped: 14426112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:21.829143+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187375616 unmapped: 14426112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:22.829585+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187375616 unmapped: 14426112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:23.829788+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187375616 unmapped: 14426112 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:24.829954+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187383808 unmapped: 14417920 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:25.830102+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187383808 unmapped: 14417920 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:26.830258+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187383808 unmapped: 14417920 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:27.830403+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187392000 unmapped: 14409728 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:28.830571+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187392000 unmapped: 14409728 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:29.830712+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187392000 unmapped: 14409728 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:30.830839+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187400192 unmapped: 14401536 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:31.831024+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187400192 unmapped: 14401536 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: osd.3 286 heartbeat osd_stat(store_statfs(0x1b0469000/0x0/0x1bfc00000, data 0x6dd6c21/0x6f85000, compress 0x0/0x0/0x0, omap 0x644, meta 0x880f9bc), peers [0,1,2,4,5] op hist [])
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:32.831140+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'config diff' '{prefix=config diff}'
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187383808 unmapped: 14417920 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'config show' '{prefix=config show}'
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'counter dump' '{prefix=counter dump}'
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'counter schema' '{prefix=counter schema}'
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:33.831288+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187260928 unmapped: 14540800 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: tick
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_tickets
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:34.831417+0000)
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: prioritycache tune_memory target: 3561601228 mapped: 187244544 unmapped: 14557184 heap: 201801728 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: bluestore.MempoolThread(0x565243ea3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2982358 data_alloc: 184549376 data_used: 7991296
Dec 02 10:21:05 np0005541913.localdomain ceph-osd[32582]: do_command 'log dump' '{prefix=log dump}'
Dec 02 10:21:06 np0005541913.localdomain podman[240799]: time="2025-12-02T10:21:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 10:21:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:21:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154283 "" "Go-http-client/1.1"
Dec 02 10:21:06 np0005541913.localdomain podman[240799]: @ - - [02/Dec/2025:10:21:06 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18801 "" "Go-http-client/1.1"
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1021344335' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:21:06 np0005541913.localdomain crontab[341265]: (root) LIST (root)
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1199995000' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.49503 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.69671 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: pgmap v827: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.59263 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3955623284' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/265532473' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/557616291' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.69686 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.49518 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.49524 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.69704 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3884556716' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3079201673' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.49539 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1021344335' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/848192841' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1199995000' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1425092102' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 02 10:21:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:06.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:06 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:06.828 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 02 10:21:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:07.089 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:21:07 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:07.097 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:21:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 02 10:21:07 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2279675103' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 02 10:21:07 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 02 10:21:07 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1320677971' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.59278 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.69716 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.49551 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.59290 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.69731 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3237429667' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2279675103' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.59311 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.49569 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/748612833' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2019927326' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1320677971' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/339330554' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.
Dec 02 10:21:08 np0005541913.localdomain podman[341518]: 2025-12-02 10:21:08.421895286 +0000 UTC m=+0.057845251 container health_status 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 02 10:21:08 np0005541913.localdomain podman[341518]: 2025-12-02 10:21:08.428339929 +0000 UTC m=+0.064289884 container exec_died 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Dec 02 10:21:08 np0005541913.localdomain systemd[1]: 31391dff098ec12c4407be6c4a42b061b76ff5e7040461487a3665b0548d6563.service: Deactivated successfully.
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/176858326' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3322998944' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 02 10:21:08 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:08.826 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 02 10:21:08 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2835644307' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: pgmap v828: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.59326 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.49581 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.69761 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.49593 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1381129295' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.59356 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/47451548' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/830079725' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/176858326' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3322998944' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2401852479' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1001684263' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2835644307' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/74853862' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/963341960' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1367499796' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:09.345 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:21:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:09.345 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:21:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:09.346 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:21:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:09.346 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Auditing locally available compute resources for np0005541913.localdomain (node: np0005541913.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 02 10:21:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:09.347 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/846716803' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3628003697' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:21:09 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1240671825' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:21:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:09.781 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:21:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:09.860 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:21:09 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:09.860 281858 DEBUG nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.002 281858 WARNING nova.virt.libvirt.driver [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.004 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Hypervisor/Node resource view: name=np0005541913.localdomain free_ram=10894MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.004 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.004 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3646386249' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.062 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Instance b254bb7f-2891-4b37-9c44-9700e301ce16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.066 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.066 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Final resource view: name=np0005541913.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/550533180' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.114 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.49611 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2776594278' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/963341960' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2033131510' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1367499796' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2067912337' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2812853472' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/144381832' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/538524888' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/846716803' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3628003697' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1240671825' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3134299501' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3941513563' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1189769955' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3646386249' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/550533180' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1679132258' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:44.723844+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.982009888s of 34.994865417s, submitted: 4
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5b0400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76881920 unmapped: 5898240 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 29
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/3550144330,v1:172.18.0.107:6811/3550144330]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/3550144330,v1:172.18.0.107:6811/3550144330]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: get_auth_request con 0x5581cee81000 auth_method 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_configure stats_period=5
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:45.723983+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76890112 unmapped: 5890048 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:46.724115+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 30
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/3550144330,v1:172.18.0.107:6811/3550144330]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76890112 unmapped: 5890048 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:47.724256+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76890112 unmapped: 5890048 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:48.724443+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 31
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/3550144330,v1:172.18.0.107:6811/3550144330]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:49.724598+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:50.724729+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:51.724855+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:52.725022+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:53.725149+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:54.725345+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:55.725457+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:56.725635+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:57.725792+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:58.725948+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:59.726086+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_monmap mon_map magic: 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient:  got monmap 12 from mon.np0005541914 (according to old e12)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: dump:
                                                          epoch 12
                                                          fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                          last_changed 2025-12-02T09:57:29.744140+0000
                                                          created 2025-12-02T07:44:17.411659+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: ms_handle_reset current mon [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _reopen_session rank -1
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _add_conns ranks=[1,0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): picked mon.np0005541912 con 0x5581ce55f400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): picked mon.np0005541914 con 0x5581cee23c00 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): start opening mon connection
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): start opening mon connection
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 ms_handle_reset con 0x5581cee80800 session 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): get_auth_request con 0x5581cee23c00 auth_method 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _init_auth method 2
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _init_auth already have auth, reseting
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): get_auth_request con 0x5581ce55f400 auth_method 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _init_auth method 2
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _init_auth already have auth, reseting
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more payload 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more payload_len 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more payload 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more payload_len 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_done global_id 24199 payload 293
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _finish_hunting 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: found mon.np0005541914
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541914 at v2:172.18.0.108:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _finish_auth 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:56:59.769673+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541914 at v2:172.18.0.108:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_monmap mon_map magic: 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient:  got monmap 12 from mon.np0005541914 (according to old e12)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: dump:
                                                          epoch 12
                                                          fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                          last_changed 2025-12-02T09:57:29.744140+0000
                                                          created 2025-12-02T07:44:17.411659+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_config config(7 keys)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: set_mon_vals no callback set
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 31
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/3550144330,v1:172.18.0.107:6811/3550144330]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:00.726239+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_monmap mon_map magic: 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient:  got monmap 13 from mon.np0005541914 (according to old e13)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: dump:
                                                          epoch 13
                                                          fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                          last_changed 2025-12-02T09:57:30.836166+0000
                                                          created 2025-12-02T07:44:17.411659+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:01.726366+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:02.726511+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:03.726675+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:04.726841+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:05.726991+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:06.727148+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:07.727293+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:08.727474+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_monmap mon_map magic: 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient:  got monmap 14 from mon.np0005541914 (according to old e14)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: dump:
                                                          epoch 14
                                                          fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                          last_changed 2025-12-02T09:57:38.994501+0000
                                                          created 2025-12-02T07:44:17.411659+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
                                                          3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541911
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:09.727575+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:10.727693+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:11.727838+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:12.727932+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:13.728079+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:14.728213+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:15.728396+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:16.728520+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:17.728658+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_monmap mon_map magic: 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient:  got monmap 15 from mon.np0005541914 (according to old e15)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: dump:
                                                          epoch 15
                                                          fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                          last_changed 2025-12-02T09:57:47.906570+0000
                                                          created 2025-12-02T07:44:17.411659+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005541914
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:18.728810+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:19.728939+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:20.729117+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:21.729277+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:22.729435+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:23.729598+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:24.729798+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:25.729953+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:26.730117+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:27.730299+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:28.730480+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:29.730654+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:30.730794+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:31.730969+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:32.731128+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:33.731256+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:34.731419+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:35.731559+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:36.731748+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_monmap mon_map magic: 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient:  got monmap 16 from mon.np0005541914 (according to old e16)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: dump:
                                                          epoch 16
                                                          fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                          last_changed 2025-12-02T09:58:06.915003+0000
                                                          created 2025-12-02T07:44:17.411659+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: mon.np0005541914 at [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] went away
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _reopen_session rank -1
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _add_conns ranks=[1,0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): picked mon.np0005541913 con 0x5581cee80800 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): picked mon.np0005541912 con 0x5581ce55f400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): start opening mon connection
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): start opening mon connection
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _finish_auth 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): get_auth_request con 0x5581cee80800 auth_method 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _init_auth method 2
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _init_auth already have auth, reseting
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more payload 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more payload_len 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_done global_id 24199 payload 293
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _finish_hunting 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: found mon.np0005541913
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _finish_auth 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:36.927781+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: ms_handle_reset current mon [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _reopen_session rank -1
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _add_conns ranks=[0,1]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): picked mon.np0005541912 con 0x5581ce55f400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): picked mon.np0005541913 con 0x5581cee23800 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): start opening mon connection
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): start opening mon connection
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 ms_handle_reset con 0x5581cee80800 session 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): get_auth_request con 0x5581cee23800 auth_method 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _init_auth method 2
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): _init_auth already have auth, reseting
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more payload 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more payload_len 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient(hunting): handle_auth_done global_id 24199 payload 293
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _finish_hunting 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: found mon.np0005541913
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _finish_auth 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:36.934812+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_monmap mon_map magic: 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient:  got monmap 16 from mon.np0005541913 (according to old e16)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: dump:
                                                          epoch 16
                                                          fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                          last_changed 2025-12-02T09:58:06.915003+0000
                                                          created 2025-12-02T07:44:17.411659+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_config config(7 keys)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: set_mon_vals no callback set
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 31
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/3550144330,v1:172.18.0.107:6811/3550144330]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:37.731886+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:38.732066+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:39.732193+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:40.732355+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:41.732535+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:42.732857+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:43.732996+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:44.733123+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:45.733245+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:46.733346+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:47.733488+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:48.733681+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:49.733841+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:50.734006+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:51.734153+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:52.734301+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:53.734449+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:54.734598+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:55.734715+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:56.734894+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_monmap mon_map magic: 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient:  got monmap 17 from mon.np0005541913 (according to old e17)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: dump:
                                                          epoch 17
                                                          fsid c7c8e171-a193-56fb-95fa-8879fcfa7074
                                                          last_changed 2025-12-02T09:58:27.543539+0000
                                                          created 2025-12-02T07:44:17.411659+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005541912
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005541913
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005541914
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:57.735014+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:58.735176+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:57:59.735323+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:00.735486+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:01.735647+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:02.735792+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:03.735934+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:04.736072+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:05.736208+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 715947 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:06.736380+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:07.736498+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:08.736699+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bb74d000/0x0/0x1bfc00000, data 0x185e355/0x18e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76840960 unmapped: 5939200 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:09.736845+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 32
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now 
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/3550144330
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc reconnect No active mgr available yet
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 85.166183472s of 85.190979004s, submitted: 4
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 ms_handle_reset con 0x5581ce5b0400 session 0x5581cee7c3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0e400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:10.736992+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 33
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: get_auth_request con 0x5581cee80800 auth_method 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_configure stats_period=5
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77209600 unmapped: 5570560 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:11.737200+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 34
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77209600 unmapped: 5570560 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:12.737381+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77209600 unmapped: 5570560 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:13.737497+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77209600 unmapped: 5570560 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:14.737689+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 35
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77217792 unmapped: 5562368 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:15.737831+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 36
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:16.738008+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:17.738142+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:18.738316+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:19.738492+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:20.738663+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:21.738792+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:22.738897+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:23.739033+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:24.739162+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:25.739320+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:26.739448+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:27.739601+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:28.739853+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:29.739988+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:30.740176+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:31.740342+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:32.740484+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:33.740670+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76791808 unmapped: 5988352 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:34.740811+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 37
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2383186409,v1:172.18.0.106:6811/2383186409]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:35.740955+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:36.741081+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:37.741222+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:38.741395+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:39.741525+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:40.741688+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:41.741834+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:42.741995+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:43.742157+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:44.742311+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:45.742541+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:46.742687+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:47.742831+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:48.743024+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:49.743170+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:50.743319+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:51.743486+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:52.743670+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:53.743813+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:54.743983+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:55.744151+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:56.744307+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:57.744449+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:58.744677+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:58:59.744809+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:00.744981+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:01.745110+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:02.745326+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:03.745475+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:04.745597+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:05.745781+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:06.745920+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:07.746037+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:08.746227+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:09.746356+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:10.746532+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:11.746679+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:12.746902+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:13.747098+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:14.747275+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:15.747446+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:16.747593+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:17.747781+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:18.747959+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:19.748103+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:20.748255+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:21.748394+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:22.748523+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:23.748662+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:24.748767+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:25.748872+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:26.749086+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:27.749234+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:28.749444+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:29.749587+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:30.749686+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:31.749882+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:32.750050+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:33.750172+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:34.750334+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:35.750482+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 76947456 unmapped: 5832704 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:36.750687+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 719179 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bb749000/0x0/0x1bfc00000, data 0x1860877/0x18e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 86.818717957s of 86.839935303s, submitted: 6
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 38
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now 
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/2383186409
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc reconnect No active mgr available yet
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 handle_osd_map epochs [91,91], i have 91, src has [1,91]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 ms_handle_reset con 0x5581cea0e400 session 0x5581ce084d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77094912 unmapped: 5685248 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce55f400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:37.750808+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 39
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: get_auth_request con 0x5581cc7c2c00 auth_method 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_configure stats_period=5
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77234176 unmapped: 5545984 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:38.750959+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77234176 unmapped: 5545984 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:39.751083+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77250560 unmapped: 5529600 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:40.751230+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77250560 unmapped: 5529600 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 41
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:41.752021+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77250560 unmapped: 5529600 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 42
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:42.752181+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:43.752346+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:44.752508+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:45.752691+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:46.752889+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:47.753170+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:48.753645+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:49.754355+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:50.755231+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:51.755967+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:52.756901+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:53.757858+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:54.758398+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77127680 unmapped: 5652480 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:55.758654+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:56.758858+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:57.759020+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:58.759215+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T09:59:59.759886+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:00.760536+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:01.760988+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:02.761377+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:03.788760+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:04.789206+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:05.789520+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:06.789908+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:07.790353+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:08.790645+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:09.790856+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:10.791116+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:11.791268+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:12.791540+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:13.791826+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:14.791984+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:15.792105+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:16.792278+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77029376 unmapped: 5750784 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:17.792437+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:18.792699+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:19.792915+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:20.793452+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:21.793586+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:22.793833+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:23.794016+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:24.794231+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:25.794451+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:26.794668+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:27.794819+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:28.795064+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:29.795275+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:30.795420+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:31.795558+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77037568 unmapped: 5742592 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:32.795682+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:33.796004+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:34.796209+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:35.796439+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:36.796752+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:37.797019+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:38.797241+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:39.798909+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:40.799081+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:41.799307+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:42.799430+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:43.799676+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:44.799808+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:45.799989+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:46.800132+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:47.800253+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77045760 unmapped: 5734400 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:48.800530+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:49.800696+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:50.800846+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:51.801002+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:52.801135+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:53.801289+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:54.801460+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:55.801596+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:56.801854+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:57.802173+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77053952 unmapped: 5726208 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:58.802384+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77062144 unmapped: 5718016 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:00:59.802544+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77062144 unmapped: 5718016 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:00.802726+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77062144 unmapped: 5718016 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:01.802865+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721975 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77062144 unmapped: 5718016 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb746000/0x0/0x1bfc00000, data 0x1862ef1/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:02.803020+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77062144 unmapped: 5718016 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:03.803193+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77062144 unmapped: 5718016 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:04.803336+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77062144 unmapped: 5718016 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:05.803521+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77062144 unmapped: 5718016 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:06.803685+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 90.124633789s of 90.144912720s, submitted: 7
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721063 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77103104 unmapped: 5677056 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb747000/0x0/0x1bfc00000, data 0x186300b/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:07.803845+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77103104 unmapped: 5677056 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:08.804054+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 43
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb747000/0x0/0x1bfc00000, data 0x186300b/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:09.804178+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:10.804318+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:11.804446+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721063 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:12.804674+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:13.804832+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb747000/0x0/0x1bfc00000, data 0x186300b/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:14.804979+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:15.805103+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:16.805268+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721063 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb747000/0x0/0x1bfc00000, data 0x186300b/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:17.805380+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:18.805533+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:19.805698+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:20.805880+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:21.806037+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 721063 data_alloc: 184549376 data_used: 126976
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb747000/0x0/0x1bfc00000, data 0x186300b/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:22.806169+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bb747000/0x0/0x1bfc00000, data 0x186300b/0x18e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77119488 unmapped: 5660672 heap: 82780160 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5b0400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:23.806315+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.872501373s of 16.882905960s, submitted: 3
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77201408 unmapped: 16072704 heap: 93274112 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:24.806447+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 77291520 unmapped: 15982592 heap: 93274112 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:25.806593+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 92 ms_handle_reset con 0x5581ce5b0400 session 0x5581cce8f860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78430208 unmapped: 14843904 heap: 93274112 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 92 heartbeat osd_stat(store_statfs(0x1baf41000/0x0/0x1bfc00000, data 0x2065261/0x20ec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:26.806761+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841442 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78536704 unmapped: 15785984 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:27.806876+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:28.807108+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:29.807222+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:30.807352+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:31.807537+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:32.807707+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:33.807893+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:34.808126+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:35.808289+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:36.808414+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:37.808533+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:38.808718+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:39.808885+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:40.809053+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:41.809195+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets getting new tickets!
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:42.809473+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _finish_auth 0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:42.810634+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:43.809639+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:44.809776+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:45.809919+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:46.810120+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:47.810341+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:48.810553+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:49.810729+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:50.810909+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:51.811078+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:52.811261+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:53.811431+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:54.811641+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:55.811788+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:56.811927+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:57.812099+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:58.812467+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:01:59.814269+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:00.815868+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:01.816016+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:02.816326+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:03.816473+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:04.816883+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:05.817142+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:06.817572+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:07.817698+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:08.818219+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:09.818363+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:10.818581+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:11.818771+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:12.818907+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:13.819039+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:14.819240+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:15.819374+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:16.819523+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843790 data_alloc: 184549376 data_used: 135168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:17.819651+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1ba73e000/0x0/0x1bfc00000, data 0x2867471/0x28ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:18.819859+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78553088 unmapped: 15769600 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:19.820090+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee23c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 55.974792480s of 56.213272095s, submitted: 36
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78561280 unmapped: 15761408 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:20.820258+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 ms_handle_reset con 0x5581cee23c00 session 0x5581cc16c000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 78594048 unmapped: 15728640 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 ms_handle_reset con 0x5581cee80800 session 0x5581cce8fc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:21.820398+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 843070 data_alloc: 184549376 data_used: 139264
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee81000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 79642624 unmapped: 14680064 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:22.820557+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 ms_handle_reset con 0x5581cee81000 session 0x5581ce084000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5b0400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 ms_handle_reset con 0x5581ce5b0400 session 0x5581cf84c960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 79929344 unmapped: 14393344 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:23.820941+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9402000/0x0/0x1bfc00000, data 0x37a34d3/0x382c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 79896576 unmapped: 14426112 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:24.821078+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9402000/0x0/0x1bfc00000, data 0x37a34d3/0x382c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 79896576 unmapped: 14426112 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:25.821229+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9402000/0x0/0x1bfc00000, data 0x37a34d3/0x382c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0e400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 ms_handle_reset con 0x5581cea0e400 session 0x5581cf84cb40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 79921152 unmapped: 14401536 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee23c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:26.821369+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9401000/0x0/0x1bfc00000, data 0x37a34e3/0x382d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 94 ms_handle_reset con 0x5581cee23c00 session 0x5581cf84d2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 975516 data_alloc: 184549376 data_used: 147456
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 81092608 unmapped: 13230080 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:27.821470+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 81092608 unmapped: 13230080 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:28.821676+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 95 heartbeat osd_stat(store_statfs(0x1b93fe000/0x0/0x1bfc00000, data 0x37a56f3/0x3830000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80928768 unmapped: 13393920 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:29.821820+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 95 ms_handle_reset con 0x5581cee80000 session 0x5581cf84d4a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80986112 unmapped: 13336576 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:30.821974+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80986112 unmapped: 13336576 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:31.822119+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 976465 data_alloc: 184549376 data_used: 155648
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80986112 unmapped: 13336576 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:32.822239+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 95 heartbeat osd_stat(store_statfs(0x1b93fa000/0x0/0x1bfc00000, data 0x37a7947/0x3833000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80994304 unmapped: 13328384 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:33.822320+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80994304 unmapped: 13328384 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:34.822458+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80994304 unmapped: 13328384 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:35.822604+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.296033859s of 15.826134682s, submitted: 114
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80994304 unmapped: 13328384 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:36.822770+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 979437 data_alloc: 184549376 data_used: 155648
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 80994304 unmapped: 13328384 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:37.822925+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 87474176 unmapped: 6848512 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:38.823125+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 heartbeat osd_stat(store_statfs(0x1b78ed000/0x0/0x1bfc00000, data 0x4114a3d/0x41a1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 89407488 unmapped: 4915200 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:39.823273+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88260608 unmapped: 6062080 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:40.823420+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88260608 unmapped: 6062080 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:41.823591+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1109117 data_alloc: 184549376 data_used: 155648
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88268800 unmapped: 6053888 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:42.823774+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 heartbeat osd_stat(store_statfs(0x1b7125000/0x0/0x1bfc00000, data 0x48dca3d/0x4969000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce02ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581ce02ec00 session 0x5581ce084d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5b0400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581ce5b0400 session 0x5581ce4890e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0e400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581cea0e400 session 0x5581cc3ab2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88301568 unmapped: 6021120 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:43.823943+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee23c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581cee23c00 session 0x5581cc62c960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581cee80000 session 0x5581cee7cb40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88326144 unmapped: 5996544 heap: 94322688 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:44.824101+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc6e1800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581cc6e1800 session 0x5581cee7d0e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc6e1800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581cc6e1800 session 0x5581cee7d860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88039424 unmapped: 11624448 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:45.824237+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.810575485s of 10.573578835s, submitted: 189
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88047616 unmapped: 11616256 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:46.824431+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1171144 data_alloc: 184549376 data_used: 159744
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88047616 unmapped: 11616256 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:47.824727+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88047616 unmapped: 11616256 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:48.824939+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 heartbeat osd_stat(store_statfs(0x1b6a17000/0x0/0x1bfc00000, data 0x4fe9a9f/0x5077000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 87711744 unmapped: 11952128 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:49.825096+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 87711744 unmapped: 11952128 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:50.825262+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 87711744 unmapped: 11952128 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:51.825407+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5b0400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581cf326000 session 0x5581cf84d680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 ms_handle_reset con 0x5581cee80800 session 0x5581cc16c5a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1179933 data_alloc: 184549376 data_used: 778240
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 87531520 unmapped: 12132352 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:52.825578+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 heartbeat osd_stat(store_statfs(0x1b69f1000/0x0/0x1bfc00000, data 0x500fa9f/0x509d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:53.825780+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 88317952 unmapped: 11345920 heap: 99663872 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0e400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 97 ms_handle_reset con 0x5581cea0e400 session 0x5581cc5b2b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee23c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 97 ms_handle_reset con 0x5581cee23c00 session 0x5581cc3aa3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc6e1800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:54.826045+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 94855168 unmapped: 7995392 heap: 102850560 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 97 ms_handle_reset con 0x5581cc6e1800 session 0x5581cd3770e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0e400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:55.826366+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 94126080 unmapped: 14581760 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 97 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 98 ms_handle_reset con 0x5581cea0e400 session 0x5581cc3a92c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 98 heartbeat osd_stat(store_statfs(0x1b5a9c000/0x0/0x1bfc00000, data 0x5f63caf/0x5ff2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.624345779s of 10.015730858s, submitted: 66
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:56.826554+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 94142464 unmapped: 14565376 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 99 ms_handle_reset con 0x5581cee80800 session 0x5581cee7cb40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1352119 data_alloc: 184549376 data_used: 4124672
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:57.826663+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 94142464 unmapped: 14565376 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 99 heartbeat osd_stat(store_statfs(0x1b5a92000/0x0/0x1bfc00000, data 0x5f681a1/0x5ffa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:58.826819+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 94142464 unmapped: 14565376 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 99 ms_handle_reset con 0x5581cf326000 session 0x5581cee7d860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:02:59.826960+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 94191616 unmapped: 14516224 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 99 ms_handle_reset con 0x5581cee80000 session 0x5581cd399c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc6e1800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 99 ms_handle_reset con 0x5581cc6e1800 session 0x5581cc4f23c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 99 heartbeat osd_stat(store_statfs(0x1b5a92000/0x0/0x1bfc00000, data 0x5f681a1/0x5ffa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:00.827136+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 90619904 unmapped: 18087936 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:01.827298+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 90628096 unmapped: 18079744 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1343893 data_alloc: 184549376 data_used: 4124672
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:02.827450+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 90628096 unmapped: 18079744 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0e400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581cea0e400 session 0x5581cc4f21e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:03.827639+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 91725824 unmapped: 16982016 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 heartbeat osd_stat(store_statfs(0x1b5a90000/0x0/0x1bfc00000, data 0x5f6a297/0x5ffd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:04.827789+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 93102080 unmapped: 15605760 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:05.828102+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96772096 unmapped: 11935744 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.657964706s of 10.196088791s, submitted: 181
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:06.828292+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96051200 unmapped: 12656640 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1225149 data_alloc: 184549376 data_used: 5615616
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:07.828418+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96051200 unmapped: 12656640 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:08.828605+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96051200 unmapped: 12656640 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 heartbeat osd_stat(store_statfs(0x1b6bf4000/0x0/0x1bfc00000, data 0x4e02235/0x4e94000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:09.828775+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581cee80800 session 0x5581cc62d2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96051200 unmapped: 12656640 heap: 108707840 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581cf326000 session 0x5581ce295860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4eb400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581ce4eb400 session 0x5581ce461e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc6e1800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581cc6e1800 session 0x5581cc3a81e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0e400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581cea0e400 session 0x5581cc3a83c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581cee80800 session 0x5581cc3a9c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:10.828916+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 116121600 unmapped: 3350528 heap: 119472128 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581cf326000 session 0x5581ce0952c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:11.829343+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103768064 unmapped: 17801216 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 heartbeat osd_stat(store_statfs(0x1b58a8000/0x0/0x1bfc00000, data 0x6154235/0x61e6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1401732 data_alloc: 184549376 data_used: 12615680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:12.829586+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103776256 unmapped: 17793024 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:13.829803+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581ce5b0400 session 0x5581cee7d4a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103940096 unmapped: 17629184 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc6e1800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 heartbeat osd_stat(store_statfs(0x1b5882000/0x0/0x1bfc00000, data 0x617b1d3/0x620c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:14.829983+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 heartbeat osd_stat(store_statfs(0x1b5882000/0x0/0x1bfc00000, data 0x617b1d3/0x620c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0e400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 ms_handle_reset con 0x5581cea0e400 session 0x5581cce8fe00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104005632 unmapped: 17563648 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 101 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 101 ms_handle_reset con 0x5581cf329c00 session 0x5581ce489860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:15.830415+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98852864 unmapped: 22716416 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:16.830907+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98852864 unmapped: 22716416 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1273630 data_alloc: 184549376 data_used: 5709824
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:17.831283+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98852864 unmapped: 22716416 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:18.831726+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98852864 unmapped: 22716416 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:19.831861+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98852864 unmapped: 22716416 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.029957771s of 13.542131424s, submitted: 97
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 101 heartbeat osd_stat(store_statfs(0x1b67d0000/0x0/0x1bfc00000, data 0x522b437/0x52be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:20.831996+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95928320 unmapped: 25640960 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 102 ms_handle_reset con 0x5581cee80800 session 0x5581cc3ae960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:21.832175+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95928320 unmapped: 25640960 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1076532 data_alloc: 184549376 data_used: 278528
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:22.832314+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95928320 unmapped: 25640960 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:23.832468+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95928320 unmapped: 25640960 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:24.832629+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95928320 unmapped: 25640960 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:25.832770+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95903744 unmapped: 25665536 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 102 heartbeat osd_stat(store_statfs(0x1b7e0b000/0x0/0x1bfc00000, data 0x3bf04cb/0x3c83000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:26.832929+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95461376 unmapped: 26107904 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1083232 data_alloc: 184549376 data_used: 380928
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:27.833072+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95510528 unmapped: 26058752 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:28.833383+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 102 ms_handle_reset con 0x5581cc6e1800 session 0x5581cd398d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95379456 unmapped: 26189824 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:29.833536+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95395840 unmapped: 26173440 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.759989738s of 10.007678986s, submitted: 90
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:30.833743+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100917248 unmapped: 20652032 heap: 121569280 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 103 ms_handle_reset con 0x5581cf326000 session 0x5581cc67de00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffe000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:31.833935+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 97206272 unmapped: 35160064 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 103 heartbeat osd_stat(store_statfs(0x1b7688000/0x0/0x1bfc00000, data 0x43716db/0x4405000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 104 ms_handle_reset con 0x5581ceffe000 session 0x5581ce4612c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 104 heartbeat osd_stat(store_statfs(0x1b64fa000/0x0/0x1bfc00000, data 0x55006db/0x5594000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:32.834120+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1293906 data_alloc: 184549376 data_used: 495616
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 97312768 unmapped: 35053568 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:33.834603+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 35037184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:34.834838+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 97345536 unmapped: 35020800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc6e1800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 105 ms_handle_reset con 0x5581cc6e1800 session 0x5581ce0812c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 105 heartbeat osd_stat(store_statfs(0x1b7848000/0x0/0x1bfc00000, data 0x41aebcd/0x4246000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:35.834996+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96428032 unmapped: 35938304 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 105 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:36.835456+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96452608 unmapped: 35913728 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:37.835750+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1146460 data_alloc: 184549376 data_used: 221184
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96452608 unmapped: 35913728 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:38.836439+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 106 heartbeat osd_stat(store_statfs(0x1b7868000/0x0/0x1bfc00000, data 0x418ccc3/0x4225000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96452608 unmapped: 35913728 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:39.836746+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96468992 unmapped: 35897344 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee80800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.057348251s of 10.022316933s, submitted: 105
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:40.837944+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 106 ms_handle_reset con 0x5581cee80800 session 0x5581ce081c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108290048 unmapped: 24076288 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:41.838163+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108290048 unmapped: 24076288 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:42.838334+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1287417 data_alloc: 184549376 data_used: 9265152
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108453888 unmapped: 23912448 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:43.840306+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108462080 unmapped: 23904256 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 106 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 106 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 106 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 107 ms_handle_reset con 0x5581cf326000 session 0x5581ccdd8780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:44.840494+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 107 heartbeat osd_stat(store_statfs(0x1b857a000/0x0/0x1bfc00000, data 0x3478f27/0x3513000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 97910784 unmapped: 34455552 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 107 heartbeat osd_stat(store_statfs(0x1b857a000/0x0/0x1bfc00000, data 0x3478f27/0x3513000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:45.840814+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99966976 unmapped: 32399360 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:46.841062+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99966976 unmapped: 32399360 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:47.841239+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1097895 data_alloc: 184549376 data_used: 4427776
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99966976 unmapped: 32399360 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 107 ms_handle_reset con 0x5581cf329c00 session 0x5581cd376960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffe400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:48.841390+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 107 ms_handle_reset con 0x5581ceffe400 session 0x5581cc16d0e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98353152 unmapped: 34013184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 107 heartbeat osd_stat(store_statfs(0x1b916f000/0x0/0x1bfc00000, data 0x2884ec5/0x291e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:49.841517+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98418688 unmapped: 33947648 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.430244446s of 10.046833992s, submitted: 135
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:50.841701+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96280576 unmapped: 36085760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:51.841878+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96346112 unmapped: 36020224 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:52.842164+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1024995 data_alloc: 184549376 data_used: 241664
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c02000/0x0/0x1bfc00000, data 0x2df0fbb/0x2e8b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96346112 unmapped: 36020224 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:53.842440+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96387072 unmapped: 35979264 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:54.842705+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96387072 unmapped: 35979264 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:55.842889+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 96387072 unmapped: 35979264 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:56.843063+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c02000/0x0/0x1bfc00000, data 0x2df0fbb/0x2e8b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95805440 unmapped: 36560896 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:57.843253+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1027254 data_alloc: 184549376 data_used: 241664
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95608832 unmapped: 36757504 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:58.843477+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95608832 unmapped: 36757504 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:03:59.843700+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95608832 unmapped: 36757504 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:00.844186+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:01.844393+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:02.844556+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1027254 data_alloc: 184549376 data_used: 241664
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:03.844734+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:04.844872+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:05.845029+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:06.845169+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:07.845516+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1027254 data_alloc: 184549376 data_used: 241664
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:08.845702+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:09.845826+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:10.845979+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:11.846136+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:12.846252+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1027254 data_alloc: 184549376 data_used: 241664
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:13.846360+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:14.846549+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 95617024 unmapped: 36749312 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 24.874217987s of 24.979799271s, submitted: 25
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8c01000/0x0/0x1bfc00000, data 0x2df0fee/0x2e8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:15.846701+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100335616 unmapped: 32030720 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:16.846854+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100335616 unmapped: 32030720 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:17.846999+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1149506 data_alloc: 184549376 data_used: 245760
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99377152 unmapped: 32989184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:18.847163+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99377152 unmapped: 32989184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:19.847309+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99377152 unmapped: 32989184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:20.847480+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b7cd1000/0x0/0x1bfc00000, data 0x3d20fee/0x3dbd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99385344 unmapped: 32980992 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:21.847669+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99385344 unmapped: 32980992 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:22.847812+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1149506 data_alloc: 184549376 data_used: 245760
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99385344 unmapped: 32980992 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:23.847964+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98705408 unmapped: 33660928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:24.848105+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98713600 unmapped: 33652736 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 ms_handle_reset con 0x5581ceffec00 session 0x5581cc16dc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:25.848235+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.438953400s of 10.872389793s, submitted: 113
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98754560 unmapped: 33611776 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b916c000/0x0/0x1bfc00000, data 0x2886fde/0x2922000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 ms_handle_reset con 0x5581cefff000 session 0x5581ce085c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:26.848428+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98779136 unmapped: 33587200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:27.848720+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 990853 data_alloc: 184549376 data_used: 241664
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98779136 unmapped: 33587200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:28.848949+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b916c000/0x0/0x1bfc00000, data 0x2886fbb/0x2921000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98779136 unmapped: 33587200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:29.849091+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 98779136 unmapped: 33587200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:30.849212+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99123200 unmapped: 33243136 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b8a03000/0x0/0x1bfc00000, data 0x2fef02d/0x308b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:31.849353+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 109 ms_handle_reset con 0x5581cefff400 session 0x5581cce8fa40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99131392 unmapped: 33234944 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:32.849474+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1065491 data_alloc: 184549376 data_used: 249856
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99221504 unmapped: 33144832 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:33.849635+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99221504 unmapped: 33144832 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:34.849815+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99221504 unmapped: 33144832 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:35.850015+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b89fd000/0x0/0x1bfc00000, data 0x2ff12a6/0x3091000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.286343575s of 10.023146629s, submitted: 79
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 99221504 unmapped: 33144832 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:36.850178+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b89fe000/0x0/0x1bfc00000, data 0x2ff1283/0x3090000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100286464 unmapped: 32079872 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 110 ms_handle_reset con 0x5581cefff800 session 0x5581cc62d0e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:37.850313+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1069069 data_alloc: 184549376 data_used: 258048
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100286464 unmapped: 32079872 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:38.850497+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100286464 unmapped: 32079872 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:39.850712+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100286464 unmapped: 32079872 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:40.850950+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 110 heartbeat osd_stat(store_statfs(0x1b89f9000/0x0/0x1bfc00000, data 0x2ff34c4/0x3093000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 110 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100401152 unmapped: 31965184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:41.851182+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100417536 unmapped: 31948800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:42.851360+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1070939 data_alloc: 184549376 data_used: 262144
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100417536 unmapped: 31948800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:43.851529+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100417536 unmapped: 31948800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:44.851706+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b89f8000/0x0/0x1bfc00000, data 0x2ff55ba/0x3096000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100417536 unmapped: 31948800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:45.851860+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100417536 unmapped: 31948800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:46.852004+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 100417536 unmapped: 31948800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:47.852159+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b89f8000/0x0/0x1bfc00000, data 0x2ff55ba/0x3096000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.881049156s of 11.994025230s, submitted: 39
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1071011 data_alloc: 184549376 data_used: 262144
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103366656 unmapped: 28999680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:48.852332+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b8084000/0x0/0x1bfc00000, data 0x39695ba/0x3a0a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 101851136 unmapped: 30515200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:49.852450+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102858752 unmapped: 29507584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:50.852601+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b7d20000/0x0/0x1bfc00000, data 0x3ccd5ba/0x3d6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102858752 unmapped: 29507584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:51.852790+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102858752 unmapped: 29507584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:52.852946+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1178039 data_alloc: 184549376 data_used: 262144
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102875136 unmapped: 29491200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:53.853128+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b7d20000/0x0/0x1bfc00000, data 0x3ccd5ba/0x3d6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102875136 unmapped: 29491200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:54.853291+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102875136 unmapped: 29491200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:55.853476+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102907904 unmapped: 29458432 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b7cfb000/0x0/0x1bfc00000, data 0x3cf25ba/0x3d93000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:56.853662+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102907904 unmapped: 29458432 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:57.853794+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1175495 data_alloc: 184549376 data_used: 266240
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102907904 unmapped: 29458432 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:58.853964+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102907904 unmapped: 29458432 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:04:59.854118+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b7cfb000/0x0/0x1bfc00000, data 0x3cf25ba/0x3d93000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.635066986s of 12.118112564s, submitted: 135
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102907904 unmapped: 29458432 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:00.854260+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:01.854431+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:02.854574+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1018947 data_alloc: 184549376 data_used: 258048
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:03.854727+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b88ca000/0x0/0x1bfc00000, data 0x288d525/0x292b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:04.854900+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b88ca000/0x0/0x1bfc00000, data 0x288d525/0x292b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:05.855079+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:06.855251+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:07.855379+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1018947 data_alloc: 184549376 data_used: 258048
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:08.855544+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b88ca000/0x0/0x1bfc00000, data 0x288d525/0x292b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:09.855740+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:10.855951+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:11.856862+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:12.857049+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1018947 data_alloc: 184549376 data_used: 258048
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102924288 unmapped: 29442048 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:13.857213+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102924288 unmapped: 29442048 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b88ca000/0x0/0x1bfc00000, data 0x288d525/0x292b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:14.857384+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102924288 unmapped: 29442048 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:15.857727+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102924288 unmapped: 29442048 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:16.857880+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102924288 unmapped: 29442048 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefffc00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.191143036s of 17.366828918s, submitted: 50
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:17.858068+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1026462 data_alloc: 184549376 data_used: 266240
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 112 ms_handle_reset con 0x5581cefffc00 session 0x5581ce6dbc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102957056 unmapped: 29409280 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:18.859061+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102957056 unmapped: 29409280 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:19.859170+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102989824 unmapped: 29376512 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 113 ms_handle_reset con 0x5581ceffec00 session 0x5581ce6db680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 113 heartbeat osd_stat(store_statfs(0x1b915a000/0x0/0x1bfc00000, data 0x2891999/0x2932000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:20.859334+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103792640 unmapped: 28573696 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 113 heartbeat osd_stat(store_statfs(0x1b915a000/0x0/0x1bfc00000, data 0x2891999/0x2932000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:21.859673+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103792640 unmapped: 28573696 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:22.859840+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1028118 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103792640 unmapped: 28573696 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:23.860020+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103792640 unmapped: 28573696 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 113 heartbeat osd_stat(store_statfs(0x1b915a000/0x0/0x1bfc00000, data 0x2891999/0x2932000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:24.860357+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103792640 unmapped: 28573696 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 113 heartbeat osd_stat(store_statfs(0x1b915a000/0x0/0x1bfc00000, data 0x2891999/0x2932000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:25.860621+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103792640 unmapped: 28573696 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:26.860779+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:27.860937+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1029991 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:28.861185+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:29.861370+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:30.861597+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:31.861820+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:32.862061+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1029991 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:33.862273+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:34.862478+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102793216 unmapped: 29573120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:35.862688+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 29564928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:36.862825+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 29564928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:37.863018+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1029991 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 29564928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:38.863222+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 29564928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:39.863392+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 29564928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:40.863566+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 29564928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:41.863704+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 29564928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:42.863878+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1029991 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 29564928 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:43.864079+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102817792 unmapped: 29548544 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:44.864239+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102817792 unmapped: 29548544 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:45.864407+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102817792 unmapped: 29548544 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:46.864821+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102817792 unmapped: 29548544 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:47.864969+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1029991 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102817792 unmapped: 29548544 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:48.865081+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102817792 unmapped: 29548544 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:49.865218+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102817792 unmapped: 29548544 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:50.865354+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102817792 unmapped: 29548544 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:51.865454+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:52.865586+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1029991 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:53.865763+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:54.865912+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:55.866051+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:56.866201+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:57.866397+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1029991 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:58.866592+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:05:59.866783+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:00.866921+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:01.867058+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:02.867221+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1029991 data_alloc: 184549376 data_used: 274432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:03.867368+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9158000/0x0/0x1bfc00000, data 0x2893a8f/0x2935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:04.867500+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:05.867635+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102825984 unmapped: 29540352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.009002686s of 49.154537201s, submitted: 52
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:06.867780+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102850560 unmapped: 29515776 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 ms_handle_reset con 0x5581cefff400 session 0x5581ce6daf00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:07.867942+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102858752 unmapped: 29507584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1035087 data_alloc: 184549376 data_used: 282624
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9154000/0x0/0x1bfc00000, data 0x289609f/0x2939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:08.868329+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102858752 unmapped: 29507584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:09.868547+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102875136 unmapped: 29491200 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:10.869209+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102916096 unmapped: 29450240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:11.869314+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 102957056 unmapped: 29409280 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 116 handle_osd_map epochs [116,117], i have 116, src has [1,117]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 117 ms_handle_reset con 0x5581cea0ec00 session 0x5581ceedb2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 117 heartbeat osd_stat(store_statfs(0x1b654c000/0x0/0x1bfc00000, data 0x509a508/0x5141000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 117 ms_handle_reset con 0x5581cea0f800 session 0x5581ce094f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:12.869464+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111386624 unmapped: 20979712 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5b0400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 117 ms_handle_reset con 0x5581ce5b0400 session 0x5581ce6dba40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1375977 data_alloc: 184549376 data_used: 290816
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:13.869648+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103030784 unmapped: 29335552 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 118 ms_handle_reset con 0x5581cea0ec00 session 0x5581ce4881e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:14.869800+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103112704 unmapped: 29253632 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 119 ms_handle_reset con 0x5581cefff800 session 0x5581ce6dad20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 119 heartbeat osd_stat(store_statfs(0x1b5543000/0x0/0x1bfc00000, data 0x609ecb5/0x6149000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 119 ms_handle_reset con 0x5581cea0f800 session 0x5581ce4601e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:15.869948+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103170048 unmapped: 29196288 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 119 heartbeat osd_stat(store_statfs(0x1b5543000/0x0/0x1bfc00000, data 0x609ecb5/0x6149000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:16.870095+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103194624 unmapped: 29171712 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.844464302s of 10.419473648s, submitted: 98
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 120 ms_handle_reset con 0x5581ceffec00 session 0x5581d07f8f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:17.870290+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103366656 unmapped: 28999680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 120 heartbeat osd_stat(store_statfs(0x1b8d40000/0x0/0x1bfc00000, data 0x28a1071/0x294c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 120 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 120 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 120 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1065698 data_alloc: 184549376 data_used: 290816
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 121 ms_handle_reset con 0x5581cefff400 session 0x5581cd3985a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:18.870519+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103440384 unmapped: 28925952 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:19.870903+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103456768 unmapped: 28909568 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 122 ms_handle_reset con 0x5581cea0ec00 session 0x5581cee7d4a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:20.871237+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103514112 unmapped: 28852224 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 123 heartbeat osd_stat(store_statfs(0x1b8d36000/0x0/0x1bfc00000, data 0x28a73f9/0x2957000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 123 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 123 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:21.871557+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103587840 unmapped: 28778496 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:22.871690+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103636992 unmapped: 28729344 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1080018 data_alloc: 184549376 data_used: 290816
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 125 ms_handle_reset con 0x5581cea0f800 session 0x5581cee7cf00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:23.871822+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103636992 unmapped: 28729344 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:24.871978+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103710720 unmapped: 28655616 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:25.872150+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103710720 unmapped: 28655616 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 126 handle_osd_map epochs [125,127], i have 126, src has [1,127]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:26.872345+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103759872 unmapped: 28606464 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 127 ms_handle_reset con 0x5581ceffec00 session 0x5581cee7cb40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 127 heartbeat osd_stat(store_statfs(0x1b8d25000/0x0/0x1bfc00000, data 0x28afd55/0x2968000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.826321602s of 10.175097466s, submitted: 167
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:27.872484+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103784448 unmapped: 28581888 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1087590 data_alloc: 184549376 data_used: 290816
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:28.872821+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 128 heartbeat osd_stat(store_statfs(0x1b8d20000/0x0/0x1bfc00000, data 0x28b1f67/0x296b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103784448 unmapped: 28581888 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:29.872939+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103817216 unmapped: 28549120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:30.873106+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103817216 unmapped: 28549120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:31.873248+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103849984 unmapped: 28516352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 130 ms_handle_reset con 0x5581cefff800 session 0x5581cee7c5a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:32.873453+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103751680 unmapped: 28614656 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 130 heartbeat osd_stat(store_statfs(0x1b8d1a000/0x0/0x1bfc00000, data 0x28b62b9/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1094814 data_alloc: 184549376 data_used: 290816
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:33.873684+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103817216 unmapped: 28549120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:34.873846+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea09400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 131 ms_handle_reset con 0x5581cea09400 session 0x5581cc16c780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 103817216 unmapped: 28549120 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:35.874055+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104882176 unmapped: 27484160 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:36.874190+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 132 ms_handle_reset con 0x5581cea0ec00 session 0x5581cc16da40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104906752 unmapped: 27459584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 132 heartbeat osd_stat(store_statfs(0x1b8d14000/0x0/0x1bfc00000, data 0x28ba791/0x2979000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:37.874441+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104906752 unmapped: 27459584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1098702 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:38.874694+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104906752 unmapped: 27459584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:39.874885+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104906752 unmapped: 27459584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 8033 writes, 34K keys, 8033 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 8033 writes, 1906 syncs, 4.21 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3169 writes, 12K keys, 3169 commit groups, 1.0 writes per commit group, ingest: 12.35 MB, 0.02 MB/s
                                                          Interval WAL: 3169 writes, 1306 syncs, 2.43 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:40.875177+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104906752 unmapped: 27459584 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.049814224s of 14.190037727s, submitted: 72
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:41.875347+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d15000/0x0/0x1bfc00000, data 0x28ba781/0x2978000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:42.875505+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1101676 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:43.875683+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:44.875916+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:45.876095+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d12000/0x0/0x1bfc00000, data 0x28bc897/0x297b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:46.876256+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:47.876450+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 ms_handle_reset con 0x5581cea0f800 session 0x5581cc16c5a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1104052 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:48.876649+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d12000/0x0/0x1bfc00000, data 0x28bc8f9/0x297c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:49.876811+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 104939520 unmapped: 27426816 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:50.876952+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 ms_handle_reset con 0x5581ceffec00 session 0x5581ce2a3e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105037824 unmapped: 27328512 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d12000/0x0/0x1bfc00000, data 0x28bc8f9/0x297c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:51.879738+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105037824 unmapped: 27328512 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:52.879882+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105037824 unmapped: 27328512 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1102610 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.119197845s of 12.268100739s, submitted: 46
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:53.880041+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 ms_handle_reset con 0x5581cefff800 session 0x5581ce7ae3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105046016 unmapped: 27320320 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:54.880290+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105046016 unmapped: 27320320 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d13000/0x0/0x1bfc00000, data 0x28bc897/0x297b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:55.880438+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105046016 unmapped: 27320320 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb14800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 ms_handle_reset con 0x5581cfb14800 session 0x5581ce7ae780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:56.880649+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105054208 unmapped: 27312128 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:57.880829+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 ms_handle_reset con 0x5581cea0ec00 session 0x5581ce7ae960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105062400 unmapped: 27303936 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 ms_handle_reset con 0x5581cea0f800 session 0x5581ce7aeb40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1103670 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:58.881035+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105078784 unmapped: 27287552 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:06:59.881199+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105078784 unmapped: 27287552 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:00.881329+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d12000/0x0/0x1bfc00000, data 0x28bc897/0x297b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105078784 unmapped: 27287552 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:01.881452+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105078784 unmapped: 27287552 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:02.881606+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105078784 unmapped: 27287552 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1103670 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:03.881800+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.004660606s of 10.056367874s, submitted: 12
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105111552 unmapped: 27254784 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:04.882065+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105127936 unmapped: 27238400 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 ms_handle_reset con 0x5581ceffec00 session 0x5581ce7aef00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d13000/0x0/0x1bfc00000, data 0x28bc897/0x297b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:05.882236+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105136128 unmapped: 27230208 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d13000/0x0/0x1bfc00000, data 0x28bc897/0x297b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:06.882396+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105136128 unmapped: 27230208 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:07.882660+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105136128 unmapped: 27230208 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1103515 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:08.882900+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105136128 unmapped: 27230208 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:09.883138+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105136128 unmapped: 27230208 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:10.883331+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105136128 unmapped: 27230208 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d13000/0x0/0x1bfc00000, data 0x28bc897/0x297b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:11.883500+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d13000/0x0/0x1bfc00000, data 0x28bc897/0x297b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105136128 unmapped: 27230208 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:12.883695+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105144320 unmapped: 27222016 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1103515 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:13.883932+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105144320 unmapped: 27222016 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.830871582s of 10.923226357s, submitted: 24
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:14.884074+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105291776 unmapped: 27074560 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 44
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:15.884293+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105496576 unmapped: 26869760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d0d000/0x0/0x1bfc00000, data 0x28c16cd/0x2981000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:16.884462+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105496576 unmapped: 26869760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:17.884678+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105496576 unmapped: 26869760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1107091 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:18.884951+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105504768 unmapped: 26861568 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:19.885150+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105504768 unmapped: 26861568 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:20.885490+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d07000/0x0/0x1bfc00000, data 0x28c7f7b/0x2987000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105529344 unmapped: 26836992 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 45
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:21.885716+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105496576 unmapped: 26869760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:22.885890+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105496576 unmapped: 26869760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1106065 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d05000/0x0/0x1bfc00000, data 0x28ca0e9/0x2989000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:23.886064+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105496576 unmapped: 26869760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:24.886248+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105496576 unmapped: 26869760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:25.886390+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105496576 unmapped: 26869760 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:26.886770+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.112289429s of 12.226535797s, submitted: 28
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105521152 unmapped: 26845184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:27.886965+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105521152 unmapped: 26845184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1105831 data_alloc: 184549376 data_used: 294912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:28.887171+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d04000/0x0/0x1bfc00000, data 0x28cb274/0x298a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105521152 unmapped: 26845184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:29.887646+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105521152 unmapped: 26845184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:30.887806+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105521152 unmapped: 26845184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 ms_handle_reset con 0x5581cefff800 session 0x5581ce7af2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee82c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:31.887951+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 heartbeat osd_stat(store_statfs(0x1b8d01000/0x0/0x1bfc00000, data 0x28cbd8a/0x298d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105521152 unmapped: 26845184 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 134 ms_handle_reset con 0x5581cee82c00 session 0x5581ce7af4a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:32.888223+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105562112 unmapped: 26804224 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 135 ms_handle_reset con 0x5581cea0ec00 session 0x5581cf4441e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1126649 data_alloc: 184549376 data_used: 303104
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:33.888409+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105586688 unmapped: 26779648 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:34.888572+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 136 heartbeat osd_stat(store_statfs(0x1b8cf2000/0x0/0x1bfc00000, data 0x28d2a14/0x2999000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 136 ms_handle_reset con 0x5581cea0f800 session 0x5581ce7afa40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105594880 unmapped: 26771456 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:35.888722+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105619456 unmapped: 26746880 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:36.888889+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.810070992s of 10.005300522s, submitted: 47
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cefff800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105799680 unmapped: 26566656 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 137 ms_handle_reset con 0x5581ceffec00 session 0x5581ce7afc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:37.889040+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae3c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 137 ms_handle_reset con 0x5581d1ae3c00 session 0x5581ce489860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105897984 unmapped: 26468352 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 138 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce6dab40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:38.889212+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1143592 data_alloc: 184549376 data_used: 311296
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 105988096 unmapped: 26378240 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 139 ms_handle_reset con 0x5581ce4a1000 session 0x5581cfe6cb40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 139 heartbeat osd_stat(store_statfs(0x1b8ce7000/0x0/0x1bfc00000, data 0x28db130/0x29a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 139 ms_handle_reset con 0x5581cea0ec00 session 0x5581ccdd94a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:39.889363+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106004480 unmapped: 26361856 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:40.889520+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106053632 unmapped: 26312704 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:41.889684+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 140 ms_handle_reset con 0x5581cea0f800 session 0x5581ce080f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 140 heartbeat osd_stat(store_statfs(0x1b8cdc000/0x0/0x1bfc00000, data 0x28e48fa/0x29b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106094592 unmapped: 26271744 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:42.889880+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106119168 unmapped: 26247168 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 141 ms_handle_reset con 0x5581ceffec00 session 0x5581cc127680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:43.890034+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1152671 data_alloc: 184549376 data_used: 335872
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae3c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106119168 unmapped: 26247168 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 142 ms_handle_reset con 0x5581d1ae3c00 session 0x5581cfe6c780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:44.890208+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106176512 unmapped: 26189824 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 143 ms_handle_reset con 0x5581ce4a1000 session 0x5581cea47e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:45.890342+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 143 handle_osd_map epochs [142,143], i have 143, src has [1,143]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106299392 unmapped: 26066944 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 144 ms_handle_reset con 0x5581cea0ec00 session 0x5581cea47c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:46.890560+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.308780670s of 10.003767014s, submitted: 231
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106397696 unmapped: 25968640 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 145 ms_handle_reset con 0x5581cea0f800 session 0x5581ce095e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 145 heartbeat osd_stat(store_statfs(0x1b8cc6000/0x0/0x1bfc00000, data 0x28f4550/0x29c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:47.890792+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106438656 unmapped: 25927680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:48.891371+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1161927 data_alloc: 184549376 data_used: 335872
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106438656 unmapped: 25927680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:49.891541+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 145 heartbeat osd_stat(store_statfs(0x1b8cc6000/0x0/0x1bfc00000, data 0x28f4550/0x29c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106438656 unmapped: 25927680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:50.891758+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106496000 unmapped: 25870336 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:51.891908+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106512384 unmapped: 25853952 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:52.892059+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106176512 unmapped: 26189824 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:53.892205+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1175584 data_alloc: 184549376 data_used: 348160
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 148 ms_handle_reset con 0x5581ceffec00 session 0x5581cfaf7a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106233856 unmapped: 26132480 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae3c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 149 ms_handle_reset con 0x5581d1ae3c00 session 0x5581ce0954a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:54.892491+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106266624 unmapped: 26099712 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 149 heartbeat osd_stat(store_statfs(0x1b8ca5000/0x0/0x1bfc00000, data 0x290d40d/0x29e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:55.892666+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 150 heartbeat osd_stat(store_statfs(0x1b8ca5000/0x0/0x1bfc00000, data 0x290e640/0x29e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106307584 unmapped: 26058752 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 150 ms_handle_reset con 0x5581ce4a1000 session 0x5581cfe6c3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:56.892841+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.475079536s of 10.002815247s, submitted: 155
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 150 ms_handle_reset con 0x5581cea0ec00 session 0x5581cce8ef00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 106463232 unmapped: 25903104 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:57.893000+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 107511808 unmapped: 24854528 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:58.893153+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1182364 data_alloc: 184549376 data_used: 352256
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 107528192 unmapped: 24838144 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:07:59.893293+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 107560960 unmapped: 24805376 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:00.893490+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 151 heartbeat osd_stat(store_statfs(0x1b8c93000/0x0/0x1bfc00000, data 0x2922dfb/0x29fb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108625920 unmapped: 23740416 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:01.893713+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 151 ms_handle_reset con 0x5581cea0f800 session 0x5581cd399a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 151 heartbeat osd_stat(store_statfs(0x1b8c8d000/0x0/0x1bfc00000, data 0x29250dd/0x2a00000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108625920 unmapped: 23740416 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:02.893824+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108625920 unmapped: 23740416 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:03.893964+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1194869 data_alloc: 184549376 data_used: 364544
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108691456 unmapped: 23674880 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae3800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:04.894104+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 152 ms_handle_reset con 0x5581d1ae3800 session 0x5581ceedb860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 46
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108888064 unmapped: 23478272 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:05.894224+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b8c75000/0x0/0x1bfc00000, data 0x293b3c5/0x2a18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108888064 unmapped: 23478272 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:06.894376+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b8c77000/0x0/0x1bfc00000, data 0x293b2b4/0x2a17000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.692279816s of 10.002397537s, submitted: 114
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108986368 unmapped: 23379968 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:07.894737+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108658688 unmapped: 23707648 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:08.894927+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1196055 data_alloc: 184549376 data_used: 364544
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108658688 unmapped: 23707648 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:09.895042+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b8c61000/0x0/0x1bfc00000, data 0x29517ec/0x2a2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108658688 unmapped: 23707648 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:10.895175+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108675072 unmapped: 23691264 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:11.895332+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108158976 unmapped: 24207360 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:12.895503+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108158976 unmapped: 24207360 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:13.895714+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1197117 data_alloc: 184549376 data_used: 380928
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108158976 unmapped: 24207360 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:14.895907+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b8c50000/0x0/0x1bfc00000, data 0x2962768/0x2a3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108158976 unmapped: 24207360 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:15.896089+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108158976 unmapped: 24207360 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:16.896261+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.831487656s of 10.001299858s, submitted: 49
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108183552 unmapped: 24182784 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:17.896403+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b8c4c000/0x0/0x1bfc00000, data 0x2966d1f/0x2a42000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108183552 unmapped: 24182784 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:18.896598+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1197219 data_alloc: 184549376 data_used: 380928
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108183552 unmapped: 24182784 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:19.896792+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108183552 unmapped: 24182784 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:20.896967+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108183552 unmapped: 24182784 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:21.897888+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b8c4c000/0x0/0x1bfc00000, data 0x2966d1f/0x2a42000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108183552 unmapped: 24182784 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:22.898087+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108183552 unmapped: 24182784 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:23.898256+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1193319 data_alloc: 184549376 data_used: 380928
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108191744 unmapped: 24174592 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:24.898425+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108191744 unmapped: 24174592 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:25.898726+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b8c4a000/0x0/0x1bfc00000, data 0x2968e93/0x2a44000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108191744 unmapped: 24174592 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:26.898863+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.575002670s of 10.589783669s, submitted: 3
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108191744 unmapped: 24174592 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:27.899035+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108191744 unmapped: 24174592 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:28.899235+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1193999 data_alloc: 184549376 data_used: 380928
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b8c43000/0x0/0x1bfc00000, data 0x296f81d/0x2a4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108191744 unmapped: 24174592 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:29.899395+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108191744 unmapped: 24174592 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:30.899938+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108199936 unmapped: 24166400 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:31.900116+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108199936 unmapped: 24166400 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:32.900333+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b8c36000/0x0/0x1bfc00000, data 0x297b410/0x2a58000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108199936 unmapped: 24166400 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:33.900674+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1197703 data_alloc: 184549376 data_used: 380928
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108199936 unmapped: 24166400 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:34.900898+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108199936 unmapped: 24166400 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:35.901141+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108208128 unmapped: 24158208 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:36.901349+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108224512 unmapped: 24141824 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:37.901477+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b8c27000/0x0/0x1bfc00000, data 0x298bcd1/0x2a67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 153 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.211355209s of 10.350407600s, submitted: 28
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108232704 unmapped: 24133632 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:38.901803+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1201075 data_alloc: 184549376 data_used: 389120
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108232704 unmapped: 24133632 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:39.902084+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108232704 unmapped: 24133632 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae3400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 154 ms_handle_reset con 0x5581d1ae3400 session 0x5581ceedbc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:40.902325+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108240896 unmapped: 24125440 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:41.902487+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108249088 unmapped: 24117248 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:42.902747+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 154 heartbeat osd_stat(store_statfs(0x1b8c21000/0x0/0x1bfc00000, data 0x298ec66/0x2a6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 154 ms_handle_reset con 0x5581ce4a1000 session 0x5581ceedbe00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108249088 unmapped: 24117248 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:43.902925+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1202165 data_alloc: 184549376 data_used: 389120
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108249088 unmapped: 24117248 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:44.903068+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 154 ms_handle_reset con 0x5581cea0ec00 session 0x5581cfe6c3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108249088 unmapped: 24117248 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:45.903319+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 154 heartbeat osd_stat(store_statfs(0x1b8c21000/0x0/0x1bfc00000, data 0x298ec66/0x2a6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae3800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581cea0f800 session 0x5581cc4f23c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581d1ae3800 session 0x5581cea47c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae3000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae2c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108339200 unmapped: 24027136 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:46.903516+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581d1ae3000 session 0x5581cc3ab2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581d1ae2c00 session 0x5581ccdd94a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 23945216 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:47.903777+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 23945216 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:48.904055+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1206306 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 23945216 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:49.904237+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.088663101s of 12.351757050s, submitted: 89
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581ce4a1000 session 0x5581d06705a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 23945216 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8c1e000/0x0/0x1bfc00000, data 0x2990d5c/0x2a70000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:50.904412+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108421120 unmapped: 23945216 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581cea0ec00 session 0x5581d0670960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:51.904598+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8c1f000/0x0/0x1bfc00000, data 0x2990cfa/0x2a6f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108429312 unmapped: 23937024 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:52.904816+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108429312 unmapped: 23937024 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:53.904991+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1204729 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108429312 unmapped: 23937024 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:54.905764+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8c1f000/0x0/0x1bfc00000, data 0x2990cfa/0x2a6f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108429312 unmapped: 23937024 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:55.905931+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108437504 unmapped: 23928832 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:56.906121+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108437504 unmapped: 23928832 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8c17000/0x0/0x1bfc00000, data 0x2998530/0x2a77000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:57.906282+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108437504 unmapped: 23928832 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:58.906441+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1205449 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 108437504 unmapped: 23928832 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:08:59.906649+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8c17000/0x0/0x1bfc00000, data 0x2998530/0x2a77000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.107388496s of 10.208693504s, submitted: 25
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109559808 unmapped: 22806528 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:00.906798+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109559808 unmapped: 22806528 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:01.906949+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109559808 unmapped: 22806528 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:02.907144+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581cea0f800 session 0x5581d0670b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae3800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109568000 unmapped: 22798336 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581d1ae3800 session 0x5581d0670f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:03.907232+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8c0a000/0x0/0x1bfc00000, data 0x29a572b/0x2a84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1207913 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109600768 unmapped: 22765568 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:04.907400+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109600768 unmapped: 22765568 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:05.907587+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109600768 unmapped: 22765568 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:06.907953+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109633536 unmapped: 22732800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581ce4a1000 session 0x5581d06710e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:07.908122+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8bf3000/0x0/0x1bfc00000, data 0x29bceec/0x2a9b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109633536 unmapped: 22732800 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:08.908303+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1211261 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 47
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109674496 unmapped: 22691840 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:09.908412+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581cea0ec00 session 0x5581d0312780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581cea0f800 session 0x5581d06712c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae2c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.840666771s of 10.122661591s, submitted: 69
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109486080 unmapped: 22880256 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581d1ae2c00 session 0x5581d0671680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:10.908549+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae2800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581d1ae2800 session 0x5581d0671a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b82b1000/0x0/0x1bfc00000, data 0x32fd3cd/0x33dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109502464 unmapped: 22863872 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:11.908704+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581ce4a1000 session 0x5581d0671c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581cea0ec00 session 0x5581d0671e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109420544 unmapped: 22945792 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:12.942102+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109420544 unmapped: 22945792 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:13.942250+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1218253 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109420544 unmapped: 22945792 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:14.942409+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109420544 unmapped: 22945792 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:15.942570+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109420544 unmapped: 22945792 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:16.942749+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8bda000/0x0/0x1bfc00000, data 0x29d5469/0x2ab4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109510656 unmapped: 22855680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:17.942881+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:18.943052+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109510656 unmapped: 22855680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1217661 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:19.943212+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109510656 unmapped: 22855680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:20.943388+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109510656 unmapped: 22855680 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.050721169s of 10.379554749s, submitted: 75
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:21.943551+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109527040 unmapped: 22839296 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:22.943685+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109543424 unmapped: 22822912 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8bc0000/0x0/0x1bfc00000, data 0x29edac2/0x2acd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:23.944776+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109543424 unmapped: 22822912 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1222269 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:24.944962+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109543424 unmapped: 22822912 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:25.945168+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109543424 unmapped: 22822912 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:26.945347+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 109584384 unmapped: 22781952 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:27.946509+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110665728 unmapped: 21700608 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8ba1000/0x0/0x1bfc00000, data 0x2a0e15f/0x2aed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:28.948382+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110665728 unmapped: 21700608 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1222003 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:29.948565+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110673920 unmapped: 21692416 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:30.948801+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110755840 unmapped: 21610496 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.728881836s of 10.003469467s, submitted: 65
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8793000/0x0/0x1bfc00000, data 0x2a1ceeb/0x2afb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:31.949076+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110772224 unmapped: 21594112 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:32.950754+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110813184 unmapped: 21553152 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581cea0f800 session 0x5581d06705a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:33.951007+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110911488 unmapped: 21454848 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1226407 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:34.951257+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110927872 unmapped: 21438464 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:35.951531+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae2c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 110952448 unmapped: 21413888 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8766000/0x0/0x1bfc00000, data 0x2a4826c/0x2b27000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581d1ae2c00 session 0x5581d0670960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:36.951859+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 ms_handle_reset con 0x5581ceffec00 session 0x5581cea470e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111484928 unmapped: 20881408 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b8765000/0x0/0x1bfc00000, data 0x2a48a1e/0x2b28000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:37.952061+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111501312 unmapped: 20865024 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:38.952350+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111501312 unmapped: 20865024 heap: 132366336 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 48
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1332891 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b7f5d000/0x0/0x1bfc00000, data 0x3251ced/0x3331000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:39.952683+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111575040 unmapped: 29188096 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:40.953040+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111575040 unmapped: 29188096 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.854370117s of 10.221952438s, submitted: 242
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:41.953460+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b674f000/0x0/0x1bfc00000, data 0x4a601b0/0x4b3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111599616 unmapped: 29163520 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:42.953722+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111607808 unmapped: 29155328 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:43.953985+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111583232 unmapped: 29179904 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1607451 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:44.954317+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 120504320 unmapped: 20258816 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:45.954478+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111730688 unmapped: 29032448 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b3740000/0x0/0x1bfc00000, data 0x7a6f8c8/0x7b4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:46.954729+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111779840 unmapped: 28983296 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:47.954883+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111796224 unmapped: 28966912 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:48.955067+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 111902720 unmapped: 28860416 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2070723 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:49.955300+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 112033792 unmapped: 28729344 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1b072c000/0x0/0x1bfc00000, data 0xaa8282c/0xab62000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:50.955476+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 121593856 unmapped: 19169280 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.219146729s of 10.104035378s, submitted: 67
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:51.955682+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 114343936 unmapped: 26419200 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:52.955937+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122822656 unmapped: 17940480 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:53.956115+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 114483200 unmapped: 26279936 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2456745 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:54.956324+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 114565120 unmapped: 26198016 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1ad70f000/0x0/0x1bfc00000, data 0xda9fc4b/0xdb7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:55.956512+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123011072 unmapped: 17752064 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:56.956737+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 114647040 unmapped: 26116096 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:57.956922+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 114835456 unmapped: 25927680 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:58.957157+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 114835456 unmapped: 25927680 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2727347 data_alloc: 184549376 data_used: 401408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:09:59.957335+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 114909184 unmapped: 25853952 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 heartbeat osd_stat(store_statfs(0x1a9d5b000/0x0/0x1bfc00000, data 0x102b3a64/0x10393000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:00.957470+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 114925568 unmapped: 25837568 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.470504761s of 10.174757957s, submitted: 62
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:01.957603+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 115163136 unmapped: 25600000 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:02.958419+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 115236864 unmapped: 25526272 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:03.958559+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 115277824 unmapped: 25485312 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2887575 data_alloc: 184549376 data_used: 409600
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:04.958702+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 156 heartbeat osd_stat(store_statfs(0x1a852b000/0x0/0x1bfc00000, data 0x11ae03fc/0x11bc2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123748352 unmapped: 17014784 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:05.958853+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 156 heartbeat osd_stat(store_statfs(0x1a751e000/0x0/0x1bfc00000, data 0x12aee0f9/0x12bd0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 115458048 unmapped: 25305088 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:06.958992+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 115531776 unmapped: 25231360 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:07.959113+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 16760832 heap: 140763136 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:08.959271+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 115761152 unmapped: 33398784 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 3220847 data_alloc: 184549376 data_used: 409600
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:09.959427+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 124289024 unmapped: 24870912 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 156 heartbeat osd_stat(store_statfs(0x1a44f2000/0x0/0x1bfc00000, data 0x15b1b4c9/0x15bfc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:10.959721+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 115974144 unmapped: 33185792 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.955376625s of 10.036594391s, submitted: 97
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:11.959910+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 117096448 unmapped: 32063488 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 157 heartbeat osd_stat(store_statfs(0x1a2ce7000/0x0/0x1bfc00000, data 0x1732387a/0x17406000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:12.960035+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125509632 unmapped: 23650304 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:13.960189+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 117219328 unmapped: 31940608 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 3608675 data_alloc: 184549376 data_used: 421888
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:14.960337+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 117350400 unmapped: 31809536 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:15.960496+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 126017536 unmapped: 23142400 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 158 handle_osd_map epochs [158,159], i have 158, src has [1,159]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 158 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:16.960679+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 159 heartbeat osd_stat(store_statfs(0x19f49e000/0x0/0x1bfc00000, data 0x1ab67d34/0x1ac4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 126279680 unmapped: 22880256 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:17.960869+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 160 heartbeat osd_stat(store_statfs(0x19dc95000/0x0/0x1bfc00000, data 0x1c36e679/0x1c456000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 118013952 unmapped: 31145984 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 160 handle_osd_map epochs [160,161], i have 160, src has [1,161]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 161 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce0952c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:18.961053+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 118087680 unmapped: 31072256 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 4060469 data_alloc: 184549376 data_used: 434176
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:19.961244+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 118104064 unmapped: 31055872 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:20.961416+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 161 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 161 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 162 ms_handle_reset con 0x5581cea0ec00 session 0x5581cd376960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 118243328 unmapped: 30916608 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:21.961580+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.305226326s of 10.413676262s, submitted: 212
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 163 ms_handle_reset con 0x5581cea0f800 session 0x5581d05423c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 117997568 unmapped: 31162368 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:22.961684+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 163 ms_handle_reset con 0x5581ceffec00 session 0x5581d0670f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 163 heartbeat osd_stat(store_statfs(0x1b746c000/0x0/0x1bfc00000, data 0x2b94cde/0x2c81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 118005760 unmapped: 31154176 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:23.961807+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae2c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 163 ms_handle_reset con 0x5581d1ae2c00 session 0x5581d0543680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 163 heartbeat osd_stat(store_statfs(0x1b7442000/0x0/0x1bfc00000, data 0x2bbc4ed/0x2cab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 163 ms_handle_reset con 0x5581ce4a1000 session 0x5581d06712c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 118030336 unmapped: 31129600 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1345820 data_alloc: 184549376 data_used: 434176
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:24.961959+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 163 heartbeat osd_stat(store_statfs(0x1b7443000/0x0/0x1bfc00000, data 0x2bbc417/0x2ca9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 118136832 unmapped: 31023104 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:25.962109+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 118136832 unmapped: 31023104 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:26.962275+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 164 ms_handle_reset con 0x5581cea0ec00 session 0x5581ce7af0e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 119226368 unmapped: 29933568 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:27.962430+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 ms_handle_reset con 0x5581cea0f800 session 0x5581d07f8780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 119275520 unmapped: 29884416 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:28.962625+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 heartbeat osd_stat(store_statfs(0x1b7418000/0x0/0x1bfc00000, data 0x2be4ab5/0x2cd5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 119275520 unmapped: 29884416 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 ms_handle_reset con 0x5581ceffec00 session 0x5581cd398000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb14000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1349078 data_alloc: 184549376 data_used: 434176
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 ms_handle_reset con 0x5581cfb14000 session 0x5581d0312d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:29.963814+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 119291904 unmapped: 29868032 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:30.963968+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 119422976 unmapped: 29736960 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:31.964141+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.500541687s of 10.031413078s, submitted: 172
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 119422976 unmapped: 29736960 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 ms_handle_reset con 0x5581ce4a1000 session 0x5581d10a5680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:32.964308+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 119447552 unmapped: 29712384 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:33.964521+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 ms_handle_reset con 0x5581cea0ec00 session 0x5581d10a5a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 119898112 unmapped: 29261824 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1360098 data_alloc: 184549376 data_used: 434176
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:34.964702+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 heartbeat osd_stat(store_statfs(0x1b6220000/0x0/0x1bfc00000, data 0x2c3d4b3/0x2d2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 120946688 unmapped: 28213248 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 heartbeat osd_stat(store_statfs(0x1b6220000/0x0/0x1bfc00000, data 0x2c3d4b3/0x2d2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:35.964869+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 ms_handle_reset con 0x5581cfb8b800 session 0x5581d10a5c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 120930304 unmapped: 28229632 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 166 ms_handle_reset con 0x5581cfb8b400 session 0x5581d0546b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:36.965138+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 120954880 unmapped: 28205056 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 166 handle_osd_map epochs [166,167], i have 166, src has [1,167]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:37.965305+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122052608 unmapped: 27107328 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:38.965543+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122093568 unmapped: 27066368 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 49
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1377408 data_alloc: 184549376 data_used: 446464
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 167 heartbeat osd_stat(store_statfs(0x1b61f2000/0x0/0x1bfc00000, data 0x2c63ca7/0x2d59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 167 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 167 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 167 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:39.965710+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122044416 unmapped: 27115520 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 168 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8ac00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 169 ms_handle_reset con 0x5581cfb8b000 session 0x5581d30f21e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 169 ms_handle_reset con 0x5581cfb8ac00 session 0x5581d0671e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:40.965871+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122085376 unmapped: 27074560 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 170 ms_handle_reset con 0x5581ce4a1000 session 0x5581d0545e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:41.966101+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.811440468s of 10.000844002s, submitted: 211
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 121462784 unmapped: 27697152 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 171 ms_handle_reset con 0x5581cea0ec00 session 0x5581d05472c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:42.966473+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 171 heartbeat osd_stat(store_statfs(0x1b61a3000/0x0/0x1bfc00000, data 0x2cae2cf/0x2daa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 121634816 unmapped: 27525120 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:43.966696+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 121651200 unmapped: 27508736 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1390907 data_alloc: 184549376 data_used: 450560
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:44.966862+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 172 ms_handle_reset con 0x5581cfb8b800 session 0x5581d0547680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 172 ms_handle_reset con 0x5581cfb8b400 session 0x5581d30f2b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122789888 unmapped: 26370048 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 172 heartbeat osd_stat(store_statfs(0x1b616c000/0x0/0x1bfc00000, data 0x2ce3052/0x2dde000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 50
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:45.967069+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122798080 unmapped: 26361856 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:46.967274+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122773504 unmapped: 26386432 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:47.967433+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 174 ms_handle_reset con 0x5581ce4a1000 session 0x5581cf84c000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122798080 unmapped: 26361856 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:48.967637+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122798080 unmapped: 26361856 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 174 heartbeat osd_stat(store_statfs(0x1b6169000/0x0/0x1bfc00000, data 0x2ce7816/0x2de3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1396499 data_alloc: 184549376 data_used: 446464
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:49.967792+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122822656 unmapped: 26337280 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 174 heartbeat osd_stat(store_statfs(0x1b6154000/0x0/0x1bfc00000, data 0x2cfe853/0x2dfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:50.967965+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122839040 unmapped: 26320896 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:51.968132+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.539497375s of 10.001732826s, submitted: 189
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122880000 unmapped: 26279936 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:52.968299+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 175 heartbeat osd_stat(store_statfs(0x1b613b000/0x0/0x1bfc00000, data 0x2d14b87/0x2e12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122912768 unmapped: 26247168 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:53.968472+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122912768 unmapped: 26247168 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1403465 data_alloc: 184549376 data_used: 458752
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:54.968674+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122912768 unmapped: 26247168 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:55.968860+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 175 heartbeat osd_stat(store_statfs(0x1b613b000/0x0/0x1bfc00000, data 0x2d14b87/0x2e12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 175 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122937344 unmapped: 26222592 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:56.968970+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122945536 unmapped: 26214400 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:57.969105+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122945536 unmapped: 26214400 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:58.969308+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 heartbeat osd_stat(store_statfs(0x1b6138000/0x0/0x1bfc00000, data 0x2d16c7d/0x2e15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1403879 data_alloc: 184549376 data_used: 458752
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:10:59.969526+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:00.969703+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:01.969875+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.925263405s of 10.007609367s, submitted: 26
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:02.970028+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 ms_handle_reset con 0x5581cfb8b400 session 0x5581cc16c960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:03.970161+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 ms_handle_reset con 0x5581cfb8b800 session 0x5581ceedb4a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1405071 data_alloc: 184549376 data_used: 458752
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:04.970337+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 heartbeat osd_stat(store_statfs(0x1b6137000/0x0/0x1bfc00000, data 0x2d16d45/0x2e16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:05.970503+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:06.970671+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 122855424 unmapped: 26304512 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 ms_handle_reset con 0x5581cfb8a800 session 0x5581d07f90e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:07.970808+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3100000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 ms_handle_reset con 0x5581d3100000 session 0x5581cea47680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123002880 unmapped: 26157056 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:08.970986+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123002880 unmapped: 26157056 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1464258 data_alloc: 184549376 data_used: 458752
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:09.971152+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 heartbeat osd_stat(store_statfs(0x1b5a20000/0x0/0x1bfc00000, data 0x342ed43/0x352e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123002880 unmapped: 26157056 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 ms_handle_reset con 0x5581ce4a1000 session 0x5581cd377a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:10.971308+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123027456 unmapped: 26132480 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:11.971460+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.240368843s of 10.487688065s, submitted: 54
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123043840 unmapped: 26116096 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:12.971635+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 177 heartbeat osd_stat(store_statfs(0x1b5a16000/0x0/0x1bfc00000, data 0x3431152/0x3536000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 177 ms_handle_reset con 0x5581cfb8a800 session 0x5581d10a52c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123109376 unmapped: 26050560 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:13.971862+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 26042368 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1475522 data_alloc: 184549376 data_used: 466944
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:14.972025+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 177 ms_handle_reset con 0x5581cfb8b800 session 0x5581cd3994a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 26042368 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8ac00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 178 ms_handle_reset con 0x5581cfb8ac00 session 0x5581d0313a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:15.972173+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123174912 unmapped: 25985024 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 178 handle_osd_map epochs [178,179], i have 178, src has [1,179]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 179 ms_handle_reset con 0x5581cfb8a400 session 0x5581d06705a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:16.972346+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 179 ms_handle_reset con 0x5581cfb8b400 session 0x5581d06703c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 179 ms_handle_reset con 0x5581cfb8a400 session 0x5581cf84c3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123199488 unmapped: 25960448 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:17.972490+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123207680 unmapped: 25952256 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 179 heartbeat osd_stat(store_statfs(0x1b5a0f000/0x0/0x1bfc00000, data 0x343556d/0x353d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:18.972729+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123240448 unmapped: 25919488 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 ms_handle_reset con 0x5581cfb8a800 session 0x5581cce8e3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1496653 data_alloc: 184549376 data_used: 483328
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 ms_handle_reset con 0x5581ce4a1000 session 0x5581d30f25a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:19.972926+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123240448 unmapped: 25919488 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:20.973082+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8ac00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 ms_handle_reset con 0x5581cfb8ac00 session 0x5581d0312000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 ms_handle_reset con 0x5581ce4a1000 session 0x5581cee7d860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123248640 unmapped: 25911296 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:21.973243+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.691867828s of 10.025359154s, submitted: 83
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123256832 unmapped: 25903104 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:22.973408+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 ms_handle_reset con 0x5581cfb8a800 session 0x5581cce8f4a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 ms_handle_reset con 0x5581cfb8a400 session 0x5581ce080b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 ms_handle_reset con 0x5581cfb8b800 session 0x5581cfb201e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123265024 unmapped: 25894912 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0620400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 ms_handle_reset con 0x5581d0620400 session 0x5581ceedbc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:23.973553+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123248640 unmapped: 25911296 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 heartbeat osd_stat(store_statfs(0x1b5a0c000/0x0/0x1bfc00000, data 0x343779c/0x3542000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 ms_handle_reset con 0x5581ce4a1000 session 0x5581cee7c1e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1501775 data_alloc: 184549376 data_used: 495616
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:24.973750+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 ms_handle_reset con 0x5581cfb8b400 session 0x5581ce094d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123248640 unmapped: 25911296 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:25.973952+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 heartbeat osd_stat(store_statfs(0x1b5a08000/0x0/0x1bfc00000, data 0x34399ac/0x3545000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123281408 unmapped: 25878528 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:26.974110+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123281408 unmapped: 25878528 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 ms_handle_reset con 0x5581cfb8a400 session 0x5581d30f2780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 ms_handle_reset con 0x5581cfb8a800 session 0x5581cf4443c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:27.974257+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 ms_handle_reset con 0x5581cfb8b800 session 0x5581ce7ae000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123305984 unmapped: 25853952 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 heartbeat osd_stat(store_statfs(0x1b5a08000/0x0/0x1bfc00000, data 0x3439a0e/0x3546000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6caf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 181 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:28.974478+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 182 ms_handle_reset con 0x5581cfb8a400 session 0x5581cf445a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123330560 unmapped: 25829376 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1511940 data_alloc: 184549376 data_used: 499712
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:29.974666+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 183 ms_handle_reset con 0x5581ce4a1000 session 0x5581ceedb680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123420672 unmapped: 25739264 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:30.974804+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 183 ms_handle_reset con 0x5581cfb8a800 session 0x5581cc16dc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123437056 unmapped: 25722880 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:31.974970+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 183 heartbeat osd_stat(store_statfs(0x1b5604000/0x0/0x1bfc00000, data 0x343ddbc/0x354a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 ms_handle_reset con 0x5581cfb8b400 session 0x5581ce6daf00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf7ca800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.347510338s of 10.001162529s, submitted: 169
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123510784 unmapped: 25649152 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 ms_handle_reset con 0x5581cf7ca800 session 0x5581d0546960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:32.975118+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 ms_handle_reset con 0x5581ce4a1000 session 0x5581cfb210e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 ms_handle_reset con 0x5581cfb8a400 session 0x5581ce022f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123527168 unmapped: 25632768 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:33.975280+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123527168 unmapped: 25632768 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1514290 data_alloc: 184549376 data_used: 507904
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:34.975522+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123527168 unmapped: 25632768 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 heartbeat osd_stat(store_statfs(0x1b55fe000/0x0/0x1bfc00000, data 0x344005e/0x354f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:35.975708+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 heartbeat osd_stat(store_statfs(0x1b55fd000/0x0/0x1bfc00000, data 0x344006e/0x3550000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 handle_osd_map epochs [186,186], i have 184, src has [1,186]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 184 handle_osd_map epochs [185,186], i have 184, src has [1,186]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 186 ms_handle_reset con 0x5581cfb8a800 session 0x5581cfe6c780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 123584512 unmapped: 25575424 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:36.975859+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 186 handle_osd_map epochs [185,186], i have 186, src has [1,186]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 186 ms_handle_reset con 0x5581cfb8b400 session 0x5581cfaf72c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 186 handle_osd_map epochs [185,186], i have 186, src has [1,186]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ceffe800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 186 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 186 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 186 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133586944 unmapped: 15572992 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 187 ms_handle_reset con 0x5581ceffe800 session 0x5581ce081860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:37.975996+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125493248 unmapped: 23666688 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:38.976162+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b487a000/0x0/0x1bfc00000, data 0x41bd684/0x42d3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 187 handle_osd_map epochs [187,188], i have 187, src has [1,188]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 188 heartbeat osd_stat(store_statfs(0x1b4879000/0x0/0x1bfc00000, data 0x41bd704/0x42d3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125591552 unmapped: 23568384 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 188 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce6da960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 188 ms_handle_reset con 0x5581cfb8a400 session 0x5581ce7afc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1632398 data_alloc: 184549376 data_used: 540672
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:39.976307+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125509632 unmapped: 23650304 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 188 ms_handle_reset con 0x5581cfb8a800 session 0x5581cf4cc3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:40.976497+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 188 heartbeat osd_stat(store_statfs(0x1b55f0000/0x0/0x1bfc00000, data 0x3448814/0x355d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 188 handle_osd_map epochs [188,189], i have 188, src has [1,189]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125517824 unmapped: 23642112 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:41.976677+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 189 ms_handle_reset con 0x5581cfb8b400 session 0x5581cc5b21e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125509632 unmapped: 23650304 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.566718102s of 10.271141052s, submitted: 209
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:42.976839+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125509632 unmapped: 23650304 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:43.976992+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125517824 unmapped: 23642112 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1542325 data_alloc: 184549376 data_used: 565248
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:44.977148+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125526016 unmapped: 23633920 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:45.977319+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 190 heartbeat osd_stat(store_statfs(0x1b55ea000/0x0/0x1bfc00000, data 0x344cc06/0x3563000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1598c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 190 ms_handle_reset con 0x5581d1598c00 session 0x5581d0547a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125558784 unmapped: 23601152 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:46.977493+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 191 heartbeat osd_stat(store_statfs(0x1b65c4000/0x0/0x1bfc00000, data 0x344f03b/0x3569000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125575168 unmapped: 23584768 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 191 ms_handle_reset con 0x5581ce4a1000 session 0x5581d10a2000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:47.977717+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125575168 unmapped: 23584768 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:48.977922+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125575168 unmapped: 23584768 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:49.978111+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1549071 data_alloc: 184549376 data_used: 577536
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125575168 unmapped: 23584768 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:50.978277+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 191 heartbeat osd_stat(store_statfs(0x1b65c5000/0x0/0x1bfc00000, data 0x344f03b/0x3569000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125575168 unmapped: 23584768 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:51.978458+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125583360 unmapped: 23576576 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:52.979250+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.884938240s of 10.213072777s, submitted: 117
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 192 ms_handle_reset con 0x5581cfb8a400 session 0x5581cd398f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125591552 unmapped: 23568384 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:53.979406+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125624320 unmapped: 23535616 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 193 ms_handle_reset con 0x5581cfb8a800 session 0x5581d30f32c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:54.979570+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1562361 data_alloc: 184549376 data_used: 589824
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125640704 unmapped: 23519232 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:55.979714+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 194 heartbeat osd_stat(store_statfs(0x1b65bc000/0x0/0x1bfc00000, data 0x34537c9/0x3572000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125673472 unmapped: 23486464 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:56.979880+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 195 ms_handle_reset con 0x5581cfb8b400 session 0x5581cc5b23c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 125698048 unmapped: 23461888 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:57.980057+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb14400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 126754816 unmapped: 22405120 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:58.980310+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 197 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 197 handle_osd_map epochs [196,197], i have 197, src has [1,197]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 197 heartbeat osd_stat(store_statfs(0x1b65aa000/0x0/0x1bfc00000, data 0x345c188/0x3582000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 197 ms_handle_reset con 0x5581cfb14400 session 0x5581ce7aeb40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 126787584 unmapped: 22372352 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:11:59.980485+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1579046 data_alloc: 184549376 data_used: 598016
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 198 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce489680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 126836736 unmapped: 22323200 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:00.980669+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 198 ms_handle_reset con 0x5581cfb8a800 session 0x5581ce095a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 198 ms_handle_reset con 0x5581cfb8a400 session 0x5581ce6d9a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127942656 unmapped: 21217280 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:01.980827+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127959040 unmapped: 21200896 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:02.980985+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 200 heartbeat osd_stat(store_statfs(0x1b659f000/0x0/0x1bfc00000, data 0x34623e9/0x358c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.975617409s of 10.716296196s, submitted: 230
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127959040 unmapped: 21200896 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:03.981132+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 200 ms_handle_reset con 0x5581cfb8b400 session 0x5581cce8f4a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127139840 unmapped: 22020096 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:04.981298+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1587833 data_alloc: 184549376 data_used: 618496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127139840 unmapped: 22020096 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 200 heartbeat osd_stat(store_statfs(0x1b65a0000/0x0/0x1bfc00000, data 0x3462367/0x358c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:05.981723+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 200 heartbeat osd_stat(store_statfs(0x1b65a0000/0x0/0x1bfc00000, data 0x3462367/0x358c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127148032 unmapped: 22011904 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:06.981945+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 201 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 201 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 201 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127172608 unmapped: 21987328 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:07.982107+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127180800 unmapped: 21979136 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb15c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:08.982281+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 202 ms_handle_reset con 0x5581cfb15c00 session 0x5581ccdd94a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 127188992 unmapped: 21970944 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:09.982479+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1602248 data_alloc: 184549376 data_used: 647168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 131481600 unmapped: 17678336 heap: 149159936 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:10.982592+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 202 heartbeat osd_stat(store_statfs(0x1b6597000/0x0/0x1bfc00000, data 0x3466864/0x3597000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135987200 unmapped: 17375232 heap: 153362432 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:11.983013+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133234688 unmapped: 20127744 heap: 153362432 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:12.983128+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.553852081s of 10.024692535s, submitted: 195
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137453568 unmapped: 15908864 heap: 153362432 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:13.983325+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133332992 unmapped: 20029440 heap: 153362432 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:14.988786+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2998291 data_alloc: 184549376 data_used: 647168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133439488 unmapped: 24125440 heap: 157564928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:15.988934+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 202 heartbeat osd_stat(store_statfs(0x1a7199000/0x0/0x1bfc00000, data 0x1286676d/0x12995000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133472256 unmapped: 24092672 heap: 157564928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:16.989092+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134578176 unmapped: 22986752 heap: 157564928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:17.989205+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 202 heartbeat osd_stat(store_statfs(0x1a2199000/0x0/0x1bfc00000, data 0x1786676d/0x17995000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,1,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 136912896 unmapped: 20652032 heap: 157564928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:18.989342+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:19.989475+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132775936 unmapped: 24788992 heap: 157564928 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 4356129 data_alloc: 184549376 data_used: 647168
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:20.989636+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133095424 unmapped: 28672000 heap: 161767424 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 202 ms_handle_reset con 0x5581cfb8a800 session 0x5581d0313680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:21.989775+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133308416 unmapped: 36855808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 203 ms_handle_reset con 0x5581cfb8b400 session 0x5581d0313e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 203 ms_handle_reset con 0x5581cfb8a400 session 0x5581d30f2960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:22.989922+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133701632 unmapped: 36462592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 203 ms_handle_reset con 0x5581ce4a1000 session 0x5581ccdd8b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 203 ms_handle_reset con 0x5581d0036000 session 0x5581ceeda3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 203 ms_handle_reset con 0x5581ce4a1000 session 0x5581ceedbc20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 203 handle_osd_map epochs [203,204], i have 203, src has [1,204]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 6.422959805s of 10.002857208s, submitted: 352
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:23.990090+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 133816320 unmapped: 36347904 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 204 heartbeat osd_stat(store_statfs(0x197992000/0x0/0x1bfc00000, data 0x220689b0/0x2219a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 204 ms_handle_reset con 0x5581cea0ec00 session 0x5581cf444f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 204 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 205 ms_handle_reset con 0x5581cfb8a400 session 0x5581ccdd8780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:24.990234+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132915200 unmapped: 37249024 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1748775 data_alloc: 184549376 data_used: 667648
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 205 ms_handle_reset con 0x5581cfb8a800 session 0x5581d05432c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:25.990361+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132923392 unmapped: 37240832 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:26.990528+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132939776 unmapped: 37224448 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 207 handle_osd_map epochs [206,207], i have 207, src has [1,207]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 207 heartbeat osd_stat(store_statfs(0x1b4983000/0x0/0x1bfc00000, data 0x3471314/0x35a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 207 ms_handle_reset con 0x5581cfb8b400 session 0x5581d0544960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 207 handle_osd_map epochs [207,208], i have 207, src has [1,208]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:27.990682+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132923392 unmapped: 37240832 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:28.990864+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132923392 unmapped: 37240832 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 209 ms_handle_reset con 0x5581ce4a1000 session 0x5581cfe6cd20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:29.991101+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132931584 unmapped: 37232640 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 209 ms_handle_reset con 0x5581cea0ec00 session 0x5581d07f81e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1763668 data_alloc: 184549376 data_used: 671744
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 209 heartbeat osd_stat(store_statfs(0x1b617f000/0x0/0x1bfc00000, data 0x347582f/0x35ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:30.991238+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 210 ms_handle_reset con 0x5581cfb8a400 session 0x5581ce7ae5a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132947968 unmapped: 37216256 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 211 ms_handle_reset con 0x5581d0036000 session 0x5581d30f25a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 211 ms_handle_reset con 0x5581cfb8a800 session 0x5581d0542000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:31.991363+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 132980736 unmapped: 37183488 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 211 heartbeat osd_stat(store_statfs(0x1b6176000/0x0/0x1bfc00000, data 0x3479dd9/0x35b7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:32.991469+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134037504 unmapped: 36126720 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 212 ms_handle_reset con 0x5581d0036400 session 0x5581d30f2780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.086586952s of 10.004636765s, submitted: 361
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:33.991595+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 212 ms_handle_reset con 0x5581cea0ec00 session 0x5581d10a4780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134062080 unmapped: 36102144 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 212 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce095860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 212 ms_handle_reset con 0x5581cfb8a400 session 0x5581cc1265a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 212 ms_handle_reset con 0x5581d0036000 session 0x5581d0670f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:34.991764+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134103040 unmapped: 36061184 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 213 ms_handle_reset con 0x5581ce4a1000 session 0x5581d0546f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1775963 data_alloc: 184549376 data_used: 688128
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain rsyslogd[754]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:35.991883+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134103040 unmapped: 36061184 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 213 ms_handle_reset con 0x5581cea0ec00 session 0x5581d0671a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:36.992016+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134160384 unmapped: 36003840 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 214 heartbeat osd_stat(store_statfs(0x1b616a000/0x0/0x1bfc00000, data 0x3480371/0x35c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:37.992136+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134160384 unmapped: 36003840 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:38.992325+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134160384 unmapped: 36003840 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:39.992462+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134176768 unmapped: 35987456 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1785259 data_alloc: 184549376 data_used: 684032
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:40.992659+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134176768 unmapped: 35987456 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 215 ms_handle_reset con 0x5581cfb8a400 session 0x5581cf4cd2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:41.992835+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134209536 unmapped: 35954688 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 216 ms_handle_reset con 0x5581d0036400 session 0x5581cfe6de00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 216 heartbeat osd_stat(store_statfs(0x1b6168000/0x0/0x1bfc00000, data 0x34825ff/0x35c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:42.992958+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134234112 unmapped: 35930112 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.596132278s of 10.057441711s, submitted: 143
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:43.993127+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134234112 unmapped: 35930112 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 ms_handle_reset con 0x5581d0036800 session 0x5581cc62d0e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 ms_handle_reset con 0x5581ce4a1000 session 0x5581d0312d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:44.993293+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134234112 unmapped: 35930112 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1792084 data_alloc: 184549376 data_used: 692224
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 ms_handle_reset con 0x5581cea0ec00 session 0x5581d0313860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:45.993544+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134266880 unmapped: 35897344 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:46.993703+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134266880 unmapped: 35897344 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 ms_handle_reset con 0x5581cfb8a400 session 0x5581d05454a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:47.993845+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 218 heartbeat osd_stat(store_statfs(0x1b6163000/0x0/0x1bfc00000, data 0x3486895/0x35cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134291456 unmapped: 35872768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:48.994008+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134299648 unmapped: 35864576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:49.994149+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134299648 unmapped: 35864576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1800009 data_alloc: 184549376 data_used: 704512
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:50.994311+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134299648 unmapped: 35864576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:51.994459+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134299648 unmapped: 35864576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 220 ms_handle_reset con 0x5581d0036400 session 0x5581cfaf7c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 220 heartbeat osd_stat(store_statfs(0x1b615c000/0x0/0x1bfc00000, data 0x348ace9/0x35d2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:52.994650+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134299648 unmapped: 35864576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:53.994803+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134299648 unmapped: 35864576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:54.994996+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134299648 unmapped: 35864576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1809003 data_alloc: 184549376 data_used: 716800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:55.995147+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134299648 unmapped: 35864576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.132311821s of 12.571419716s, submitted: 135
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 220 ms_handle_reset con 0x5581d0036c00 session 0x5581d06705a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 ms_handle_reset con 0x5581ce4a1000 session 0x5581d06703c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:56.995285+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134332416 unmapped: 35831808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 ms_handle_reset con 0x5581cea0ec00 session 0x5581cf444960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:57.995453+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134332416 unmapped: 35831808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 heartbeat osd_stat(store_statfs(0x1b6155000/0x0/0x1bfc00000, data 0x348eee1/0x35d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:12:59.005254+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134332416 unmapped: 35831808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:00.005405+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134332416 unmapped: 35831808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1810071 data_alloc: 184549376 data_used: 729088
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:01.005564+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134332416 unmapped: 35831808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 heartbeat osd_stat(store_statfs(0x1b6155000/0x0/0x1bfc00000, data 0x348eee1/0x35d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:02.005718+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134332416 unmapped: 35831808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:03.005882+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134332416 unmapped: 35831808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:04.006019+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134332416 unmapped: 35831808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:05.006169+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134340608 unmapped: 35823616 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1810247 data_alloc: 184549376 data_used: 729088
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:06.006298+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134340608 unmapped: 35823616 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 heartbeat osd_stat(store_statfs(0x1b6155000/0x0/0x1bfc00000, data 0x348eee1/0x35d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.086022377s of 10.290522575s, submitted: 58
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 ms_handle_reset con 0x5581cfb8a400 session 0x5581cc3b1c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:07.006413+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134365184 unmapped: 35799040 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:08.006554+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134389760 unmapped: 35774464 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 ms_handle_reset con 0x5581d0036400 session 0x5581d07f9a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:09.006697+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134389760 unmapped: 35774464 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 heartbeat osd_stat(store_statfs(0x1b6156000/0x0/0x1bfc00000, data 0x348eee1/0x35d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:10.006822+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35766272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 heartbeat osd_stat(store_statfs(0x1b6156000/0x0/0x1bfc00000, data 0x348eee1/0x35d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1810100 data_alloc: 184549376 data_used: 729088
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:11.006944+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35766272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:12.007129+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35766272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:13.007292+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35766272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:14.007461+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35766272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 heartbeat osd_stat(store_statfs(0x1b6155000/0x0/0x1bfc00000, data 0x348ef7c/0x35d9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:15.007646+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35766272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1811306 data_alloc: 184549376 data_used: 729088
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0037000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 ms_handle_reset con 0x5581d0037000 session 0x5581ce6d8f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:16.007788+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35766272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:17.007965+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.237521172s of 10.327370644s, submitted: 20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35766272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 ms_handle_reset con 0x5581ce4a1000 session 0x5581cd398780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 ms_handle_reset con 0x5581cea0ec00 session 0x5581ccdd8000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:18.008079+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135471104 unmapped: 34693120 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 222 handle_osd_map epochs [222,223], i have 222, src has [1,223]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 223 ms_handle_reset con 0x5581cfb8a400 session 0x5581cfe6d2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:19.008227+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135479296 unmapped: 34684928 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 223 heartbeat osd_stat(store_statfs(0x1b6149000/0x0/0x1bfc00000, data 0x3493476/0x35e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:20.008364+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 223 ms_handle_reset con 0x5581d0036400 session 0x5581d0671c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135479296 unmapped: 34684928 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1823771 data_alloc: 184549376 data_used: 737280
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:21.008501+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0037000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 223 ms_handle_reset con 0x5581d0037000 session 0x5581cfaf6000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135487488 unmapped: 34676736 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 224 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce4885a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 224 heartbeat osd_stat(store_statfs(0x1b6148000/0x0/0x1bfc00000, data 0x3495702/0x35e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:22.008667+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135520256 unmapped: 34643968 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 224 heartbeat osd_stat(store_statfs(0x1b6148000/0x0/0x1bfc00000, data 0x3495702/0x35e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:23.008839+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 224 ms_handle_reset con 0x5581cea0ec00 session 0x5581ce6dbe00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135536640 unmapped: 34627584 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:24.008954+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135536640 unmapped: 34627584 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 224 ms_handle_reset con 0x5581cfb8a400 session 0x5581cc16d680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:25.009077+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135536640 unmapped: 34627584 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1831027 data_alloc: 184549376 data_used: 749568
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:26.009239+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135536640 unmapped: 34627584 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:27.009419+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.364039421s of 10.005603790s, submitted: 197
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 225 ms_handle_reset con 0x5581d0036400 session 0x5581ce2a2b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135561216 unmapped: 34603008 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0037400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 225 ms_handle_reset con 0x5581d0037400 session 0x5581cfe6d0e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:28.009594+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135602176 unmapped: 34562048 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 226 heartbeat osd_stat(store_statfs(0x1b6141000/0x0/0x1bfc00000, data 0x34978ab/0x35ec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 226 ms_handle_reset con 0x5581ce4a1000 session 0x5581d0312b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:29.009839+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135651328 unmapped: 34512896 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 ms_handle_reset con 0x5581cea0ec00 session 0x5581ce080b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:30.009994+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 heartbeat osd_stat(store_statfs(0x1b6137000/0x0/0x1bfc00000, data 0x349be70/0x35f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 ms_handle_reset con 0x5581cfb8a400 session 0x5581cfe6d860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135667712 unmapped: 34496512 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1851977 data_alloc: 184549376 data_used: 774144
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0036400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0037800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 ms_handle_reset con 0x5581d0037800 session 0x5581d0546d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0037c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 ms_handle_reset con 0x5581d0036400 session 0x5581cfe6c960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 ms_handle_reset con 0x5581d0037c00 session 0x5581d10a4f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:31.010128+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135766016 unmapped: 34398208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce286000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:32.010270+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135774208 unmapped: 34390016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:33.010415+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 ms_handle_reset con 0x5581cea0ec00 session 0x5581d03130e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135774208 unmapped: 34390016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:34.010540+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 228 ms_handle_reset con 0x5581cfb8a400 session 0x5581d05461e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135782400 unmapped: 34381824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:35.010705+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135782400 unmapped: 34381824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 228 heartbeat osd_stat(store_statfs(0x1b613a000/0x0/0x1bfc00000, data 0x349df6d/0x35f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1851137 data_alloc: 184549376 data_used: 786432
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0037800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 228 ms_handle_reset con 0x5581d0037800 session 0x5581d03132c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:36.010840+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135790592 unmapped: 34373632 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:37.010988+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.326186180s of 10.008395195s, submitted: 200
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135806976 unmapped: 34357248 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 229 ms_handle_reset con 0x5581ce4a1000 session 0x5581d30f3860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:38.011124+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 230 ms_handle_reset con 0x5581cea0ec00 session 0x5581d10a41e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135847936 unmapped: 34316288 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 230 ms_handle_reset con 0x5581cfb8a400 session 0x5581d05472c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:39.011286+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135864320 unmapped: 34299904 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0037c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 230 ms_handle_reset con 0x5581d0037c00 session 0x5581d30f2f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 230 ms_handle_reset con 0x5581cc836800 session 0x5581cfb210e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 230 ms_handle_reset con 0x5581cc836800 session 0x5581cf474780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:40.011433+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135888896 unmapped: 34275328 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1863588 data_alloc: 184549376 data_used: 806912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:41.011570+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce0223c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 heartbeat osd_stat(store_statfs(0x1b612f000/0x0/0x1bfc00000, data 0x34a4569/0x35ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135921664 unmapped: 34242560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 ms_handle_reset con 0x5581cea0f000 session 0x5581cea46960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 ms_handle_reset con 0x5581cea0ec00 session 0x5581cc3aef00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:42.011670+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135938048 unmapped: 34226176 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8bc00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 ms_handle_reset con 0x5581cfb8bc00 session 0x5581cce8ed20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 heartbeat osd_stat(store_statfs(0x1b6130000/0x0/0x1bfc00000, data 0x34a4507/0x35fe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:43.011745+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135938048 unmapped: 34226176 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:44.011862+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135938048 unmapped: 34226176 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:45.012000+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135946240 unmapped: 34217984 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 ms_handle_reset con 0x5581cc836800 session 0x5581cf4cd680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1869496 data_alloc: 184549376 data_used: 806912
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 ms_handle_reset con 0x5581cea0ec00 session 0x5581cfe6da40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 ms_handle_reset con 0x5581ce4a1000 session 0x5581d0546960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:46.012125+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 ms_handle_reset con 0x5581cea0f000 session 0x5581cea472c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135774208 unmapped: 34390016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 heartbeat osd_stat(store_statfs(0x1b5dc4000/0x0/0x1bfc00000, data 0x380b6d8/0x396a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cfb8a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 232 ms_handle_reset con 0x5581cfb8a400 session 0x5581cc5b2b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:47.012265+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135774208 unmapped: 34390016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.819275856s of 10.480998039s, submitted: 146
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 232 ms_handle_reset con 0x5581cc836800 session 0x5581cfaf7680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:48.012407+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135782400 unmapped: 34381824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 233 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce081e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 233 ms_handle_reset con 0x5581cea0f000 session 0x5581cfaf6b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 233 ms_handle_reset con 0x5581cea0ec00 session 0x5581cfb21680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d0037c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:49.012559+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 233 handle_osd_map epochs [233,234], i have 233, src has [1,234]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135839744 unmapped: 34324480 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 234 ms_handle_reset con 0x5581d0037c00 session 0x5581cfaf70e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 234 ms_handle_reset con 0x5581cc836800 session 0x5581cd3770e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:50.012755+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 234 ms_handle_reset con 0x5581cea0ec00 session 0x5581d07f92c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135872512 unmapped: 34291712 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 235 ms_handle_reset con 0x5581ce4a1000 session 0x5581cfaf6d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1892168 data_alloc: 184549376 data_used: 835584
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 235 ms_handle_reset con 0x5581cea0f000 session 0x5581cce8f860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:51.013121+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135782400 unmapped: 34381824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:52.013491+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135806976 unmapped: 34357248 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 235 heartbeat osd_stat(store_statfs(0x1b611f000/0x0/0x1bfc00000, data 0x34acec2/0x360d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 235 ms_handle_reset con 0x5581cf329c00 session 0x5581ce6d9c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:53.013852+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135806976 unmapped: 34357248 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 235 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce489680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 236 ms_handle_reset con 0x5581cc836800 session 0x5581cfaf6f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:54.014031+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 236 heartbeat osd_stat(store_statfs(0x1b611e000/0x0/0x1bfc00000, data 0x34acf44/0x3610000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135815168 unmapped: 34349056 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 236 ms_handle_reset con 0x5581cea0f000 session 0x5581cc5b23c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 236 ms_handle_reset con 0x5581cea0ec00 session 0x5581ce023c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:55.014181+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135815168 unmapped: 34349056 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1907365 data_alloc: 184549376 data_used: 843776
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 236 heartbeat osd_stat(store_statfs(0x1b6119000/0x0/0x1bfc00000, data 0x34af1d2/0x3614000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 236 handle_osd_map epochs [236,237], i have 236, src has [1,237]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3103000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 237 ms_handle_reset con 0x5581d3103000 session 0x5581cc127c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:56.014314+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 237 ms_handle_reset con 0x5581cf329000 session 0x5581ce7af680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 237 ms_handle_reset con 0x5581cc836800 session 0x5581cc3b0b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135872512 unmapped: 34291712 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 237 ms_handle_reset con 0x5581ce4a1000 session 0x5581cf474960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 237 handle_osd_map epochs [237,238], i have 237, src has [1,238]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:57.014449+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 135913472 unmapped: 34250752 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.596669197s of 10.436882019s, submitted: 207
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:58.014592+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 239 heartbeat osd_stat(store_statfs(0x1b5d10000/0x0/0x1bfc00000, data 0x34b352d/0x361c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 239 ms_handle_reset con 0x5581cea0ec00 session 0x5581ce095a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 136986624 unmapped: 33177600 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:13:59.014757+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0f000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 136986624 unmapped: 33177600 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 239 heartbeat osd_stat(store_statfs(0x1b5d0c000/0x0/0x1bfc00000, data 0x34b573d/0x361f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 240 ms_handle_reset con 0x5581cea0f000 session 0x5581cf4cd860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:00.014911+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137043968 unmapped: 33120256 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1923690 data_alloc: 184549376 data_used: 868352
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:01.015076+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 240 ms_handle_reset con 0x5581cc836800 session 0x5581d0546000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137052160 unmapped: 33112064 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 240 ms_handle_reset con 0x5581cea0ec00 session 0x5581cc16de00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 ms_handle_reset con 0x5581cf329000 session 0x5581cc127680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:02.015212+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 ms_handle_reset con 0x5581ce4a1000 session 0x5581cf4741e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3103000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 ms_handle_reset con 0x5581d3103000 session 0x5581cfaf65a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137183232 unmapped: 32980992 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 ms_handle_reset con 0x5581cc836800 session 0x5581ce287680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:03.015383+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 heartbeat osd_stat(store_statfs(0x1b5cdf000/0x0/0x1bfc00000, data 0x34e2bc2/0x364e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce2a23c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cea0ec00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137224192 unmapped: 32940032 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 ms_handle_reset con 0x5581cf329000 session 0x5581d30f2b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3102c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 ms_handle_reset con 0x5581d3102c00 session 0x5581cd398d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:04.015506+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 242 ms_handle_reset con 0x5581cea0ec00 session 0x5581cc16c3c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137281536 unmapped: 32882688 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:05.015699+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137396224 unmapped: 32768000 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 242 ms_handle_reset con 0x5581cc836800 session 0x5581cc16d2c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1927114 data_alloc: 184549376 data_used: 872448
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 242 handle_osd_map epochs [242,243], i have 242, src has [1,243]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:06.015825+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 243 ms_handle_reset con 0x5581ce4a1000 session 0x5581cee7c000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137371648 unmapped: 32792576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 243 ms_handle_reset con 0x5581cf329000 session 0x5581ce6da000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3102c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 243 heartbeat osd_stat(store_statfs(0x1b5cc5000/0x0/0x1bfc00000, data 0x34fc008/0x3668000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:07.015957+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 244 ms_handle_reset con 0x5581d3102c00 session 0x5581ce6da5a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137494528 unmapped: 32669696 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:08.016126+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137601024 unmapped: 32563200 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:09.016363+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137609216 unmapped: 32555008 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.875143051s of 11.792132378s, submitted: 252
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:10.016524+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137764864 unmapped: 32399360 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1941672 data_alloc: 184549376 data_used: 880640
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:11.016706+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 137912320 unmapped: 32251904 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 244 heartbeat osd_stat(store_statfs(0x1b5c8b000/0x0/0x1bfc00000, data 0x3536450/0x36a3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:12.016927+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 138108928 unmapped: 32055296 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:13.017049+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 139378688 unmapped: 30785536 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:14.017157+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 heartbeat osd_stat(store_statfs(0x1b5c2c000/0x0/0x1bfc00000, data 0x35932b8/0x3701000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 139657216 unmapped: 30507008 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:15.017587+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 139657216 unmapped: 30507008 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1950794 data_alloc: 184549376 data_used: 892928
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:16.017874+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 heartbeat osd_stat(store_statfs(0x1b5c2c000/0x0/0x1bfc00000, data 0x35932b8/0x3701000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 139575296 unmapped: 30588928 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:17.018044+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d1ae2800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 ms_handle_reset con 0x5581d1ae2800 session 0x5581ce6d8960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 heartbeat osd_stat(store_statfs(0x1b5c10000/0x0/0x1bfc00000, data 0x35b0c7e/0x371e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 139624448 unmapped: 30539776 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 ms_handle_reset con 0x5581cc836800 session 0x5581ce461a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:18.018158+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 139673600 unmapped: 30490624 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce6d83c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:19.018339+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 139706368 unmapped: 30457856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:20.018492+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.012815475s of 10.295495033s, submitted: 71
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 140034048 unmapped: 30130176 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 1959042 data_alloc: 184549376 data_used: 892928
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 ms_handle_reset con 0x5581cf329000 session 0x5581d0313e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3102c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 ms_handle_reset con 0x5581d3102c00 session 0x5581cf84da40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:21.018641+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 150192128 unmapped: 19972096 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 ms_handle_reset con 0x5581cf326c00 session 0x5581ccdd83c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 heartbeat osd_stat(store_statfs(0x1b51f0000/0x0/0x1bfc00000, data 0x3fcf1c8/0x413e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 ms_handle_reset con 0x5581cc836800 session 0x5581cc3a8960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:22.018788+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 142278656 unmapped: 27885568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:23.018936+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 142327808 unmapped: 27836416 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce4a1000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 ms_handle_reset con 0x5581cf326c00 session 0x5581cee7c1e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:24.019081+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3102c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 144326656 unmapped: 25837568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 246 ms_handle_reset con 0x5581cf329000 session 0x5581ccdd81e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 246 handle_osd_map epochs [246,247], i have 246, src has [1,247]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:25.019218+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 ms_handle_reset con 0x5581d3102c00 session 0x5581cc4f25a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 ms_handle_reset con 0x5581ce4a1000 session 0x5581ce6da5a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 142352384 unmapped: 27811840 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2134814 data_alloc: 184549376 data_used: 905216
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:26.019373+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 141254656 unmapped: 28909568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 heartbeat osd_stat(store_statfs(0x1b474d000/0x0/0x1bfc00000, data 0x4a6b937/0x4be1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:27.019549+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 141377536 unmapped: 28786688 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:28.019688+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 ms_handle_reset con 0x5581cf326c00 session 0x5581cf474b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf329000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 ms_handle_reset con 0x5581cc836800 session 0x5581ccf65c20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 heartbeat osd_stat(store_statfs(0x1b46fa000/0x0/0x1bfc00000, data 0x4abde20/0x4c34000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3102c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 142639104 unmapped: 27525120 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc78a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 ms_handle_reset con 0x5581d3102c00 session 0x5581cee7c000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:29.019865+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 248 ms_handle_reset con 0x5581cc78a400 session 0x5581cf4cc780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc78b400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 248 ms_handle_reset con 0x5581cc78b400 session 0x5581cf4741e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 248 heartbeat osd_stat(store_statfs(0x1b3c7a000/0x0/0x1bfc00000, data 0x553da1c/0x56b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 248 ms_handle_reset con 0x5581cf329000 session 0x5581cc4f3a40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 143343616 unmapped: 26820608 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:30.019967+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.162355423s of 10.152432442s, submitted: 192
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 143474688 unmapped: 26689536 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2248922 data_alloc: 184549376 data_used: 913408
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:31.020121+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc78a400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 144547840 unmapped: 25616384 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 249 ms_handle_reset con 0x5581cc78a400 session 0x5581d0546000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:32.020251+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 144908288 unmapped: 25255936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc836800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:33.020403+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 144908288 unmapped: 25255936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 250 ms_handle_reset con 0x5581cc836800 session 0x5581ce7afa40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:34.020537+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 145170432 unmapped: 24993792 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 250 heartbeat osd_stat(store_statfs(0x1b4663000/0x0/0x1bfc00000, data 0x4b52661/0x4cca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf326c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:35.020680+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 251 ms_handle_reset con 0x5581cf326c00 session 0x5581cc16da40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3102c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 144588800 unmapped: 25575424 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2028425 data_alloc: 184549376 data_used: 937984
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 251 ms_handle_reset con 0x5581d3102c00 session 0x5581cc62d0e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:36.020818+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 251 heartbeat osd_stat(store_statfs(0x1b5989000/0x0/0x1bfc00000, data 0x382c16e/0x39a2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 145637376 unmapped: 24526848 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:37.020964+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 145768448 unmapped: 24395776 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:38.021080+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 145768448 unmapped: 24395776 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:39.021273+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 145768448 unmapped: 24395776 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:40.021443+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.383242607s of 10.152204514s, submitted: 221
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 145481728 unmapped: 24682496 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2038777 data_alloc: 184549376 data_used: 937984
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:41.021580+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 145481728 unmapped: 24682496 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 252 heartbeat osd_stat(store_statfs(0x1b5921000/0x0/0x1bfc00000, data 0x3894e8e/0x3a0d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 252 handle_osd_map epochs [252,253], i have 252, src has [1,253]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:42.021702+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 145874944 unmapped: 24289280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:43.021857+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 146063360 unmapped: 24100864 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:44.021987+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 146939904 unmapped: 23224320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:45.022160+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 147038208 unmapped: 23126016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2136547 data_alloc: 184549376 data_used: 950272
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:46.022385+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 147111936 unmapped: 23052288 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 heartbeat osd_stat(store_statfs(0x1b4da5000/0x0/0x1bfc00000, data 0x440c9b6/0x4589000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:47.022557+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 147365888 unmapped: 22798336 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:48.022731+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 147374080 unmapped: 22790144 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:49.022919+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 147374080 unmapped: 22790144 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:50.023052+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.330633163s of 10.003045082s, submitted: 137
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 heartbeat osd_stat(store_statfs(0x1b4d6b000/0x0/0x1bfc00000, data 0x44477ae/0x45c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 147767296 unmapped: 22396928 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2144043 data_alloc: 184549376 data_used: 954368
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:51.023170+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 148922368 unmapped: 21241856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 heartbeat osd_stat(store_statfs(0x1b3b9d000/0x0/0x1bfc00000, data 0x4474317/0x45f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:52.023347+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 heartbeat osd_stat(store_statfs(0x1b3b5a000/0x0/0x1bfc00000, data 0x44b7c38/0x4634000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 148922368 unmapped: 21241856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:53.023520+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 149184512 unmapped: 20979712 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:54.023692+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 149184512 unmapped: 20979712 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:55.023856+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 149200896 unmapped: 20963328 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2162823 data_alloc: 184549376 data_used: 954368
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:56.024004+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 149078016 unmapped: 21086208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:57.024169+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 heartbeat osd_stat(store_statfs(0x1b3a6d000/0x0/0x1bfc00000, data 0x45a06fd/0x4720000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 150159360 unmapped: 20004864 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:58.024328+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 heartbeat osd_stat(store_statfs(0x1b3a58000/0x0/0x1bfc00000, data 0x45b54f9/0x4735000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 150274048 unmapped: 19890176 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:14:59.024514+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5af400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 152920064 unmapped: 17244160 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:00.024702+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.255170822s of 10.056875229s, submitted: 143
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 152748032 unmapped: 17416192 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2259357 data_alloc: 184549376 data_used: 954368
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:01.024921+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 154542080 unmapped: 15622144 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:02.025064+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 153731072 unmapped: 16433152 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:03.025204+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 254 heartbeat osd_stat(store_statfs(0x1b2e12000/0x0/0x1bfc00000, data 0x51fce71/0x537c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 154812416 unmapped: 15351808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:04.025351+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 155000832 unmapped: 15163392 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:05.025499+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 154566656 unmapped: 15597568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2275877 data_alloc: 184549376 data_used: 966656
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:06.025670+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 154566656 unmapped: 15597568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:07.025824+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 154566656 unmapped: 15597568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 254 handle_osd_map epochs [254,255], i have 254, src has [1,255]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:08.025987+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 154746880 unmapped: 15417344 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:09.026146+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b2d28000/0x0/0x1bfc00000, data 0x52e28f8/0x5464000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b2d28000/0x0/0x1bfc00000, data 0x52e28f8/0x5464000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 154755072 unmapped: 15409152 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3100400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 255 ms_handle_reset con 0x5581d3100400 session 0x5581cd398f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:10.026262+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3101000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3100800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 256 ms_handle_reset con 0x5581d3100800 session 0x5581cf4cd860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156876800 unmapped: 13287424 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2289855 data_alloc: 184549376 data_used: 978944
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:11.026398+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.986414909s of 10.593873978s, submitted: 165
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 256 handle_osd_map epochs [256,257], i have 256, src has [1,257]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 257 ms_handle_reset con 0x5581d3101000 session 0x5581ce7ae000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156884992 unmapped: 13279232 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b1b81000/0x0/0x1bfc00000, data 0x52e6d87/0x546c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 257 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:12.026569+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156884992 unmapped: 13279232 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:13.026709+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee22c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 258 ms_handle_reset con 0x5581cee22c00 session 0x5581d06712c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156909568 unmapped: 13254656 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:14.026878+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf7cc800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cc78a000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 259 ms_handle_reset con 0x5581cc78a000 session 0x5581ce287860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156917760 unmapped: 13246464 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:15.027033+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 260 ms_handle_reset con 0x5581cf7cc800 session 0x5581cc67c960
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156917760 unmapped: 13246464 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2304848 data_alloc: 184549376 data_used: 991232
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:16.027153+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee22c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156917760 unmapped: 13246464 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:17.027326+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 260 ms_handle_reset con 0x5581cee22c00 session 0x5581cc3a9680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156925952 unmapped: 13238272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 261 heartbeat osd_stat(store_statfs(0x1b1b73000/0x0/0x1bfc00000, data 0x52ef570/0x547a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:18.027510+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 261 heartbeat osd_stat(store_statfs(0x1b1b73000/0x0/0x1bfc00000, data 0x52ef570/0x547a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3101400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 261 ms_handle_reset con 0x5581d3101400 session 0x5581ce085860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156925952 unmapped: 13238272 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:19.027712+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3101000
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3100800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156934144 unmapped: 13230080 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 262 ms_handle_reset con 0x5581d3100800 session 0x5581cc16cb40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 262 heartbeat osd_stat(store_statfs(0x1b1b6e000/0x0/0x1bfc00000, data 0x52f17fe/0x547e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:20.027915+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 263 ms_handle_reset con 0x5581d3101000 session 0x5581cf444b40
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156958720 unmapped: 13205504 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2319200 data_alloc: 184549376 data_used: 999424
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee22c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:21.028079+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.040943146s of 10.238163948s, submitted: 66
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 264 ms_handle_reset con 0x5581cee22c00 session 0x5581cea47680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 264 handle_osd_map epochs [263,264], i have 264, src has [1,264]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 156991488 unmapped: 13172736 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 264 handle_osd_map epochs [264,265], i have 264, src has [1,265]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf7cc800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:22.029691+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157007872 unmapped: 13156352 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 265 heartbeat osd_stat(store_statfs(0x1b175f000/0x0/0x1bfc00000, data 0x52f7e17/0x5489000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 265 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:23.030403+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 266 ms_handle_reset con 0x5581cf7cc800 session 0x5581d07f8780
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 266 heartbeat osd_stat(store_statfs(0x1b175f000/0x0/0x1bfc00000, data 0x52f7e17/0x5489000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157032448 unmapped: 13131776 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:24.030710+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157032448 unmapped: 13131776 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:25.030923+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157040640 unmapped: 13123584 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 266 ms_handle_reset con 0x5581ce5af400 session 0x5581d05445a0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2325686 data_alloc: 184549376 data_used: 995328
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:26.031078+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3100800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 266 ms_handle_reset con 0x5581d3100800 session 0x5581cea46d20
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157073408 unmapped: 13090816 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:27.031237+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157073408 unmapped: 13090816 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:28.031424+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b2d68000/0x0/0x1bfc00000, data 0x3cf1168/0x3e84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 267 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 267 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 267 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157081600 unmapped: 13082624 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:29.031604+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 268 heartbeat osd_stat(store_statfs(0x1b2d63000/0x0/0x1bfc00000, data 0x3cf33fd/0x3e88000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157081600 unmapped: 13082624 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:30.031769+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157081600 unmapped: 13082624 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2172633 data_alloc: 184549376 data_used: 1003520
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:31.031944+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.636757851s of 10.066541672s, submitted: 158
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 268 heartbeat osd_stat(store_statfs(0x1b2d67000/0x0/0x1bfc00000, data 0x3cf3336/0x3e87000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157081600 unmapped: 13082624 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:32.032153+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157081600 unmapped: 13082624 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:33.032358+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 270 handle_osd_map epochs [269,270], i have 270, src has [1,270]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157138944 unmapped: 13025280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3101400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:34.032504+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 271 ms_handle_reset con 0x5581d3101400 session 0x5581d07f9e00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b2d5b000/0x0/0x1bfc00000, data 0x3cf998f/0x3e90000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:35.032634+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2180623 data_alloc: 184549376 data_used: 1015808
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:36.032788+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 271 handle_osd_map epochs [271,272], i have 271, src has [1,272]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:37.032930+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 272 handle_osd_map epochs [272,272], i have 272, src has [1,272]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:38.033109+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:39.033317+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 272 heartbeat osd_stat(store_statfs(0x1b2d5c000/0x0/0x1bfc00000, data 0x3cfb973/0x3e91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:40.033461+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 872415232 meta_used: 2179619 data_alloc: 184549376 data_used: 1028096
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:41.033676+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.732331276s of 10.141363144s, submitted: 136
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5af400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:42.033912+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 273 ms_handle_reset con 0x5581ce5af400 session 0x5581d05450e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157179904 unmapped: 12984320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:43.034055+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cee22c00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 274 ms_handle_reset con 0x5581cee22c00 session 0x5581d05441e0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157204480 unmapped: 12959744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:44.034200+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581cf7cc800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 274 ms_handle_reset con 0x5581cf7cc800 session 0x5581cea46f00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3100800
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157220864 unmapped: 12943360 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 274 heartbeat osd_stat(store_statfs(0x1b2d55000/0x0/0x1bfc00000, data 0x3cffdf5/0x3e99000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:45.034316+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 51
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 275 ms_handle_reset con 0x5581d3100800 session 0x5581cfe6d680
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 275 handle_osd_map epochs [275,275], i have 275, src has [1,275]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157376512 unmapped: 12787712 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2196934 data_alloc: 184549376 data_used: 1052672
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:46.034505+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581d3100400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 275 ms_handle_reset con 0x5581d3100400 session 0x5581cf475860
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: handle_auth_request added challenge on 0x5581ce5af400
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157384704 unmapped: 12779520 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 275 ms_handle_reset con 0x5581ce5af400 session 0x5581cf4743c0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:47.034686+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157392896 unmapped: 12771328 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:48.034819+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b2d53000/0x0/0x1bfc00000, data 0x3d01f38/0x3e9b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157392896 unmapped: 12771328 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:49.034988+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157392896 unmapped: 12771328 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:50.035147+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157392896 unmapped: 12771328 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2191948 data_alloc: 184549376 data_used: 1048576
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:51.035304+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b2d54000/0x0/0x1bfc00000, data 0x3d01e42/0x3e9a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.720602989s of 10.008458138s, submitted: 86
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157392896 unmapped: 12771328 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:52.035475+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157401088 unmapped: 12763136 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:53.035640+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 276 heartbeat osd_stat(store_statfs(0x1b2d4f000/0x0/0x1bfc00000, data 0x3d03fd3/0x3e9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:54.035809+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:55.035984+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2198602 data_alloc: 184549376 data_used: 1060864
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:56.036146+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:57.036331+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157409280 unmapped: 12754944 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:58.036468+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157409280 unmapped: 12754944 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:15:59.036683+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 276 heartbeat osd_stat(store_statfs(0x1b2d4d000/0x0/0x1bfc00000, data 0x3d04109/0x3ea0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:00.036817+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2199520 data_alloc: 184549376 data_used: 1060864
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:01.036969+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 276 heartbeat osd_stat(store_statfs(0x1b2d4e000/0x0/0x1bfc00000, data 0x3d04109/0x3ea0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.934391975s of 10.023045540s, submitted: 30
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:02.037148+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 276 handle_osd_map epochs [276,277], i have 276, src has [1,277]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:03.037309+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:04.037489+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:05.037745+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2205122 data_alloc: 184549376 data_used: 1069056
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:06.037935+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:07.038126+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 277 heartbeat osd_stat(store_statfs(0x1b2d49000/0x0/0x1bfc00000, data 0x3d06382/0x3ea4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:08.038316+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 277 heartbeat osd_stat(store_statfs(0x1b2d49000/0x0/0x1bfc00000, data 0x3d06382/0x3ea4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 12738560 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:09.038560+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158490624 unmapped: 11673600 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:10.038710+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158490624 unmapped: 11673600 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2203568 data_alloc: 184549376 data_used: 1069056
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:11.038847+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158490624 unmapped: 11673600 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:12.039040+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.381987572s of 10.549871445s, submitted: 36
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b2d47000/0x0/0x1bfc00000, data 0x3d083dd/0x3ea6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158498816 unmapped: 11665408 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:13.039247+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b2d48000/0x0/0x1bfc00000, data 0x3d08342/0x3ea5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158498816 unmapped: 11665408 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:14.039375+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b2d48000/0x0/0x1bfc00000, data 0x3d08342/0x3ea5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158498816 unmapped: 11665408 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:15.039533+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2205482 data_alloc: 184549376 data_used: 1081344
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:16.039681+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:17.039845+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:18.040017+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b2d4a000/0x0/0x1bfc00000, data 0x3d082a7/0x3ea4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:19.040200+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:20.040399+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2205482 data_alloc: 184549376 data_used: 1081344
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:21.040558+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:22.040695+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.938141823s of 10.003154755s, submitted: 25
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b2d4a000/0x0/0x1bfc00000, data 0x3d082a7/0x3ea4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:23.040847+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158507008 unmapped: 11657216 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:24.040989+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158515200 unmapped: 11649024 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:25.041164+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158515200 unmapped: 11649024 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2207058 data_alloc: 184549376 data_used: 1081344
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:26.041349+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158515200 unmapped: 11649024 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:27.041534+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158515200 unmapped: 11649024 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:28.041675+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b2d49000/0x0/0x1bfc00000, data 0x3d08342/0x3ea5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158515200 unmapped: 11649024 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:29.041891+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158523392 unmapped: 11640832 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:30.042066+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158523392 unmapped: 11640832 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b2d48000/0x0/0x1bfc00000, data 0x3d083dd/0x3ea6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2209552 data_alloc: 184549376 data_used: 1081344
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:31.042215+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158523392 unmapped: 11640832 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:32.042361+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.951921463s of 10.000122070s, submitted: 9
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158523392 unmapped: 11640832 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:33.042547+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 278 handle_osd_map epochs [279,279], i have 278, src has [1,279]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 279 heartbeat osd_stat(store_statfs(0x1b2d48000/0x0/0x1bfc00000, data 0x3d083dd/0x3ea6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158539776 unmapped: 11624448 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:34.042738+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158539776 unmapped: 11624448 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:35.042899+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158539776 unmapped: 11624448 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:36.043084+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2215926 data_alloc: 184549376 data_used: 1089536
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 279 heartbeat osd_stat(store_statfs(0x1b2d43000/0x0/0x1bfc00000, data 0x3d0a6fc/0x3eaa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158539776 unmapped: 11624448 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:37.043268+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158539776 unmapped: 11624448 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:38.043430+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158539776 unmapped: 11624448 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:39.043605+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158539776 unmapped: 11624448 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:40.043780+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 20K writes, 78K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
                                                          Cumulative WAL: 20K writes, 7167 syncs, 2.83 writes per sync, written: 0.06 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 12K writes, 44K keys, 12K commit groups, 1.0 writes per commit group, ingest: 33.24 MB, 0.06 MB/s
                                                          Interval WAL: 12K writes, 5261 syncs, 2.33 writes per sync, written: 0.03 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158564352 unmapped: 11599872 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:41.043958+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2214004 data_alloc: 184549376 data_used: 1089536
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 279 heartbeat osd_stat(store_statfs(0x1b2d45000/0x0/0x1bfc00000, data 0x3d0a5eb/0x3ea9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 279 handle_osd_map epochs [279,280], i have 279, src has [1,280]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158564352 unmapped: 11599872 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:42.044137+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.848137856s of 10.002282143s, submitted: 71
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158564352 unmapped: 11599872 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:43.044404+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 280 heartbeat osd_stat(store_statfs(0x1b2d40000/0x0/0x1bfc00000, data 0x3d0c79c/0x3ead000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158564352 unmapped: 11599872 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:44.044603+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158588928 unmapped: 11575296 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:45.044833+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 280 heartbeat osd_stat(store_statfs(0x1b2d41000/0x0/0x1bfc00000, data 0x3d0c79c/0x3ead000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158588928 unmapped: 11575296 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:46.044991+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2219952 data_alloc: 184549376 data_used: 1101824
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 280 heartbeat osd_stat(store_statfs(0x1b2d41000/0x0/0x1bfc00000, data 0x3d0c79c/0x3ead000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158588928 unmapped: 11575296 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:47.045159+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:48.045340+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:49.045588+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:50.045775+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:51.045952+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2218396 data_alloc: 184549376 data_used: 1101824
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 280 heartbeat osd_stat(store_statfs(0x1b2d42000/0x0/0x1bfc00000, data 0x3d0c701/0x3eac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 280 heartbeat osd_stat(store_statfs(0x1b2d43000/0x0/0x1bfc00000, data 0x3d0c666/0x3eab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:52.046128+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.784289360s of 10.863885880s, submitted: 21
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:53.046320+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 280 handle_osd_map epochs [281,281], i have 280, src has [1,281]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:54.046450+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:55.046598+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 11567104 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:56.046784+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2224206 data_alloc: 184549376 data_used: 1110016
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158605312 unmapped: 11558912 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:57.046966+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 281 heartbeat osd_stat(store_statfs(0x1b2d3d000/0x0/0x1bfc00000, data 0x3d0e97a/0x3eb0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158605312 unmapped: 11558912 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 281 handle_osd_map epochs [282,282], i have 281, src has [1,282]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:58.047093+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 282 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158613504 unmapped: 11550720 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:16:59.047319+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158613504 unmapped: 11550720 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:00.047502+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158613504 unmapped: 11550720 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:01.047713+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2227014 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158613504 unmapped: 11550720 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 282 handle_osd_map epochs [283,283], i have 282, src has [1,283]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:02.047876+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 283 heartbeat osd_stat(store_statfs(0x1b2d36000/0x0/0x1bfc00000, data 0x3d12c6a/0x3eb7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158629888 unmapped: 11534336 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:03.048035+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158629888 unmapped: 11534336 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:04.048242+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 283 heartbeat osd_stat(store_statfs(0x1b2d36000/0x0/0x1bfc00000, data 0x3d12c6a/0x3eb7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.424899101s of 11.610745430s, submitted: 77
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158629888 unmapped: 11534336 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:05.048493+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158629888 unmapped: 11534336 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:06.048705+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2229840 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158629888 unmapped: 11534336 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:07.048890+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158646272 unmapped: 11517952 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:08.049590+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158646272 unmapped: 11517952 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:09.049853+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d32000/0x0/0x1bfc00000, data 0x3d14dfb/0x3ebb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:10.050029+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:11.050180+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2234566 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:12.050388+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d32000/0x0/0x1bfc00000, data 0x3d14dfb/0x3ebb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:13.050581+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:14.050829+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.966484070s of 10.016953468s, submitted: 21
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:15.051045+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:16.051212+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2232820 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:17.051373+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:18.051533+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:19.051747+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158654464 unmapped: 11509760 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:20.051894+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158662656 unmapped: 11501568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:21.052003+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2232836 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:22.052143+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158662656 unmapped: 11501568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:23.052307+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158662656 unmapped: 11501568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:24.052493+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158662656 unmapped: 11501568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:25.052709+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158662656 unmapped: 11501568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.847666740s of 10.878998756s, submitted: 7
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:26.052865+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158662656 unmapped: 11501568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2234412 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d33000/0x0/0x1bfc00000, data 0x3d14dfb/0x3ebb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:27.053051+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158662656 unmapped: 11501568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:28.053212+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158662656 unmapped: 11501568 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:29.053412+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158670848 unmapped: 11493376 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:30.053647+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158670848 unmapped: 11493376 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:31.053827+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158670848 unmapped: 11493376 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2234412 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d33000/0x0/0x1bfc00000, data 0x3d14dfb/0x3ebb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:32.054804+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d33000/0x0/0x1bfc00000, data 0x3d14dfb/0x3ebb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,1])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158670848 unmapped: 11493376 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:33.054936+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158670848 unmapped: 11493376 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d33000/0x0/0x1bfc00000, data 0x3d14dfb/0x3ebb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:34.055154+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 158670848 unmapped: 11493376 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:35.055339+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159719424 unmapped: 10444800 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:36.055509+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159719424 unmapped: 10444800 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2233738 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.590624809s of 11.623681068s, submitted: 7
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:37.055640+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159727616 unmapped: 10436608 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 52
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:38.055784+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:39.055966+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:40.056167+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:41.056366+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2233738 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:42.056531+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:43.056708+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:44.056864+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:45.057077+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:46.057252+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2233898 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.977633476s of 10.000997543s, submitted: 5
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:47.057408+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:48.057789+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:49.058010+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d33000/0x0/0x1bfc00000, data 0x3d14dfb/0x3ebb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159891456 unmapped: 10272768 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:50.058147+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159899648 unmapped: 10264576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:51.058338+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159899648 unmapped: 10264576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2235506 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:52.058470+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159899648 unmapped: 10264576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:53.058706+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159907840 unmapped: 10256384 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:54.058894+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159916032 unmapped: 10248192 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:55.059044+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159916032 unmapped: 10248192 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:56.059169+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159916032 unmapped: 10248192 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2234624 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:57.059343+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.964854240s of 10.005094528s, submitted: 8
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159916032 unmapped: 10248192 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:58.059537+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159916032 unmapped: 10248192 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:17:59.059827+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159916032 unmapped: 10248192 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:00.060014+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159916032 unmapped: 10248192 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:01.060164+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159932416 unmapped: 10231808 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2238336 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d32000/0x0/0x1bfc00000, data 0x3d14e96/0x3ebc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d32000/0x0/0x1bfc00000, data 0x3d14e96/0x3ebc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:02.060302+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159940608 unmapped: 10223616 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:03.060482+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159940608 unmapped: 10223616 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:04.060710+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159940608 unmapped: 10223616 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d33000/0x0/0x1bfc00000, data 0x3d14dfb/0x3ebb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:05.060904+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159940608 unmapped: 10223616 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:06.061067+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159940608 unmapped: 10223616 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2235526 data_alloc: 184549376 data_used: 1122304
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:07.061216+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.929611206s of 10.007748604s, submitted: 202
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 heartbeat osd_stat(store_statfs(0x1b2d34000/0x0/0x1bfc00000, data 0x3d14d60/0x3eba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 53
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:08.061381+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _renew_subs
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _send_mon_message to mon.np0005541913 at v2:172.18.0.104:3300/0
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 284 handle_osd_map epochs [285,285], i have 284, src has [1,285]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160292864 unmapped: 9871360 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:09.061637+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160301056 unmapped: 9863168 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:10.061784+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160301056 unmapped: 9863168 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:11.061931+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160301056 unmapped: 9863168 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2239728 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:12.062080+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160301056 unmapped: 9863168 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 285 heartbeat osd_stat(store_statfs(0x1b2d2f000/0x0/0x1bfc00000, data 0x3d17009/0x3ebe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:13.062284+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160301056 unmapped: 9863168 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:14.062440+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 285 heartbeat osd_stat(store_statfs(0x1b2d2f000/0x0/0x1bfc00000, data 0x3d17009/0x3ebe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160301056 unmapped: 9863168 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:15.062644+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160301056 unmapped: 9863168 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:16.062777+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160301056 unmapped: 9863168 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2239728 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:17.062906+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160309248 unmapped: 9854976 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 285 handle_osd_map epochs [286,286], i have 285, src has [1,286]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.533912659s of 10.584923744s, submitted: 28
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:18.063042+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160317440 unmapped: 9846784 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:19.063238+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160317440 unmapped: 9846784 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:20.063380+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160317440 unmapped: 9846784 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:21.063573+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160317440 unmapped: 9846784 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:22.063729+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160317440 unmapped: 9846784 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:23.063909+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160317440 unmapped: 9846784 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:24.064067+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160317440 unmapped: 9846784 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:25.064238+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160325632 unmapped: 9838592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:26.064419+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160325632 unmapped: 9838592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:27.064552+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160325632 unmapped: 9838592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:28.064723+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160325632 unmapped: 9838592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:29.065181+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160325632 unmapped: 9838592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:30.065337+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160325632 unmapped: 9838592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:31.065519+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160325632 unmapped: 9838592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:32.065716+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160325632 unmapped: 9838592 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:33.065874+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160333824 unmapped: 9830400 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:34.066226+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160333824 unmapped: 9830400 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:35.066367+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160333824 unmapped: 9830400 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:36.066526+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160333824 unmapped: 9830400 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:37.066719+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160333824 unmapped: 9830400 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:38.066913+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160333824 unmapped: 9830400 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:39.067058+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160333824 unmapped: 9830400 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:40.067214+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160333824 unmapped: 9830400 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:41.067395+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160342016 unmapped: 9822208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:42.067563+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160342016 unmapped: 9822208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:43.067736+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160342016 unmapped: 9822208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:44.067902+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160342016 unmapped: 9822208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:45.068073+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160342016 unmapped: 9822208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:46.068264+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160342016 unmapped: 9822208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:47.068427+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160342016 unmapped: 9822208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:48.068575+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160342016 unmapped: 9822208 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:49.068815+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160350208 unmapped: 9814016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:50.068996+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160350208 unmapped: 9814016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:51.069183+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160350208 unmapped: 9814016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:52.069353+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160350208 unmapped: 9814016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:53.069556+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160350208 unmapped: 9814016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:54.069749+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160350208 unmapped: 9814016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:55.069933+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160350208 unmapped: 9814016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:56.070124+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160350208 unmapped: 9814016 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:57.070351+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:58.070539+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:18:59.070738+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:00.070921+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:01.071084+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:02.071267+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:03.071432+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:04.071638+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:05.071779+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:06.072016+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160358400 unmapped: 9805824 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2242526 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.298309326s of 49.326419830s, submitted: 15
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 ms_handle_reset con 0x5581cefff800 session 0x5581ce7afe00
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2c000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:07.073450+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160595968 unmapped: 9568256 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:08.073648+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Got map version 54
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/4212177170,v1:172.18.0.108:6811/4212177170]
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160161792 unmapped: 10002432 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:09.073855+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160161792 unmapped: 10002432 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:10.074021+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160161792 unmapped: 10002432 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:11.074188+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160161792 unmapped: 10002432 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:12.074328+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160161792 unmapped: 10002432 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:13.074497+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160186368 unmapped: 9977856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:14.074725+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160186368 unmapped: 9977856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:15.074924+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160186368 unmapped: 9977856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:16.075113+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160186368 unmapped: 9977856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:17.075262+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160186368 unmapped: 9977856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:18.075434+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160186368 unmapped: 9977856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:19.075695+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160186368 unmapped: 9977856 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:20.075874+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160194560 unmapped: 9969664 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:21.076357+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160202752 unmapped: 9961472 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:22.076557+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160202752 unmapped: 9961472 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:23.076747+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160202752 unmapped: 9961472 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:24.076903+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160202752 unmapped: 9961472 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:25.077053+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160202752 unmapped: 9961472 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:26.077216+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160202752 unmapped: 9961472 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:27.077415+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160202752 unmapped: 9961472 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:28.077605+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160202752 unmapped: 9961472 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:29.077816+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160210944 unmapped: 9953280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:30.077993+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160210944 unmapped: 9953280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:31.078164+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160210944 unmapped: 9953280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:32.078350+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160210944 unmapped: 9953280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:33.078507+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160210944 unmapped: 9953280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:34.078674+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160210944 unmapped: 9953280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:35.078836+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160210944 unmapped: 9953280 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:36.079000+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160219136 unmapped: 9945088 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:37.079170+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160227328 unmapped: 9936896 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:38.079855+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160227328 unmapped: 9936896 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:39.080037+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160227328 unmapped: 9936896 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:40.080212+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160227328 unmapped: 9936896 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:41.080360+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160227328 unmapped: 9936896 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:42.080484+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160227328 unmapped: 9936896 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:43.080686+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160227328 unmapped: 9936896 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:44.080857+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160235520 unmapped: 9928704 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:45.081037+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160235520 unmapped: 9928704 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:46.081210+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160235520 unmapped: 9928704 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:47.081410+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160235520 unmapped: 9928704 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:48.081588+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160235520 unmapped: 9928704 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:49.081840+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160235520 unmapped: 9928704 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:50.081994+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160235520 unmapped: 9928704 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:51.082149+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160235520 unmapped: 9928704 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:52.082292+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160243712 unmapped: 9920512 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:53.082457+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160243712 unmapped: 9920512 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:54.082709+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160243712 unmapped: 9920512 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:55.082885+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160251904 unmapped: 9912320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:56.083067+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160251904 unmapped: 9912320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:57.083237+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160251904 unmapped: 9912320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:58.083409+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160251904 unmapped: 9912320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:19:59.083602+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160251904 unmapped: 9912320 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:00.083810+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:01.083987+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:02.084190+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:03.084399+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:04.084652+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:05.084838+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:06.084978+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:07.085165+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:08.085372+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 9895936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:09.085581+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 9895936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:10.085738+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 9895936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:11.085881+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 9895936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:12.086042+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 9895936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:13.086206+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 9895936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:14.086417+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 9895936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:15.086583+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 9895936 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:16.086738+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160276480 unmapped: 9887744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:17.086914+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160276480 unmapped: 9887744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:18.087069+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160276480 unmapped: 9887744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:19.087253+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160276480 unmapped: 9887744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:20.087410+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160276480 unmapped: 9887744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:21.087506+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160276480 unmapped: 9887744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:22.087688+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160276480 unmapped: 9887744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:23.087854+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160276480 unmapped: 9887744 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:24.087991+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:25.088148+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:26.088285+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:27.088464+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:28.088670+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:29.088820+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:30.089134+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:31.089253+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160284672 unmapped: 9879552 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:32.089360+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160292864 unmapped: 9871360 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:33.089484+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160292864 unmapped: 9871360 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:34.089634+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160292864 unmapped: 9871360 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:35.089811+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160292864 unmapped: 9871360 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:36.089968+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160292864 unmapped: 9871360 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: bluestore.MempoolThread(0x5581cabe3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2241822 data_alloc: 184549376 data_used: 1130496
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: osd.0 286 heartbeat osd_stat(store_statfs(0x1b2d2d000/0x0/0x1bfc00000, data 0x3d1911f/0x3ec1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [1,2,3,4,5] op hist [])
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:37.090119+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'config diff' '{prefix=config diff}'
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'config show' '{prefix=config show}'
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 160260096 unmapped: 9904128 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'counter dump' '{prefix=counter dump}'
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'counter schema' '{prefix=counter schema}'
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:38.090248+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159907840 unmapped: 10256384 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:39.090415+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: prioritycache tune_memory target: 3561601228 mapped: 159899648 unmapped: 10264576 heap: 170164224 old mem: 2222054675 new mem: 2222054675
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: tick
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_tickets
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-02T10:20:40.090541+0000)
Dec 02 10:21:10 np0005541913.localdomain ceph-osd[31622]: do_command 'log dump' '{prefix=log dump}'
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1895407980' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/66508782' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3393577515' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.571 281858 DEBUG oslo_concurrency.processutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.576 281858 DEBUG nova.compute.provider_tree [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed in ProviderTree for provider: c79215b2-6762-4f7f-a322-f44db2b0b9bd update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.609 281858 DEBUG nova.scheduler.client.report [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Inventory has not changed for provider c79215b2-6762-4f7f-a322-f44db2b0b9bd based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.610 281858 DEBUG nova.compute.resource_tracker [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Compute_service record updated for np0005541913.localdomain:np0005541913.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 02 10:21:10 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:10.611 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1999813796' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 02 10:21:10 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4143362805' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: pgmap v829: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/434608318' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/4228538185' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2146630095' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1895407980' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/66508782' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3393577515' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1366845384' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/607218307' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/4154530576' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2620739229' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1999813796' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/4143362805' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2548576217' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1464133244' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2247838553' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2173933401' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1015905978' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1518916062' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/66728782' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 02 10:21:11 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:21:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.
Dec 02 10:21:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.
Dec 02 10:21:11 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.
Dec 02 10:21:12 np0005541913.localdomain podman[342071]: 2025-12-02 10:21:12.018032255 +0000 UTC m=+0.150302699 container health_status 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 02 10:21:12 np0005541913.localdomain podman[342071]: 2025-12-02 10:21:12.048311037 +0000 UTC m=+0.180581481 container exec_died 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 02 10:21:12 np0005541913.localdomain systemd[1]: 34d8d72f1760791efd6fce92829e22cf2c62a563d913233457df3757a8f0a0cb.service: Deactivated successfully.
Dec 02 10:21:12 np0005541913.localdomain podman[342072]: 2025-12-02 10:21:12.060272007 +0000 UTC m=+0.189418887 container health_status 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64)
Dec 02 10:21:12 np0005541913.localdomain podman[342073]: 2025-12-02 10:21:11.986950733 +0000 UTC m=+0.117491900 container health_status 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 02 10:21:12 np0005541913.localdomain podman[342072]: 2025-12-02 10:21:12.092880321 +0000 UTC m=+0.222027211 container exec_died 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.130 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.132 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.132 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.132 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:21:12 np0005541913.localdomain systemd[1]: 6736ec798dce98e500dcdb77911f371920e6e3051f1b61d9ef64ab452cccc2f2.service: Deactivated successfully.
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.141 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.142 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:21:12 np0005541913.localdomain podman[342073]: 2025-12-02 10:21:12.182259586 +0000 UTC m=+0.312800693 container exec_died 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 02 10:21:12 np0005541913.localdomain systemd[1]: 89e2cca4d9a47129c03eea427c7a5efb1eb8bdc504c72980bc05f27cf94e4c3e.service: Deactivated successfully.
Dec 02 10:21:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1518916062' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 02 10:21:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/358616255' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 02 10:21:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/66728782' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 02 10:21:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1047167114' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 02 10:21:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1091717628' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 02 10:21:12 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/896269902' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.611 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.611 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 02 10:21:12 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:12.611 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 02 10:21:12 np0005541913.localdomain systemd[1]: Starting Hostname Service...
Dec 02 10:21:13 np0005541913.localdomain systemd[1]: Started Hostname Service.
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.69893 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: pgmap v830: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.69899 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.59494 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.69911 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.59500 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.69917 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.59509 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.69926 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3920558208' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2706382169' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1162678121' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 02 10:21:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:13.650 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquiring lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 02 10:21:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:13.651 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Acquired lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 02 10:21:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:13.652 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 02 10:21:13 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:13.652 281858 DEBUG nova.objects.instance [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b254bb7f-2891-4b37-9c44-9700e301ce16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "versions"} v 0)
Dec 02 10:21:13 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/535179414' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3736550702' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.59515 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.59521 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.59527 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.69941 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.49746 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.59536 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.49752 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.69953 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.59551 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.49758 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.49767 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1162678121' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2332191097' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/535179414' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/975355669' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3040349159' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3736550702' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2055653369' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:14 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.69968 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: pgmap v831: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.59566 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.49782 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.69983 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.59581 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.49794 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2937001150' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2055653369' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/687034646' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2998395856' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2891361070' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2457193087' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 02 10:21:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:15.788 281858 DEBUG nova.network.neutron [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updating instance_info_cache with network_info: [{"id": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "address": "fa:16:3e:26:b2:03", "network": {"id": "595e1c9b-709c-41d2-9212-0b18b13291a8", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "e2d97696ab6749899bb8ba5ce29a3de2", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a318f6a-b3", "ovs_interfaceid": "4a318f6a-b3c1-4690-8246-f7d046ccd64a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 02 10:21:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:15.814 281858 DEBUG oslo_concurrency.lockutils [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Releasing lock "refresh_cache-b254bb7f-2891-4b37-9c44-9700e301ce16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 02 10:21:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:15.815 281858 DEBUG nova.compute.manager [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] [instance: b254bb7f-2891-4b37-9c44-9700e301ce16] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 02 10:21:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:15.816 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:15 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:15.816 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:15 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2048632244' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='client.49806 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='client.49824 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2457193087' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: pgmap v832: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3194501843' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='client.70037 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='client.59641 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/280261908' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2048632244' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "df"} v 0)
Dec 02 10:21:16 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3159311886' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3815186711' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:17.142 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:21:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:17.144 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 02 10:21:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:17.145 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 02 10:21:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:17.145 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:21:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:17.180 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 02 10:21:17 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:17.181 281858 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2635357428' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/661400808' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3159311886' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: from='client.49884 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3855858690' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3815186711' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/4018752893' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3647462322' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 02 10:21:17 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2890963783' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 02 10:21:17 np0005541913.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 02 10:21:17 np0005541913.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 02 10:21:17 np0005541913.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 02 10:21:17 np0005541913.localdomain kernel: cfg80211: failed to load regulatory.db
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: pgmap v833: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2890963783' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/975817563' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: from='client.70079 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: from='client.59686 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2394582050' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/2444120639' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1802639970' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3603341291' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec 02 10:21:18 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/910381125' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 02 10:21:19 np0005541913.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/3603341291' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/3415027665' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/3195466827' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2335335146' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/910381125' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: from='client.70118 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: from='client.49923 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:19 np0005541913.localdomain podman[343252]: 2025-12-02 10:21:19.443478241 +0000 UTC m=+0.080578199 container health_status f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 10:21:19 np0005541913.localdomain podman[343252]: 2025-12-02 10:21:19.483073313 +0000 UTC m=+0.120173281 container exec_died f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 02 10:21:19 np0005541913.localdomain systemd[1]: f79637bbe02d299ca23081de92d7b3b5c284700b2ff5ad8463e8d3dfa9fcba8a.service: Deactivated successfully.
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec 02 10:21:19 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1530990401' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 02 10:21:19 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:19.830 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: from='client.70124 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/1536255720' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: pgmap v834: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/4248790002' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1530990401' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: from='client.70136 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: from='client.59716 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/440212892' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: from='client.70142 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:20.825 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:20 np0005541913.localdomain nova_compute[281854]: 2025-12-02 10:21:20.827 281858 DEBUG oslo_service.periodic_task [None req-023d8683-8286-4dd2-80e5-5e768347515b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd dump"} v 0)
Dec 02 10:21:20 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1984495066' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon) e17 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2751643394' entity='client.admin' cmd={"prefix": "osd numa-status"} : dispatch
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: mon.np0005541913@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: from='client.59722 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: from='client.49947 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/943004398' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/1984495066' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.108:0/2323713709' entity='client.admin' cmd={"prefix": "osd numa-status"} : dispatch
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.106:0/1645213913' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 02 10:21:21 np0005541913.localdomain ceph-mon[298296]: from='client.? 172.18.0.107:0/2751643394' entity='client.admin' cmd={"prefix": "osd numa-status"} : dispatch
